Nov 28 01:42:47 localhost kernel: Linux version 5.14.0-284.11.1.el9_2.x86_64 (mockbuild@x86-vm-09.build.eng.bos.redhat.com) (gcc (GCC) 11.3.1 20221121 (Red Hat 11.3.1-4), GNU ld version 2.35.2-37.el9) #1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023
Nov 28 01:42:47 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Nov 28 01:42:47 localhost kernel: Command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Nov 28 01:42:47 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Nov 28 01:42:47 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Nov 28 01:42:47 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Nov 28 01:42:47 localhost kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Nov 28 01:42:47 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Nov 28 01:42:47 localhost kernel: signal: max sigframe size: 1776
Nov 28 01:42:47 localhost kernel: BIOS-provided physical RAM map:
Nov 28 01:42:47 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Nov 28 01:42:47 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Nov 28 01:42:47 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Nov 28 01:42:47 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Nov 28 01:42:47 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Nov 28 01:42:47 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Nov 28 01:42:47 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Nov 28 01:42:47 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000043fffffff] usable
Nov 28 01:42:47 localhost kernel: NX (Execute Disable) protection: active
Nov 28 01:42:47 localhost kernel: SMBIOS 2.8 present.
Nov 28 01:42:47 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Nov 28 01:42:47 localhost kernel: Hypervisor detected: KVM
Nov 28 01:42:47 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Nov 28 01:42:47 localhost kernel: kvm-clock: using sched offset of 1782068177 cycles
Nov 28 01:42:47 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Nov 28 01:42:47 localhost kernel: tsc: Detected 2799.998 MHz processor
Nov 28 01:42:47 localhost kernel: last_pfn = 0x440000 max_arch_pfn = 0x400000000
Nov 28 01:42:47 localhost kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Nov 28 01:42:47 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Nov 28 01:42:47 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Nov 28 01:42:47 localhost kernel: Using GB pages for direct mapping
Nov 28 01:42:47 localhost kernel: RAMDISK: [mem 0x2eef4000-0x33771fff]
Nov 28 01:42:47 localhost kernel: ACPI: Early table checksum verification disabled
Nov 28 01:42:47 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Nov 28 01:42:47 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Nov 28 01:42:47 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Nov 28 01:42:47 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Nov 28 01:42:47 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Nov 28 01:42:47 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Nov 28 01:42:47 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Nov 28 01:42:47 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Nov 28 01:42:47 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Nov 28 01:42:47 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Nov 28 01:42:47 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Nov 28 01:42:47 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Nov 28 01:42:47 localhost kernel: No NUMA configuration found
Nov 28 01:42:47 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000043fffffff]
Nov 28 01:42:47 localhost kernel: NODE_DATA(0) allocated [mem 0x43ffd5000-0x43fffffff]
Nov 28 01:42:47 localhost kernel: Reserving 256MB of memory at 2800MB for crashkernel (System RAM: 16383MB)
Nov 28 01:42:47 localhost kernel: Zone ranges:
Nov 28 01:42:47 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Nov 28 01:42:47 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Nov 28 01:42:47 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000043fffffff]
Nov 28 01:42:47 localhost kernel:   Device   empty
Nov 28 01:42:47 localhost kernel: Movable zone start for each node
Nov 28 01:42:47 localhost kernel: Early memory node ranges
Nov 28 01:42:47 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Nov 28 01:42:47 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Nov 28 01:42:47 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000043fffffff]
Nov 28 01:42:47 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000043fffffff]
Nov 28 01:42:47 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Nov 28 01:42:47 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Nov 28 01:42:47 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Nov 28 01:42:47 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Nov 28 01:42:47 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Nov 28 01:42:47 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Nov 28 01:42:47 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Nov 28 01:42:47 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Nov 28 01:42:47 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Nov 28 01:42:47 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Nov 28 01:42:47 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Nov 28 01:42:47 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Nov 28 01:42:47 localhost kernel: TSC deadline timer available
Nov 28 01:42:47 localhost kernel: smpboot: Allowing 8 CPUs, 0 hotplug CPUs
Nov 28 01:42:47 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Nov 28 01:42:47 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Nov 28 01:42:47 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Nov 28 01:42:47 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Nov 28 01:42:47 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Nov 28 01:42:47 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Nov 28 01:42:47 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Nov 28 01:42:47 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Nov 28 01:42:47 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Nov 28 01:42:47 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Nov 28 01:42:47 localhost kernel: Booting paravirtualized kernel on KVM
Nov 28 01:42:47 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Nov 28 01:42:47 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Nov 28 01:42:47 localhost kernel: percpu: Embedded 55 pages/cpu s188416 r8192 d28672 u262144
Nov 28 01:42:47 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Nov 28 01:42:47 localhost kernel: Fallback order for Node 0: 0
Nov 28 01:42:47 localhost kernel: Built 1 zonelists, mobility grouping on. Total pages: 4128475
Nov 28 01:42:47 localhost kernel: Policy zone: Normal
Nov 28 01:42:47 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Nov 28 01:42:47 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64", will be passed to user space.
Nov 28 01:42:47 localhost kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Nov 28 01:42:47 localhost kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Nov 28 01:42:47 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Nov 28 01:42:47 localhost kernel: software IO TLB: area num 8.
Nov 28 01:42:47 localhost kernel: Memory: 2873456K/16776676K available (14342K kernel code, 5536K rwdata, 10180K rodata, 2792K init, 7524K bss, 741260K reserved, 0K cma-reserved)
Nov 28 01:42:47 localhost kernel: random: get_random_u64 called from kmem_cache_open+0x1e/0x210 with crng_init=0
Nov 28 01:42:47 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Nov 28 01:42:47 localhost kernel: ftrace: allocating 44803 entries in 176 pages
Nov 28 01:42:47 localhost kernel: ftrace: allocated 176 pages with 3 groups
Nov 28 01:42:47 localhost kernel: Dynamic Preempt: voluntary
Nov 28 01:42:47 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Nov 28 01:42:47 localhost kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Nov 28 01:42:47 localhost kernel: 	Trampoline variant of Tasks RCU enabled.
Nov 28 01:42:47 localhost kernel: 	Rude variant of Tasks RCU enabled.
Nov 28 01:42:47 localhost kernel: 	Tracing variant of Tasks RCU enabled.
Nov 28 01:42:47 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Nov 28 01:42:47 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Nov 28 01:42:47 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Nov 28 01:42:47 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Nov 28 01:42:47 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Nov 28 01:42:47 localhost kernel: random: crng init done (trusting CPU's manufacturer)
Nov 28 01:42:47 localhost kernel: Console: colour VGA+ 80x25
Nov 28 01:42:47 localhost kernel: printk: console [tty0] enabled
Nov 28 01:42:47 localhost kernel: printk: console [ttyS0] enabled
Nov 28 01:42:47 localhost kernel: ACPI: Core revision 20211217
Nov 28 01:42:47 localhost kernel: APIC: Switch to symmetric I/O mode setup
Nov 28 01:42:47 localhost kernel: x2apic enabled
Nov 28 01:42:47 localhost kernel: Switched APIC routing to physical x2apic.
Nov 28 01:42:47 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Nov 28 01:42:47 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Nov 28 01:42:47 localhost kernel: pid_max: default: 32768 minimum: 301
Nov 28 01:42:47 localhost kernel: LSM: Security Framework initializing
Nov 28 01:42:47 localhost kernel: Yama: becoming mindful.
Nov 28 01:42:47 localhost kernel: SELinux: Initializing.
Nov 28 01:42:47 localhost kernel: LSM support for eBPF active
Nov 28 01:42:47 localhost kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Nov 28 01:42:47 localhost kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Nov 28 01:42:47 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Nov 28 01:42:47 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Nov 28 01:42:47 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Nov 28 01:42:47 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Nov 28 01:42:47 localhost kernel: Spectre V2 : Mitigation: Retpolines
Nov 28 01:42:47 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Nov 28 01:42:47 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Nov 28 01:42:47 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Nov 28 01:42:47 localhost kernel: RETBleed: Mitigation: untrained return thunk
Nov 28 01:42:47 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Nov 28 01:42:47 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Nov 28 01:42:47 localhost kernel: Freeing SMP alternatives memory: 36K
Nov 28 01:42:47 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Nov 28 01:42:47 localhost kernel: cblist_init_generic: Setting adjustable number of callback queues.
Nov 28 01:42:47 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Nov 28 01:42:47 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Nov 28 01:42:47 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Nov 28 01:42:47 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Nov 28 01:42:47 localhost kernel: ... version: 0
Nov 28 01:42:47 localhost kernel: ... bit width: 48
Nov 28 01:42:47 localhost kernel: ... generic registers: 6
Nov 28 01:42:47 localhost kernel: ... value mask: 0000ffffffffffff
Nov 28 01:42:47 localhost kernel: ... max period: 00007fffffffffff
Nov 28 01:42:47 localhost kernel: ... fixed-purpose events: 0
Nov 28 01:42:47 localhost kernel: ... event mask: 000000000000003f
Nov 28 01:42:47 localhost kernel: rcu: Hierarchical SRCU implementation.
Nov 28 01:42:47 localhost kernel: rcu: 	Max phase no-delay instances is 400.
Nov 28 01:42:47 localhost kernel: smp: Bringing up secondary CPUs ...
Nov 28 01:42:47 localhost kernel: x86: Booting SMP configuration:
Nov 28 01:42:47 localhost kernel: .... node #0, CPUs: #1 #2 #3 #4 #5 #6 #7
Nov 28 01:42:47 localhost kernel: smp: Brought up 1 node, 8 CPUs
Nov 28 01:42:47 localhost kernel: smpboot: Max logical packages: 8
Nov 28 01:42:47 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Nov 28 01:42:47 localhost kernel: node 0 deferred pages initialised in 22ms
Nov 28 01:42:47 localhost kernel: devtmpfs: initialized
Nov 28 01:42:47 localhost kernel: x86/mm: Memory block size: 128MB
Nov 28 01:42:47 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Nov 28 01:42:47 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Nov 28 01:42:47 localhost kernel: pinctrl core: initialized pinctrl subsystem
Nov 28 01:42:47 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Nov 28 01:42:47 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Nov 28 01:42:47 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Nov 28 01:42:47 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Nov 28 01:42:47 localhost kernel: audit: initializing netlink subsys (disabled)
Nov 28 01:42:47 localhost kernel: audit: type=2000 audit(1764312166.498:1): state=initialized audit_enabled=0 res=1
Nov 28 01:42:47 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Nov 28 01:42:47 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Nov 28 01:42:47 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Nov 28 01:42:47 localhost kernel: cpuidle: using governor menu
Nov 28 01:42:47 localhost kernel: HugeTLB: can optimize 4095 vmemmap pages for hugepages-1048576kB
Nov 28 01:42:47 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Nov 28 01:42:47 localhost kernel: PCI: Using configuration type 1 for base access
Nov 28 01:42:47 localhost kernel: PCI: Using configuration type 1 for extended access
Nov 28 01:42:47 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Nov 28 01:42:47 localhost kernel: HugeTLB: can optimize 7 vmemmap pages for hugepages-2048kB
Nov 28 01:42:47 localhost kernel: HugeTLB registered 1.00 GiB page size, pre-allocated 0 pages
Nov 28 01:42:47 localhost kernel: HugeTLB registered 2.00 MiB page size, pre-allocated 0 pages
Nov 28 01:42:47 localhost kernel: cryptd: max_cpu_qlen set to 1000
Nov 28 01:42:47 localhost kernel: ACPI: Added _OSI(Module Device)
Nov 28 01:42:47 localhost kernel: ACPI: Added _OSI(Processor Device)
Nov 28 01:42:47 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Nov 28 01:42:47 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Nov 28 01:42:47 localhost kernel: ACPI: Added _OSI(Linux-Dell-Video)
Nov 28 01:42:47 localhost kernel: ACPI: Added _OSI(Linux-Lenovo-NV-HDMI-Audio)
Nov 28 01:42:47 localhost kernel: ACPI: Added _OSI(Linux-HPI-Hybrid-Graphics)
Nov 28 01:42:47 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Nov 28 01:42:47 localhost kernel: ACPI: Interpreter enabled
Nov 28 01:42:47 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Nov 28 01:42:47 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Nov 28 01:42:47 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Nov 28 01:42:47 localhost kernel: PCI: Using E820 reservations for host bridge windows
Nov 28 01:42:47 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Nov 28 01:42:47 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Nov 28 01:42:47 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Nov 28 01:42:47 localhost kernel: acpiphp: Slot [3] registered
Nov 28 01:42:47 localhost kernel: acpiphp: Slot [4] registered
Nov 28 01:42:47 localhost kernel: acpiphp: Slot [5] registered
Nov 28 01:42:47 localhost kernel: acpiphp: Slot [6] registered
Nov 28 01:42:47 localhost kernel: acpiphp: Slot [7] registered
Nov 28 01:42:47 localhost kernel: acpiphp: Slot [8] registered
Nov 28 01:42:47 localhost kernel: acpiphp: Slot [9] registered
Nov 28 01:42:47 localhost kernel: acpiphp: Slot [10] registered
Nov 28 01:42:47 localhost kernel: acpiphp: Slot [11] registered
Nov 28 01:42:47 localhost kernel: acpiphp: Slot [12] registered
Nov 28 01:42:47 localhost kernel: acpiphp: Slot [13] registered
Nov 28 01:42:47 localhost kernel: acpiphp: Slot [14] registered
Nov 28 01:42:47 localhost kernel: acpiphp: Slot [15] registered
Nov 28 01:42:47 localhost kernel: acpiphp: Slot [16] registered
Nov 28 01:42:47 localhost kernel: acpiphp: Slot [17] registered
Nov 28 01:42:47 localhost kernel: acpiphp: Slot [18] registered
Nov 28 01:42:47 localhost kernel: acpiphp: Slot [19] registered
Nov 28 01:42:47 localhost kernel: acpiphp: Slot [20] registered
Nov 28 01:42:47 localhost kernel: acpiphp: Slot [21] registered
Nov 28 01:42:47 localhost kernel: acpiphp: Slot [22] registered
Nov 28 01:42:47 localhost kernel: acpiphp: Slot [23] registered
Nov 28 01:42:47 localhost kernel: acpiphp: Slot [24] registered
Nov 28 01:42:47 localhost kernel: acpiphp: Slot [25] registered
Nov 28 01:42:47 localhost kernel: acpiphp: Slot [26] registered
Nov 28 01:42:47 localhost kernel: acpiphp: Slot [27] registered
Nov 28 01:42:47 localhost kernel: acpiphp: Slot [28] registered
Nov 28 01:42:47 localhost kernel: acpiphp: Slot [29] registered
Nov 28 01:42:47 localhost kernel: acpiphp: Slot [30] registered
Nov 28 01:42:47 localhost kernel: acpiphp: Slot [31] registered
Nov 28 01:42:47 localhost kernel: PCI host bridge to bus 0000:00
Nov 28 01:42:47 localhost kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Nov 28 01:42:47 localhost kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Nov 28 01:42:47 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Nov 28 01:42:47 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Nov 28 01:42:47 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x440000000-0x4bfffffff window]
Nov 28 01:42:47 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Nov 28 01:42:47 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000
Nov 28 01:42:47 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100
Nov 28 01:42:47 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180
Nov 28 01:42:47 localhost kernel: pci 0000:00:01.1: reg 0x20: [io 0xc140-0xc14f]
Nov 28 01:42:47 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7]
Nov 28 01:42:47 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6]
Nov 28 01:42:47 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177]
Nov 28 01:42:47 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376]
Nov 28 01:42:47 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300
Nov 28 01:42:47 localhost kernel: pci 0000:00:01.2: reg 0x20: [io 0xc100-0xc11f]
Nov 28 01:42:47 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000
Nov 28 01:42:47 localhost kernel: pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI
Nov 28 01:42:47 localhost kernel: pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB
Nov 28 01:42:47 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000
Nov 28 01:42:47 localhost kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref]
Nov 28 01:42:47 localhost kernel: pci 0000:00:02.0: reg 0x18: [mem 0xfe800000-0xfe803fff 64bit pref]
Nov 28 01:42:47 localhost kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff]
Nov 28 01:42:47 localhost kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref]
Nov 28 01:42:47 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Nov 28 01:42:47 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Nov 28 01:42:47 localhost kernel: pci 0000:00:03.0: reg 0x10: [io 0xc080-0xc0bf]
Nov 28 01:42:47 localhost kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfeb91000-0xfeb91fff]
Nov 28 01:42:47 localhost kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfe804000-0xfe807fff 64bit pref]
Nov 28 01:42:47 localhost kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref]
Nov 28 01:42:47 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Nov 28 01:42:47 localhost kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f]
Nov 28 01:42:47 localhost kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff]
Nov 28 01:42:47 localhost kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfe808000-0xfe80bfff 64bit pref]
Nov 28 01:42:47 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00
Nov 28 01:42:47 localhost kernel: pci 0000:00:05.0: reg 0x10: [io 0xc0c0-0xc0ff]
Nov 28 01:42:47 localhost kernel: pci 0000:00:05.0: reg 0x20: [mem 0xfe80c000-0xfe80ffff 64bit pref]
Nov 28 01:42:47 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00
Nov 28 01:42:47 localhost kernel: pci 0000:00:06.0: reg 0x10: [io 0xc120-0xc13f]
Nov 28 01:42:47 localhost kernel: pci 0000:00:06.0: reg 0x20: [mem 0xfe810000-0xfe813fff 64bit pref]
Nov 28 01:42:47 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Nov 28 01:42:47 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Nov 28 01:42:47 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Nov 28 01:42:47 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Nov 28 01:42:47 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Nov 28 01:42:47 localhost kernel: iommu: Default domain type: Translated
Nov 28 01:42:47 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Nov 28 01:42:47 localhost kernel: SCSI subsystem initialized
Nov 28 01:42:47 localhost kernel: ACPI: bus type USB registered
Nov 28 01:42:47 localhost kernel: usbcore: registered new interface driver usbfs
Nov 28 01:42:47 localhost kernel: usbcore: registered new interface driver hub
Nov 28 01:42:47 localhost kernel: usbcore: registered new device driver usb
Nov 28 01:42:47 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Nov 28 01:42:47 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Nov 28 01:42:47 localhost kernel: PTP clock support registered
Nov 28 01:42:47 localhost kernel: EDAC MC: Ver: 3.0.0
Nov 28 01:42:47 localhost kernel: NetLabel: Initializing
Nov 28 01:42:47 localhost kernel: NetLabel: domain hash size = 128
Nov 28 01:42:47 localhost kernel: NetLabel: protocols = UNLABELED CIPSOv4 CALIPSO
Nov 28 01:42:47 localhost kernel: NetLabel: unlabeled traffic allowed by default
Nov 28 01:42:47 localhost kernel: PCI: Using ACPI for IRQ routing
Nov 28 01:42:47 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Nov 28 01:42:47 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Nov 28 01:42:47 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Nov 28 01:42:47 localhost kernel: vgaarb: loaded
Nov 28 01:42:47 localhost kernel: clocksource: Switched to clocksource kvm-clock
Nov 28 01:42:47 localhost kernel: VFS: Disk quotas dquot_6.6.0
Nov 28 01:42:47 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Nov 28 01:42:47 localhost kernel: pnp: PnP ACPI init
Nov 28 01:42:47 localhost kernel: pnp: PnP ACPI: found 5 devices
Nov 28 01:42:47 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Nov 28 01:42:47 localhost kernel: NET: Registered PF_INET protocol family
Nov 28 01:42:47 localhost kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Nov 28 01:42:47 localhost kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear)
Nov 28 01:42:47 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Nov 28 01:42:47 localhost kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Nov 28 01:42:47 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Nov 28 01:42:47 localhost kernel: TCP: Hash tables configured (established 131072 bind 65536)
Nov 28 01:42:47 localhost kernel: MPTCP token hash table entries: 16384 (order: 6, 393216 bytes, linear)
Nov 28 01:42:47 localhost kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear)
Nov 28 01:42:47 localhost kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear)
Nov 28 01:42:47 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Nov 28 01:42:47 localhost kernel: NET: Registered PF_XDP protocol family
Nov 28 01:42:47 localhost kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Nov 28 01:42:47 localhost kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Nov 28 01:42:47 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Nov 28 01:42:47 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Nov 28 01:42:47 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x440000000-0x4bfffffff window]
Nov 28 01:42:47 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Nov 28 01:42:47 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Nov 28 01:42:47 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Nov 28 01:42:47 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 26823 usecs
Nov 28 01:42:47 localhost kernel: PCI: CLS 0 bytes, default 64
Nov 28 01:42:47 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Nov 28 01:42:47 localhost kernel: Trying to unpack rootfs image as initramfs...
Nov 28 01:42:47 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Nov 28 01:42:47 localhost kernel: ACPI: bus type thunderbolt registered
Nov 28 01:42:47 localhost kernel: Initialise system trusted keyrings
Nov 28 01:42:47 localhost kernel: Key type blacklist registered
Nov 28 01:42:47 localhost kernel: workingset: timestamp_bits=36 max_order=22 bucket_order=0
Nov 28 01:42:47 localhost kernel: zbud: loaded
Nov 28 01:42:47 localhost kernel: integrity: Platform Keyring initialized
Nov 28 01:42:47 localhost kernel: NET: Registered PF_ALG protocol family
Nov 28 01:42:47 localhost kernel: xor: automatically using best checksumming function avx
Nov 28 01:42:47 localhost kernel: Key type asymmetric registered
Nov 28 01:42:47 localhost kernel: Asymmetric key parser 'x509' registered
Nov 28 01:42:47 localhost kernel: Running certificate verification selftests
Nov 28 01:42:47 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Nov 28 01:42:47 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Nov 28 01:42:47 localhost kernel: io scheduler mq-deadline registered
Nov 28 01:42:47 localhost kernel: io scheduler kyber registered
Nov 28 01:42:47 localhost kernel: io scheduler bfq registered
Nov 28 01:42:47 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Nov 28 01:42:47 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Nov 28 01:42:47 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Nov 28 01:42:47 localhost kernel: ACPI: button: Power Button [PWRF]
Nov 28 01:42:47 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Nov 28 01:42:47 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Nov 28 01:42:47 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Nov 28 01:42:47 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Nov 28 01:42:47 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Nov 28 01:42:47 localhost kernel: Non-volatile memory driver v1.3
Nov 28 01:42:47 localhost kernel: rdac: device handler registered
Nov 28 01:42:47 localhost kernel: hp_sw: device handler registered
Nov 28 01:42:47 localhost kernel: emc: device handler registered
Nov 28 01:42:47 localhost kernel: alua: device handler registered
Nov 28 01:42:47 localhost kernel: libphy: Fixed MDIO Bus: probed
Nov 28 01:42:47 localhost kernel: ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver
Nov 28 01:42:47 localhost kernel: ehci-pci: EHCI PCI platform driver
Nov 28 01:42:47 localhost kernel: ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver
Nov 28 01:42:47 localhost kernel: ohci-pci: OHCI PCI platform driver
Nov 28 01:42:47 localhost kernel: uhci_hcd: USB Universal Host Controller Interface driver
Nov 28 01:42:47 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Nov 28 01:42:47 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Nov 28 01:42:47 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Nov 28 01:42:47 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Nov 28 01:42:47 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Nov 28 01:42:47 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Nov 28 01:42:47 localhost kernel: usb usb1: Product: UHCI Host Controller
Nov 28 01:42:47 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-284.11.1.el9_2.x86_64 uhci_hcd
Nov 28 01:42:47 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Nov 28 01:42:47 localhost kernel: hub 1-0:1.0: USB hub found
Nov 28 01:42:47 localhost kernel: hub 1-0:1.0: 2 ports detected
Nov 28 01:42:47 localhost kernel: usbcore: registered new interface driver usbserial_generic
Nov 28 01:42:47 localhost kernel: usbserial: USB Serial support registered for generic
Nov 28 01:42:47 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Nov 28 01:42:47 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Nov 28 01:42:47 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Nov 28 01:42:47 localhost kernel: mousedev: PS/2 mouse device common for all mice
Nov 28 01:42:47 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Nov 28 01:42:47 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Nov 28 01:42:47 localhost kernel: rtc_cmos 00:04: registered as rtc0
Nov 28 01:42:47 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Nov 28 01:42:47 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-11-28T06:42:46 UTC (1764312166)
Nov 28 01:42:47 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Nov 28 01:42:47 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Nov 28 01:42:47 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Nov 28 01:42:47 localhost kernel: usbcore: registered new interface driver usbhid
Nov 28 01:42:47 localhost kernel: usbhid: USB HID core driver
Nov 28 01:42:47 localhost kernel: drop_monitor: Initializing network drop monitor service
Nov 28 01:42:47 localhost kernel: Initializing XFRM netlink socket
Nov 28 01:42:47 localhost kernel: NET: Registered PF_INET6 protocol family
Nov 28 01:42:47 localhost kernel: Segment Routing with IPv6
Nov 28 01:42:47 localhost kernel: NET: Registered PF_PACKET protocol family
Nov 28 01:42:47 localhost kernel: mpls_gso: MPLS GSO support
Nov 28 01:42:47 localhost kernel: IPI shorthand broadcast: enabled
Nov 28 01:42:47 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Nov 28 01:42:47 localhost kernel: AES CTR mode by8 optimization enabled Nov 28 01:42:47 localhost kernel: sched_clock: Marking stable (713034875, 173887832)->(1009444400, -122521693) Nov 28 01:42:47 localhost kernel: registered taskstats version 1 Nov 28 01:42:47 localhost kernel: Loading compiled-in X.509 certificates Nov 28 01:42:47 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72' Nov 28 01:42:47 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80' Nov 28 01:42:47 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8' Nov 28 01:42:47 localhost kernel: zswap: loaded using pool lzo/zbud Nov 28 01:42:47 localhost kernel: page_owner is disabled Nov 28 01:42:47 localhost kernel: Key type big_key registered Nov 28 01:42:47 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd Nov 28 01:42:47 localhost kernel: Freeing initrd memory: 74232K Nov 28 01:42:47 localhost kernel: Key type encrypted registered Nov 28 01:42:47 localhost kernel: ima: No TPM chip found, activating TPM-bypass! 
Nov 28 01:42:47 localhost kernel: Loading compiled-in module X.509 certificates Nov 28 01:42:47 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72' Nov 28 01:42:47 localhost kernel: ima: Allocated hash algorithm: sha256 Nov 28 01:42:47 localhost kernel: ima: No architecture policies found Nov 28 01:42:47 localhost kernel: evm: Initialising EVM extended attributes: Nov 28 01:42:47 localhost kernel: evm: security.selinux Nov 28 01:42:47 localhost kernel: evm: security.SMACK64 (disabled) Nov 28 01:42:47 localhost kernel: evm: security.SMACK64EXEC (disabled) Nov 28 01:42:47 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled) Nov 28 01:42:47 localhost kernel: evm: security.SMACK64MMAP (disabled) Nov 28 01:42:47 localhost kernel: evm: security.apparmor (disabled) Nov 28 01:42:47 localhost kernel: evm: security.ima Nov 28 01:42:47 localhost kernel: evm: security.capability Nov 28 01:42:47 localhost kernel: evm: HMAC attrs: 0x1 Nov 28 01:42:47 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00 Nov 28 01:42:47 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10 Nov 28 01:42:47 localhost kernel: usb 1-1: Product: QEMU USB Tablet Nov 28 01:42:47 localhost kernel: usb 1-1: Manufacturer: QEMU Nov 28 01:42:47 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1 Nov 28 01:42:47 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5 Nov 28 01:42:47 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0 Nov 28 01:42:47 localhost kernel: Freeing unused decrypted memory: 2036K Nov 28 01:42:47 localhost kernel: Freeing unused kernel image (initmem) memory: 2792K Nov 28 01:42:47 localhost kernel: Write protecting the kernel read-only data: 26624k Nov 28 01:42:47 
localhost kernel: Freeing unused kernel image (text/rodata gap) memory: 2040K Nov 28 01:42:47 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 60K Nov 28 01:42:47 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found. Nov 28 01:42:47 localhost kernel: Run /init as init process Nov 28 01:42:47 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified) Nov 28 01:42:47 localhost systemd[1]: Detected virtualization kvm. Nov 28 01:42:47 localhost systemd[1]: Detected architecture x86-64. Nov 28 01:42:47 localhost systemd[1]: Running in initrd. Nov 28 01:42:47 localhost systemd[1]: No hostname configured, using default hostname. Nov 28 01:42:47 localhost systemd[1]: Hostname set to . Nov 28 01:42:47 localhost systemd[1]: Initializing machine ID from VM UUID. Nov 28 01:42:47 localhost systemd[1]: Queued start job for default target Initrd Default Target. Nov 28 01:42:47 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch. Nov 28 01:42:47 localhost systemd[1]: Reached target Local Encrypted Volumes. Nov 28 01:42:47 localhost systemd[1]: Reached target Initrd /usr File System. Nov 28 01:42:47 localhost systemd[1]: Reached target Local File Systems. Nov 28 01:42:47 localhost systemd[1]: Reached target Path Units. Nov 28 01:42:47 localhost systemd[1]: Reached target Slice Units. Nov 28 01:42:47 localhost systemd[1]: Reached target Swaps. Nov 28 01:42:47 localhost systemd[1]: Reached target Timer Units. Nov 28 01:42:47 localhost systemd[1]: Listening on D-Bus System Message Bus Socket. Nov 28 01:42:47 localhost systemd[1]: Listening on Journal Socket (/dev/log). 
Nov 28 01:42:47 localhost systemd[1]: Listening on Journal Socket. Nov 28 01:42:47 localhost systemd[1]: Listening on udev Control Socket. Nov 28 01:42:47 localhost systemd[1]: Listening on udev Kernel Socket. Nov 28 01:42:47 localhost systemd[1]: Reached target Socket Units. Nov 28 01:42:47 localhost systemd[1]: Starting Create List of Static Device Nodes... Nov 28 01:42:47 localhost systemd[1]: Starting Journal Service... Nov 28 01:42:47 localhost systemd[1]: Starting Load Kernel Modules... Nov 28 01:42:47 localhost systemd[1]: Starting Create System Users... Nov 28 01:42:47 localhost systemd[1]: Starting Setup Virtual Console... Nov 28 01:42:47 localhost systemd[1]: Finished Create List of Static Device Nodes. Nov 28 01:42:47 localhost systemd[1]: Finished Load Kernel Modules. Nov 28 01:42:47 localhost systemd-journald[284]: Journal started Nov 28 01:42:47 localhost systemd-journald[284]: Runtime Journal (/run/log/journal/eb468aede0e94528988f9267a3530b7a) is 8.0M, max 314.7M, 306.7M free. Nov 28 01:42:47 localhost systemd-modules-load[285]: Module 'msr' is built in Nov 28 01:42:47 localhost systemd[1]: Started Journal Service. Nov 28 01:42:47 localhost systemd[1]: Finished Setup Virtual Console. Nov 28 01:42:47 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met. Nov 28 01:42:47 localhost systemd[1]: Starting dracut cmdline hook... Nov 28 01:42:47 localhost systemd[1]: Starting Apply Kernel Variables... Nov 28 01:42:47 localhost systemd-sysusers[286]: Creating group 'sgx' with GID 997. Nov 28 01:42:47 localhost systemd-sysusers[286]: Creating group 'users' with GID 100. Nov 28 01:42:47 localhost systemd-sysusers[286]: Creating group 'dbus' with GID 81. Nov 28 01:42:47 localhost systemd-sysusers[286]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81. Nov 28 01:42:47 localhost systemd[1]: Finished Apply Kernel Variables. 
Nov 28 01:42:47 localhost systemd[1]: Finished Create System Users. Nov 28 01:42:47 localhost systemd[1]: Starting Create Static Device Nodes in /dev... Nov 28 01:42:47 localhost systemd[1]: Starting Create Volatile Files and Directories... Nov 28 01:42:47 localhost dracut-cmdline[289]: dracut-9.2 (Plow) dracut-057-21.git20230214.el9 Nov 28 01:42:47 localhost systemd[1]: Finished Create Static Device Nodes in /dev. Nov 28 01:42:47 localhost systemd[1]: Finished Create Volatile Files and Directories. Nov 28 01:42:47 localhost dracut-cmdline[289]: Using kernel command line parameters: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M Nov 28 01:42:47 localhost systemd[1]: Finished dracut cmdline hook. Nov 28 01:42:47 localhost systemd[1]: Starting dracut pre-udev hook... Nov 28 01:42:47 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Nov 28 01:42:47 localhost kernel: device-mapper: uevent: version 1.0.3 Nov 28 01:42:47 localhost kernel: device-mapper: ioctl: 4.47.0-ioctl (2022-07-28) initialised: dm-devel@redhat.com Nov 28 01:42:47 localhost kernel: RPC: Registered named UNIX socket transport module. Nov 28 01:42:47 localhost kernel: RPC: Registered udp transport module. Nov 28 01:42:47 localhost kernel: RPC: Registered tcp transport module. Nov 28 01:42:47 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module. Nov 28 01:42:47 localhost rpc.statd[408]: Version 2.5.4 starting Nov 28 01:42:47 localhost rpc.statd[408]: Initializing NSM state Nov 28 01:42:47 localhost rpc.idmapd[413]: Setting log level to 0 Nov 28 01:42:47 localhost systemd[1]: Finished dracut pre-udev hook. Nov 28 01:42:47 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files... 
Nov 28 01:42:47 localhost systemd-udevd[426]: Using default interface naming scheme 'rhel-9.0'. Nov 28 01:42:47 localhost systemd[1]: Started Rule-based Manager for Device Events and Files. Nov 28 01:42:47 localhost systemd[1]: Starting dracut pre-trigger hook... Nov 28 01:42:47 localhost systemd[1]: Finished dracut pre-trigger hook. Nov 28 01:42:47 localhost systemd[1]: Starting Coldplug All udev Devices... Nov 28 01:42:47 localhost systemd[1]: Finished Coldplug All udev Devices. Nov 28 01:42:47 localhost systemd[1]: Reached target System Initialization. Nov 28 01:42:47 localhost systemd[1]: Reached target Basic System. Nov 28 01:42:47 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet). Nov 28 01:42:47 localhost systemd[1]: Reached target Network. Nov 28 01:42:47 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet). Nov 28 01:42:47 localhost systemd[1]: Starting dracut initqueue hook... Nov 28 01:42:47 localhost kernel: virtio_blk virtio2: [vda] 838860800 512-byte logical blocks (429 GB/400 GiB) Nov 28 01:42:48 localhost kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Nov 28 01:42:48 localhost kernel: GPT:20971519 != 838860799 Nov 28 01:42:48 localhost kernel: GPT:Alternate GPT header not at the end of the disk. Nov 28 01:42:48 localhost kernel: GPT:20971519 != 838860799 Nov 28 01:42:48 localhost kernel: GPT: Use GNU Parted to correct GPT errors. 
Nov 28 01:42:48 localhost kernel: vda: vda1 vda2 vda3 vda4 Nov 28 01:42:48 localhost kernel: scsi host0: ata_piix Nov 28 01:42:48 localhost kernel: scsi host1: ata_piix Nov 28 01:42:48 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 Nov 28 01:42:48 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 Nov 28 01:42:48 localhost systemd-udevd[471]: Network interface NamePolicy= disabled on kernel command line. Nov 28 01:42:48 localhost systemd[1]: Found device /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a. Nov 28 01:42:48 localhost systemd[1]: Reached target Initrd Root Device. Nov 28 01:42:48 localhost kernel: ata1: found unknown device (class 0) Nov 28 01:42:48 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Nov 28 01:42:48 localhost kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Nov 28 01:42:48 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5 Nov 28 01:42:48 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Nov 28 01:42:48 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Nov 28 01:42:48 localhost systemd[1]: Finished dracut initqueue hook. Nov 28 01:42:48 localhost systemd[1]: Reached target Preparation for Remote File Systems. Nov 28 01:42:48 localhost systemd[1]: Reached target Remote Encrypted Volumes. Nov 28 01:42:48 localhost systemd[1]: Reached target Remote File Systems. Nov 28 01:42:48 localhost systemd[1]: Starting dracut pre-mount hook... Nov 28 01:42:48 localhost systemd[1]: Finished dracut pre-mount hook. Nov 28 01:42:48 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a... Nov 28 01:42:48 localhost systemd-fsck[512]: /usr/sbin/fsck.xfs: XFS file system. Nov 28 01:42:48 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a. Nov 28 01:42:48 localhost systemd[1]: Mounting /sysroot... 
Nov 28 01:42:48 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled Nov 28 01:42:48 localhost kernel: XFS (vda4): Mounting V5 Filesystem Nov 28 01:42:48 localhost kernel: XFS (vda4): Ending clean mount Nov 28 01:42:48 localhost systemd[1]: Mounted /sysroot. Nov 28 01:42:48 localhost systemd[1]: Reached target Initrd Root File System. Nov 28 01:42:48 localhost systemd[1]: Starting Mountpoints Configured in the Real Root... Nov 28 01:42:48 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully. Nov 28 01:42:48 localhost systemd[1]: Finished Mountpoints Configured in the Real Root. Nov 28 01:42:48 localhost systemd[1]: Reached target Initrd File Systems. Nov 28 01:42:48 localhost systemd[1]: Reached target Initrd Default Target. Nov 28 01:42:48 localhost systemd[1]: Starting dracut mount hook... Nov 28 01:42:48 localhost systemd[1]: Finished dracut mount hook. Nov 28 01:42:48 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook... Nov 28 01:42:48 localhost rpc.idmapd[413]: exiting on signal 15 Nov 28 01:42:48 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully. Nov 28 01:42:48 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook. Nov 28 01:42:48 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons... Nov 28 01:42:48 localhost systemd[1]: Stopped target Network. Nov 28 01:42:48 localhost systemd[1]: Stopped target Remote Encrypted Volumes. Nov 28 01:42:48 localhost systemd[1]: Stopped target Timer Units. Nov 28 01:42:48 localhost systemd[1]: dbus.socket: Deactivated successfully. Nov 28 01:42:48 localhost systemd[1]: Closed D-Bus System Message Bus Socket. Nov 28 01:42:48 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Nov 28 01:42:48 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook. Nov 28 01:42:48 localhost systemd[1]: Stopped target Initrd Default Target. 
Nov 28 01:42:48 localhost systemd[1]: Stopped target Basic System. Nov 28 01:42:48 localhost systemd[1]: Stopped target Initrd Root Device. Nov 28 01:42:48 localhost systemd[1]: Stopped target Initrd /usr File System. Nov 28 01:42:48 localhost systemd[1]: Stopped target Path Units. Nov 28 01:42:48 localhost systemd[1]: Stopped target Remote File Systems. Nov 28 01:42:48 localhost systemd[1]: Stopped target Preparation for Remote File Systems. Nov 28 01:42:48 localhost systemd[1]: Stopped target Slice Units. Nov 28 01:42:48 localhost systemd[1]: Stopped target Socket Units. Nov 28 01:42:48 localhost systemd[1]: Stopped target System Initialization. Nov 28 01:42:48 localhost systemd[1]: Stopped target Local File Systems. Nov 28 01:42:48 localhost systemd[1]: Stopped target Swaps. Nov 28 01:42:48 localhost systemd[1]: dracut-mount.service: Deactivated successfully. Nov 28 01:42:48 localhost systemd[1]: Stopped dracut mount hook. Nov 28 01:42:48 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully. Nov 28 01:42:48 localhost systemd[1]: Stopped dracut pre-mount hook. Nov 28 01:42:48 localhost systemd[1]: Stopped target Local Encrypted Volumes. Nov 28 01:42:48 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Nov 28 01:42:48 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch. Nov 28 01:42:48 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully. Nov 28 01:42:48 localhost systemd[1]: Stopped dracut initqueue hook. Nov 28 01:42:48 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully. Nov 28 01:42:48 localhost systemd[1]: Stopped Apply Kernel Variables. Nov 28 01:42:48 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully. Nov 28 01:42:48 localhost systemd[1]: Stopped Load Kernel Modules. Nov 28 01:42:48 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. 
Nov 28 01:42:48 localhost systemd[1]: Stopped Create Volatile Files and Directories. Nov 28 01:42:48 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Nov 28 01:42:48 localhost systemd[1]: Stopped Coldplug All udev Devices. Nov 28 01:42:48 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Nov 28 01:42:48 localhost systemd[1]: Stopped dracut pre-trigger hook. Nov 28 01:42:48 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files... Nov 28 01:42:48 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Nov 28 01:42:48 localhost systemd[1]: Stopped Setup Virtual Console. Nov 28 01:42:48 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Nov 28 01:42:48 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Nov 28 01:42:48 localhost systemd[1]: systemd-udevd.service: Deactivated successfully. Nov 28 01:42:48 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files. Nov 28 01:42:48 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Nov 28 01:42:48 localhost systemd[1]: Closed udev Control Socket. Nov 28 01:42:48 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Nov 28 01:42:48 localhost systemd[1]: Closed udev Kernel Socket. Nov 28 01:42:48 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully. Nov 28 01:42:48 localhost systemd[1]: Stopped dracut pre-udev hook. Nov 28 01:42:48 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully. Nov 28 01:42:48 localhost systemd[1]: Stopped dracut cmdline hook. Nov 28 01:42:48 localhost systemd[1]: Starting Cleanup udev Database... Nov 28 01:42:48 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Nov 28 01:42:49 localhost systemd[1]: Stopped Create Static Device Nodes in /dev. 
Nov 28 01:42:49 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully. Nov 28 01:42:49 localhost systemd[1]: Stopped Create List of Static Device Nodes. Nov 28 01:42:49 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully. Nov 28 01:42:49 localhost systemd[1]: Stopped Create System Users. Nov 28 01:42:49 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully. Nov 28 01:42:49 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons. Nov 28 01:42:49 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Nov 28 01:42:49 localhost systemd[1]: Finished Cleanup udev Database. Nov 28 01:42:49 localhost systemd[1]: Reached target Switch Root. Nov 28 01:42:49 localhost systemd[1]: Starting Switch Root... Nov 28 01:42:49 localhost systemd[1]: Switching root. Nov 28 01:42:49 localhost systemd-journald[284]: Journal stopped Nov 28 01:42:49 localhost systemd-journald[284]: Received SIGTERM from PID 1 (systemd). Nov 28 01:42:49 localhost kernel: audit: type=1404 audit(1764312169.104:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1 Nov 28 01:42:49 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 28 01:42:49 localhost kernel: SELinux: policy capability open_perms=1 Nov 28 01:42:49 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 28 01:42:49 localhost kernel: SELinux: policy capability always_check_network=0 Nov 28 01:42:49 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 28 01:42:49 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 28 01:42:49 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 28 01:42:49 localhost kernel: audit: type=1403 audit(1764312169.185:3): auid=4294967295 ses=4294967295 lsm=selinux res=1 Nov 28 01:42:49 localhost systemd[1]: Successfully loaded SELinux policy in 83.239ms. 
Nov 28 01:42:49 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 22.636ms. Nov 28 01:42:49 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified) Nov 28 01:42:49 localhost systemd[1]: Detected virtualization kvm. Nov 28 01:42:49 localhost systemd[1]: Detected architecture x86-64. Nov 28 01:42:49 localhost systemd-rc-local-generator[582]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 01:42:49 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 01:42:49 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully. Nov 28 01:42:49 localhost systemd[1]: Stopped Switch Root. Nov 28 01:42:49 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Nov 28 01:42:49 localhost systemd[1]: Created slice Slice /system/getty. Nov 28 01:42:49 localhost systemd[1]: Created slice Slice /system/modprobe. Nov 28 01:42:49 localhost systemd[1]: Created slice Slice /system/serial-getty. Nov 28 01:42:49 localhost systemd[1]: Created slice Slice /system/sshd-keygen. Nov 28 01:42:49 localhost systemd[1]: Created slice Slice /system/systemd-fsck. Nov 28 01:42:49 localhost systemd[1]: Created slice User and Session Slice. Nov 28 01:42:49 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch. Nov 28 01:42:49 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch. Nov 28 01:42:49 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point. 
Nov 28 01:42:49 localhost systemd[1]: Reached target Local Encrypted Volumes. Nov 28 01:42:49 localhost systemd[1]: Stopped target Switch Root. Nov 28 01:42:49 localhost systemd[1]: Stopped target Initrd File Systems. Nov 28 01:42:49 localhost systemd[1]: Stopped target Initrd Root File System. Nov 28 01:42:49 localhost systemd[1]: Reached target Local Integrity Protected Volumes. Nov 28 01:42:49 localhost systemd[1]: Reached target Path Units. Nov 28 01:42:49 localhost systemd[1]: Reached target rpc_pipefs.target. Nov 28 01:42:49 localhost systemd[1]: Reached target Slice Units. Nov 28 01:42:49 localhost systemd[1]: Reached target Swaps. Nov 28 01:42:49 localhost systemd[1]: Reached target Local Verity Protected Volumes. Nov 28 01:42:49 localhost systemd[1]: Listening on RPCbind Server Activation Socket. Nov 28 01:42:49 localhost systemd[1]: Reached target RPC Port Mapper. Nov 28 01:42:49 localhost systemd[1]: Listening on Process Core Dump Socket. Nov 28 01:42:49 localhost systemd[1]: Listening on initctl Compatibility Named Pipe. Nov 28 01:42:49 localhost systemd[1]: Listening on udev Control Socket. Nov 28 01:42:49 localhost systemd[1]: Listening on udev Kernel Socket. Nov 28 01:42:49 localhost systemd[1]: Mounting Huge Pages File System... Nov 28 01:42:49 localhost systemd[1]: Mounting POSIX Message Queue File System... Nov 28 01:42:49 localhost systemd[1]: Mounting Kernel Debug File System... Nov 28 01:42:49 localhost systemd[1]: Mounting Kernel Trace File System... Nov 28 01:42:49 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab). Nov 28 01:42:49 localhost systemd[1]: Starting Create List of Static Device Nodes... Nov 28 01:42:49 localhost systemd[1]: Starting Load Kernel Module configfs... Nov 28 01:42:49 localhost systemd[1]: Starting Load Kernel Module drm... Nov 28 01:42:49 localhost systemd[1]: Starting Load Kernel Module fuse... 
Nov 28 01:42:49 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network... Nov 28 01:42:49 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully. Nov 28 01:42:49 localhost systemd[1]: Stopped File System Check on Root Device. Nov 28 01:42:49 localhost systemd[1]: Stopped Journal Service. Nov 28 01:42:49 localhost kernel: fuse: init (API version 7.36) Nov 28 01:42:49 localhost systemd[1]: Starting Journal Service... Nov 28 01:42:49 localhost systemd[1]: Starting Load Kernel Modules... Nov 28 01:42:49 localhost systemd[1]: Starting Generate network units from Kernel command line... Nov 28 01:42:49 localhost kernel: ACPI: bus type drm_connector registered Nov 28 01:42:49 localhost systemd[1]: Starting Remount Root and Kernel File Systems... Nov 28 01:42:49 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met. Nov 28 01:42:49 localhost systemd[1]: Starting Coldplug All udev Devices... Nov 28 01:42:49 localhost systemd-journald[618]: Journal started Nov 28 01:42:49 localhost systemd-journald[618]: Runtime Journal (/run/log/journal/5cd59ba25ae47acac865224fa46a5f9e) is 8.0M, max 314.7M, 306.7M free. Nov 28 01:42:49 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff) Nov 28 01:42:49 localhost systemd[1]: Queued start job for default target Multi-User System. Nov 28 01:42:49 localhost systemd[1]: systemd-journald.service: Deactivated successfully. Nov 28 01:42:49 localhost systemd-modules-load[619]: Module 'msr' is built in Nov 28 01:42:49 localhost systemd[1]: Started Journal Service. Nov 28 01:42:49 localhost systemd[1]: Mounted Huge Pages File System. Nov 28 01:42:49 localhost systemd[1]: Mounted POSIX Message Queue File System. Nov 28 01:42:49 localhost systemd[1]: Mounted Kernel Debug File System. Nov 28 01:42:49 localhost systemd[1]: Mounted Kernel Trace File System. 
Nov 28 01:42:49 localhost systemd[1]: Finished Create List of Static Device Nodes. Nov 28 01:42:49 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully. Nov 28 01:42:49 localhost systemd[1]: Finished Load Kernel Module configfs. Nov 28 01:42:49 localhost systemd[1]: modprobe@drm.service: Deactivated successfully. Nov 28 01:42:49 localhost systemd[1]: Finished Load Kernel Module drm. Nov 28 01:42:49 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully. Nov 28 01:42:49 localhost systemd[1]: Finished Load Kernel Module fuse. Nov 28 01:42:49 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network. Nov 28 01:42:49 localhost systemd[1]: Finished Load Kernel Modules. Nov 28 01:42:49 localhost systemd[1]: Finished Generate network units from Kernel command line. Nov 28 01:42:49 localhost systemd[1]: Finished Remount Root and Kernel File Systems. Nov 28 01:42:49 localhost systemd[1]: Mounting FUSE Control File System... Nov 28 01:42:49 localhost systemd[1]: Mounting Kernel Configuration File System... Nov 28 01:42:49 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes). Nov 28 01:42:49 localhost systemd[1]: Starting Rebuild Hardware Database... Nov 28 01:42:49 localhost systemd[1]: Starting Flush Journal to Persistent Storage... Nov 28 01:42:49 localhost systemd[1]: Starting Load/Save Random Seed... Nov 28 01:42:49 localhost systemd[1]: Starting Apply Kernel Variables... Nov 28 01:42:49 localhost systemd-journald[618]: Runtime Journal (/run/log/journal/5cd59ba25ae47acac865224fa46a5f9e) is 8.0M, max 314.7M, 306.7M free. Nov 28 01:42:49 localhost systemd-journald[618]: Received client request to flush runtime journal. Nov 28 01:42:49 localhost systemd[1]: Starting Create System Users... Nov 28 01:42:49 localhost systemd[1]: Finished Coldplug All udev Devices. Nov 28 01:42:49 localhost systemd[1]: Mounted FUSE Control File System. 
Nov 28 01:42:49 localhost systemd[1]: Mounted Kernel Configuration File System. Nov 28 01:42:49 localhost systemd[1]: Finished Flush Journal to Persistent Storage. Nov 28 01:42:49 localhost systemd[1]: Finished Load/Save Random Seed. Nov 28 01:42:49 localhost systemd[1]: Finished Apply Kernel Variables. Nov 28 01:42:49 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes). Nov 28 01:42:49 localhost systemd-sysusers[631]: Creating group 'sgx' with GID 989. Nov 28 01:42:49 localhost systemd-sysusers[631]: Creating group 'systemd-oom' with GID 988. Nov 28 01:42:49 localhost systemd-sysusers[631]: Creating user 'systemd-oom' (systemd Userspace OOM Killer) with UID 988 and GID 988. Nov 28 01:42:49 localhost systemd[1]: Finished Create System Users. Nov 28 01:42:49 localhost systemd[1]: Starting Create Static Device Nodes in /dev... Nov 28 01:42:49 localhost systemd[1]: Finished Create Static Device Nodes in /dev. Nov 28 01:42:49 localhost systemd[1]: Reached target Preparation for Local File Systems. Nov 28 01:42:49 localhost systemd[1]: Set up automount EFI System Partition Automount. Nov 28 01:42:50 localhost systemd[1]: Finished Rebuild Hardware Database. Nov 28 01:42:50 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files... Nov 28 01:42:50 localhost systemd-udevd[635]: Using default interface naming scheme 'rhel-9.0'. Nov 28 01:42:50 localhost systemd[1]: Started Rule-based Manager for Device Events and Files. Nov 28 01:42:50 localhost systemd[1]: Starting Load Kernel Module configfs... Nov 28 01:42:50 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully. Nov 28 01:42:50 localhost systemd[1]: Finished Load Kernel Module configfs. Nov 28 01:42:50 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped. Nov 28 01:42:50 localhost systemd-udevd[651]: Network interface NamePolicy= disabled on kernel command line. 
Nov 28 01:42:50 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/7B77-95E7 being skipped.
Nov 28 01:42:50 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/7B77-95E7...
Nov 28 01:42:50 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/b141154b-6a70-437a-a97f-d160c9ba37eb being skipped.
Nov 28 01:42:50 localhost systemd[1]: Mounting /boot...
Nov 28 01:42:50 localhost kernel: XFS (vda3): Mounting V5 Filesystem
Nov 28 01:42:50 localhost systemd-fsck[686]: fsck.fat 4.2 (2021-01-31)
Nov 28 01:42:50 localhost systemd-fsck[686]: /dev/vda2: 12 files, 1782/51145 clusters
Nov 28 01:42:50 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/7B77-95E7.
Nov 28 01:42:50 localhost kernel: XFS (vda3): Ending clean mount
Nov 28 01:42:50 localhost kernel: xfs filesystem being mounted at /boot supports timestamps until 2038 (0x7fffffff)
Nov 28 01:42:50 localhost systemd[1]: Mounted /boot.
Nov 28 01:42:50 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Nov 28 01:42:50 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Nov 28 01:42:50 localhost kernel: SVM: TSC scaling supported
Nov 28 01:42:50 localhost kernel: kvm: Nested Virtualization enabled
Nov 28 01:42:50 localhost kernel: SVM: kvm: Nested Paging enabled
Nov 28 01:42:50 localhost kernel: SVM: LBR virtualization supported
Nov 28 01:42:50 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Nov 28 01:42:50 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Nov 28 01:42:50 localhost kernel: Console: switching to colour dummy device 80x25
Nov 28 01:42:50 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Nov 28 01:42:50 localhost kernel: [drm] features: -context_init
Nov 28 01:42:50 localhost kernel: [drm] number of scanouts: 1
Nov 28 01:42:50 localhost kernel: [drm] number of cap sets: 0
Nov 28 01:42:50 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 0 for virtio0 on minor 0
Nov 28 01:42:50 localhost kernel: virtio_gpu virtio0: [drm] drm_plane_enable_fb_damage_clips() not called
Nov 28 01:42:50 localhost kernel: Console: switching to colour frame buffer device 128x48
Nov 28 01:42:50 localhost kernel: virtio_gpu virtio0: [drm] fb0: virtio_gpudrmfb frame buffer device
Nov 28 01:42:50 localhost systemd[1]: Mounting /boot/efi...
Nov 28 01:42:50 localhost systemd[1]: Mounted /boot/efi.
Nov 28 01:42:50 localhost systemd[1]: Reached target Local File Systems.
Nov 28 01:42:50 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Nov 28 01:42:50 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Nov 28 01:42:50 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Nov 28 01:42:50 localhost systemd[1]: Store a System Token in an EFI Variable was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 28 01:42:50 localhost systemd[1]: Starting Automatic Boot Loader Update...
Nov 28 01:42:50 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Nov 28 01:42:50 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 28 01:42:50 localhost systemd[1]: efi.automount: Got automount request for /efi, triggered by 712 (bootctl)
Nov 28 01:42:50 localhost systemd[1]: Starting File System Check on /dev/vda2...
Nov 28 01:42:50 localhost systemd[1]: Finished File System Check on /dev/vda2.
Nov 28 01:42:50 localhost systemd[1]: Mounting EFI System Partition Automount...
Nov 28 01:42:50 localhost systemd[1]: Mounted EFI System Partition Automount.
Nov 28 01:42:50 localhost systemd[1]: Finished Automatic Boot Loader Update.
Nov 28 01:42:50 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 28 01:42:50 localhost systemd[1]: Starting Security Auditing Service...
Nov 28 01:42:50 localhost systemd[1]: Starting RPC Bind...
Nov 28 01:42:50 localhost systemd[1]: Starting Rebuild Journal Catalog...
Nov 28 01:42:50 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Nov 28 01:42:50 localhost auditd[725]: audit dispatcher initialized with q_depth=1200 and 1 active plugins
Nov 28 01:42:50 localhost auditd[725]: Init complete, auditd 3.0.7 listening for events (startup state enable)
Nov 28 01:42:50 localhost systemd[1]: Finished Rebuild Journal Catalog.
Nov 28 01:42:50 localhost systemd[1]: Starting Update is Completed...
Nov 28 01:42:50 localhost systemd[1]: Finished Update is Completed.
Nov 28 01:42:50 localhost systemd[1]: Started RPC Bind.
Nov 28 01:42:50 localhost augenrules[730]: /sbin/augenrules: No change
Nov 28 01:42:50 localhost augenrules[741]: No rules
Nov 28 01:42:50 localhost augenrules[741]: enabled 1
Nov 28 01:42:50 localhost augenrules[741]: failure 1
Nov 28 01:42:50 localhost augenrules[741]: pid 725
Nov 28 01:42:50 localhost augenrules[741]: rate_limit 0
Nov 28 01:42:50 localhost augenrules[741]: backlog_limit 8192
Nov 28 01:42:50 localhost augenrules[741]: lost 0
Nov 28 01:42:50 localhost augenrules[741]: backlog 2
Nov 28 01:42:50 localhost augenrules[741]: backlog_wait_time 60000
Nov 28 01:42:50 localhost augenrules[741]: backlog_wait_time_actual 0
Nov 28 01:42:50 localhost augenrules[741]: enabled 1
Nov 28 01:42:50 localhost augenrules[741]: failure 1
Nov 28 01:42:50 localhost augenrules[741]: pid 725
Nov 28 01:42:50 localhost augenrules[741]: rate_limit 0
Nov 28 01:42:50 localhost augenrules[741]: backlog_limit 8192
Nov 28 01:42:50 localhost augenrules[741]: lost 0
Nov 28 01:42:50 localhost augenrules[741]: backlog 0
Nov 28 01:42:50 localhost augenrules[741]: backlog_wait_time 60000
Nov 28 01:42:50 localhost augenrules[741]: backlog_wait_time_actual 0
Nov 28 01:42:50 localhost augenrules[741]: enabled 1
Nov 28 01:42:50 localhost augenrules[741]: failure 1
Nov 28 01:42:50 localhost augenrules[741]: pid 725
Nov 28 01:42:50 localhost augenrules[741]: rate_limit 0
Nov 28 01:42:50 localhost augenrules[741]: backlog_limit 8192
Nov 28 01:42:50 localhost augenrules[741]: lost 0
Nov 28 01:42:50 localhost augenrules[741]: backlog 0
Nov 28 01:42:50 localhost augenrules[741]: backlog_wait_time 60000
Nov 28 01:42:50 localhost augenrules[741]: backlog_wait_time_actual 0
Nov 28 01:42:50 localhost systemd[1]: Started Security Auditing Service.
Nov 28 01:42:50 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Nov 28 01:42:50 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Nov 28 01:42:50 localhost systemd[1]: Reached target System Initialization.
Nov 28 01:42:50 localhost systemd[1]: Started dnf makecache --timer.
Nov 28 01:42:50 localhost systemd[1]: Started Daily rotation of log files.
Nov 28 01:42:50 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Nov 28 01:42:50 localhost systemd[1]: Reached target Timer Units.
Nov 28 01:42:50 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 28 01:42:50 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Nov 28 01:42:50 localhost systemd[1]: Reached target Socket Units.
Nov 28 01:42:50 localhost systemd[1]: Starting Initial cloud-init job (pre-networking)...
Nov 28 01:42:51 localhost systemd[1]: Starting D-Bus System Message Bus...
Nov 28 01:42:51 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 28 01:42:51 localhost systemd[1]: Started D-Bus System Message Bus.
Nov 28 01:42:51 localhost systemd[1]: Reached target Basic System.
Nov 28 01:42:51 localhost journal[750]: Ready
Nov 28 01:42:51 localhost systemd[1]: Starting NTP client/server...
Nov 28 01:42:51 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Nov 28 01:42:51 localhost systemd[1]: Started irqbalance daemon.
Nov 28 01:42:51 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Nov 28 01:42:51 localhost systemd[1]: Starting System Logging Service...
Nov 28 01:42:51 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 01:42:51 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 01:42:51 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 01:42:51 localhost systemd[1]: Reached target sshd-keygen.target.
Nov 28 01:42:51 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Nov 28 01:42:51 localhost systemd[1]: Reached target User and Group Name Lookups.
Nov 28 01:42:51 localhost rsyslogd[759]: [origin software="rsyslogd" swVersion="8.2102.0-111.el9" x-pid="759" x-info="https://www.rsyslog.com"] start
Nov 28 01:42:51 localhost rsyslogd[759]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2040 ]
Nov 28 01:42:51 localhost systemd[1]: Starting User Login Management...
Nov 28 01:42:51 localhost systemd[1]: Started System Logging Service.
Nov 28 01:42:51 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Nov 28 01:42:51 localhost chronyd[766]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Nov 28 01:42:51 localhost chronyd[766]: Using right/UTC timezone to obtain leap second data
Nov 28 01:42:51 localhost chronyd[766]: Loaded seccomp filter (level 2)
Nov 28 01:42:51 localhost systemd[1]: Started NTP client/server.
Nov 28 01:42:51 localhost systemd-logind[764]: New seat seat0.
Nov 28 01:42:51 localhost systemd-logind[764]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 28 01:42:51 localhost systemd-logind[764]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 28 01:42:51 localhost systemd[1]: Started User Login Management.
Nov 28 01:42:51 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 01:42:51 localhost cloud-init[770]: Cloud-init v. 22.1-9.el9 running 'init-local' at Fri, 28 Nov 2025 06:42:51 +0000. Up 5.50 seconds.
Nov 28 01:42:51 localhost systemd[1]: Starting Hostname Service...
Nov 28 01:42:51 localhost systemd[1]: Started Hostname Service.
Nov 28 01:42:51 localhost systemd-hostnamed[784]: Hostname set to (static)
Nov 28 01:42:51 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpi7f25n2f.mount: Deactivated successfully.
Nov 28 01:42:51 localhost systemd[1]: Finished Initial cloud-init job (pre-networking).
Nov 28 01:42:51 localhost systemd[1]: Reached target Preparation for Network.
Nov 28 01:42:51 localhost systemd[1]: Starting Network Manager...
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8124] NetworkManager (version 1.42.2-1.el9) is starting... (boot:590d17e7-bf7a-4d44-b812-a5de06abfb1f)
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8130] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Nov 28 01:42:51 localhost systemd[1]: Started Network Manager.
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8163] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 28 01:42:51 localhost systemd[1]: Reached target Network.
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8222] manager[0x55dedc308020]: monitoring kernel firmware directory '/lib/firmware'.
Nov 28 01:42:51 localhost systemd[1]: Starting Network Manager Wait Online...
Nov 28 01:42:51 localhost systemd[1]: Starting GSSAPI Proxy Daemon...
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8272] hostname: hostname: using hostnamed
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8273] hostname: static hostname changed from (none) to "np0005538513.novalocal"
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8281] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 28 01:42:51 localhost systemd[1]: Starting Enable periodic update of entitlement certificates....
Nov 28 01:42:51 localhost systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 28 01:42:51 localhost systemd[1]: Started Enable periodic update of entitlement certificates..
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8438] manager[0x55dedc308020]: rfkill: Wi-Fi hardware radio set enabled
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8439] manager[0x55dedc308020]: rfkill: WWAN hardware radio set enabled
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8475] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8476] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8478] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8478] manager: Networking is enabled by state file
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8489] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8490] settings: Loaded settings plugin: keyfile (internal)
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8508] dhcp: init: Using DHCP client 'internal'
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8510] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 28 01:42:51 localhost systemd[1]: Started GSSAPI Proxy Daemon.
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8520] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8526] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8532] device (lo): Activation: starting connection 'lo' (dc22fba5-a55e-4101-8dc2-18071340ca35)
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8538] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8541] device (eth0): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8567] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8569] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8570] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8572] device (eth0): carrier: link connected
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8574] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8578] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8582] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8586] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8586] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8589] manager: NetworkManager state is now CONNECTING
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8590] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8595] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8597] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 28 01:42:51 localhost systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8657] dhcp4 (eth0): state changed new lease, address=38.102.83.64
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8659] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8673] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'managed')
Nov 28 01:42:51 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 01:42:51 localhost systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 28 01:42:51 localhost systemd[1]: Reached target NFS client services.
Nov 28 01:42:51 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Nov 28 01:42:51 localhost systemd[1]: Reached target Remote File Systems.
Nov 28 01:42:51 localhost systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 28 01:42:51 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8911] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8914] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'managed')
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8916] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8925] device (lo): Activation: successful, device activated.
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8933] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'managed')
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8938] manager: NetworkManager state is now CONNECTED_SITE
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8942] device (eth0): Activation: successful, device activated.
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8950] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 28 01:42:51 localhost NetworkManager[789]: [1764312171.8956] manager: startup complete
Nov 28 01:42:51 localhost systemd[1]: Finished Network Manager Wait Online.
Nov 28 01:42:51 localhost systemd[1]: Starting Initial cloud-init job (metadata service crawler)...
Nov 28 01:42:52 localhost cloud-init[973]: Cloud-init v. 22.1-9.el9 running 'init' at Fri, 28 Nov 2025 06:42:52 +0000. Up 6.27 seconds.
Nov 28 01:42:52 localhost cloud-init[973]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Nov 28 01:42:52 localhost cloud-init[973]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 28 01:42:52 localhost cloud-init[973]: ci-info: | Device | Up | Address | Mask | Scope | Hw-Address |
Nov 28 01:42:52 localhost cloud-init[973]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 28 01:42:52 localhost cloud-init[973]: ci-info: | eth0 | True | 38.102.83.64 | 255.255.255.0 | global | fa:16:3e:b0:25:93 |
Nov 28 01:42:52 localhost cloud-init[973]: ci-info: | eth0 | True | fe80::f816:3eff:feb0:2593/64 | . | link | fa:16:3e:b0:25:93 |
Nov 28 01:42:52 localhost cloud-init[973]: ci-info: | lo | True | 127.0.0.1 | 255.0.0.0 | host | . |
Nov 28 01:42:52 localhost cloud-init[973]: ci-info: | lo | True | ::1/128 | . | host | . |
Nov 28 01:42:52 localhost cloud-init[973]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 28 01:42:52 localhost cloud-init[973]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Nov 28 01:42:52 localhost cloud-init[973]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 28 01:42:52 localhost cloud-init[973]: ci-info: | Route | Destination | Gateway | Genmask | Interface | Flags |
Nov 28 01:42:52 localhost cloud-init[973]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 28 01:42:52 localhost cloud-init[973]: ci-info: | 0 | 0.0.0.0 | 38.102.83.1 | 0.0.0.0 | eth0 | UG |
Nov 28 01:42:52 localhost cloud-init[973]: ci-info: | 1 | 38.102.83.0 | 0.0.0.0 | 255.255.255.0 | eth0 | U |
Nov 28 01:42:52 localhost cloud-init[973]: ci-info: | 2 | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 | eth0 | UGH |
Nov 28 01:42:52 localhost cloud-init[973]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 28 01:42:52 localhost cloud-init[973]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Nov 28 01:42:52 localhost cloud-init[973]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 28 01:42:52 localhost cloud-init[973]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Nov 28 01:42:52 localhost cloud-init[973]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 28 01:42:52 localhost cloud-init[973]: ci-info: | 1 | fe80::/64 | :: | eth0 | U |
Nov 28 01:42:52 localhost cloud-init[973]: ci-info: | 3 | multicast | :: | eth0 | U |
Nov 28 01:42:52 localhost cloud-init[973]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 28 01:42:52 localhost systemd[1]: Starting Authorization Manager...
Nov 28 01:42:52 localhost polkitd[1036]: Started polkitd version 0.117
Nov 28 01:42:52 localhost systemd[1]: Started Dynamic System Tuning Daemon.
Nov 28 01:42:52 localhost systemd[1]: Started Authorization Manager.
Nov 28 01:42:54 localhost cloud-init[973]: Generating public/private rsa key pair.
Nov 28 01:42:54 localhost cloud-init[973]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Nov 28 01:42:54 localhost cloud-init[973]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Nov 28 01:42:54 localhost cloud-init[973]: The key fingerprint is:
Nov 28 01:42:54 localhost cloud-init[973]: SHA256:DeX7Gx9a/pmi8Yqbt882QwrPSlorkSrd97q5V7PdpFs root@np0005538513.novalocal
Nov 28 01:42:54 localhost cloud-init[973]: The key's randomart image is:
Nov 28 01:42:54 localhost cloud-init[973]: +---[RSA 3072]----+
Nov 28 01:42:54 localhost cloud-init[973]: | . |
Nov 28 01:42:54 localhost cloud-init[973]: | o |
Nov 28 01:42:54 localhost cloud-init[973]: | . . |
Nov 28 01:42:54 localhost cloud-init[973]: | o . |
Nov 28 01:42:54 localhost cloud-init[973]: | S o |
Nov 28 01:42:54 localhost cloud-init[973]: | o . . + .|
Nov 28 01:42:54 localhost cloud-init[973]: | . o .o+.* *oE|
Nov 28 01:42:54 localhost cloud-init[973]: | . o o+.=*+&.++|
Nov 28 01:42:54 localhost cloud-init[973]: | . .o@X=O+B=.|
Nov 28 01:42:54 localhost cloud-init[973]: +----[SHA256]-----+
Nov 28 01:42:54 localhost cloud-init[973]: Generating public/private ecdsa key pair.
Nov 28 01:42:54 localhost cloud-init[973]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Nov 28 01:42:54 localhost cloud-init[973]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Nov 28 01:42:54 localhost cloud-init[973]: The key fingerprint is:
Nov 28 01:42:54 localhost cloud-init[973]: SHA256:fkEbeM8urwSNUDsbm5W9dv+B4EWJfWI0nQZ1cTyF0wk root@np0005538513.novalocal
Nov 28 01:42:54 localhost cloud-init[973]: The key's randomart image is:
Nov 28 01:42:54 localhost cloud-init[973]: +---[ECDSA 256]---+
Nov 28 01:42:54 localhost cloud-init[973]: | . E=*O|
Nov 28 01:42:54 localhost cloud-init[973]: | . o o+ =*=|
Nov 28 01:42:54 localhost cloud-init[973]: | . = *..*.o.|
Nov 28 01:42:54 localhost cloud-init[973]: | . % =o.o |
Nov 28 01:42:54 localhost cloud-init[973]: | S +.=.. |
Nov 28 01:42:54 localhost cloud-init[973]: | . ..+o... |
Nov 28 01:42:54 localhost cloud-init[973]: | . +... ..|
Nov 28 01:42:54 localhost cloud-init[973]: | o o o|
Nov 28 01:42:54 localhost cloud-init[973]: | ... .|
Nov 28 01:42:54 localhost cloud-init[973]: +----[SHA256]-----+
Nov 28 01:42:54 localhost cloud-init[973]: Generating public/private ed25519 key pair.
Nov 28 01:42:54 localhost cloud-init[973]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Nov 28 01:42:54 localhost cloud-init[973]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Nov 28 01:42:54 localhost cloud-init[973]: The key fingerprint is:
Nov 28 01:42:54 localhost cloud-init[973]: SHA256:0SVodquVirdK6O2R8rTErbiu2+OzC15/7xgxqM68IjA root@np0005538513.novalocal
Nov 28 01:42:54 localhost cloud-init[973]: The key's randomart image is:
Nov 28 01:42:54 localhost cloud-init[973]: +--[ED25519 256]--+
Nov 28 01:42:54 localhost cloud-init[973]: | .. . |
Nov 28 01:42:54 localhost cloud-init[973]: | +..o |
Nov 28 01:42:54 localhost cloud-init[973]: | o...o |
Nov 28 01:42:54 localhost cloud-init[973]: | . .+ |
Nov 28 01:42:54 localhost cloud-init[973]: | ..S+ |
Nov 28 01:42:54 localhost cloud-init[973]: |E +.o+o |
Nov 28 01:42:54 localhost cloud-init[973]: |... = B.o. |
Nov 28 01:42:54 localhost cloud-init[973]: |...OoX +oo |
Nov 28 01:42:54 localhost cloud-init[973]: | .+*#OOo.oo |
Nov 28 01:42:54 localhost cloud-init[973]: +----[SHA256]-----+
Nov 28 01:42:54 localhost sm-notify[1128]: Version 2.5.4 starting
Nov 28 01:42:54 localhost systemd[1]: Finished Initial cloud-init job (metadata service crawler).
Nov 28 01:42:54 localhost systemd[1]: Reached target Cloud-config availability.
Nov 28 01:42:54 localhost systemd[1]: Reached target Network is Online.
Nov 28 01:42:54 localhost systemd[1]: Starting Apply the settings specified in cloud-config...
Nov 28 01:42:54 localhost systemd[1]: Run Insights Client at boot was skipped because of an unmet condition check (ConditionPathExists=/etc/insights-client/.run_insights_client_next_boot).
Nov 28 01:42:54 localhost systemd[1]: Starting Crash recovery kernel arming...
Nov 28 01:42:54 localhost systemd[1]: Starting Notify NFS peers of a restart...
Nov 28 01:42:54 localhost systemd[1]: Starting OpenSSH server daemon...
Nov 28 01:42:54 localhost systemd[1]: Starting Permit User Sessions...
Nov 28 01:42:54 localhost systemd[1]: Started Notify NFS peers of a restart.
Nov 28 01:42:54 localhost systemd[1]: Finished Permit User Sessions.
Nov 28 01:42:54 localhost systemd[1]: Started Command Scheduler.
Nov 28 01:42:54 localhost sshd[1129]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 01:42:54 localhost systemd[1]: Started Getty on tty1.
Nov 28 01:42:54 localhost systemd[1]: Started Serial Getty on ttyS0.
Nov 28 01:42:54 localhost systemd[1]: Reached target Login Prompts.
Nov 28 01:42:54 localhost systemd[1]: Started OpenSSH server daemon.
Nov 28 01:42:54 localhost systemd[1]: Reached target Multi-User System.
Nov 28 01:42:54 localhost systemd[1]: Starting Record Runlevel Change in UTMP...
Nov 28 01:42:54 localhost sshd[1146]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 01:42:54 localhost systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Nov 28 01:42:54 localhost systemd[1]: Finished Record Runlevel Change in UTMP.
Nov 28 01:42:54 localhost sshd[1165]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 01:42:54 localhost sshd[1173]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 01:42:54 localhost sshd[1185]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 01:42:54 localhost kdumpctl[1133]: kdump: No kdump initial ramdisk found.
Nov 28 01:42:54 localhost kdumpctl[1133]: kdump: Rebuilding /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img
Nov 28 01:42:54 localhost sshd[1190]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 01:42:54 localhost sshd[1195]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 01:42:54 localhost sshd[1208]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 01:42:54 localhost sshd[1227]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 01:42:54 localhost sshd[1237]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 01:42:54 localhost cloud-init[1266]: Cloud-init v. 22.1-9.el9 running 'modules:config' at Fri, 28 Nov 2025 06:42:54 +0000. Up 8.49 seconds.
Nov 28 01:42:54 localhost systemd[1]: Finished Apply the settings specified in cloud-config.
Nov 28 01:42:54 localhost systemd[1]: Starting Execute cloud user/final scripts...
Nov 28 01:42:54 localhost dracut[1433]: dracut-057-21.git20230214.el9
Nov 28 01:42:54 localhost cloud-init[1437]: Cloud-init v. 22.1-9.el9 running 'modules:final' at Fri, 28 Nov 2025 06:42:54 +0000. Up 8.85 seconds.
Nov 28 01:42:54 localhost cloud-init[1451]: #############################################################
Nov 28 01:42:54 localhost cloud-init[1453]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Nov 28 01:42:54 localhost cloud-init[1457]: 256 SHA256:fkEbeM8urwSNUDsbm5W9dv+B4EWJfWI0nQZ1cTyF0wk root@np0005538513.novalocal (ECDSA)
Nov 28 01:42:54 localhost cloud-init[1461]: 256 SHA256:0SVodquVirdK6O2R8rTErbiu2+OzC15/7xgxqM68IjA root@np0005538513.novalocal (ED25519)
Nov 28 01:42:54 localhost dracut[1435]: Executing: /usr/bin/dracut --add kdumpbase --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics -o "plymouth resume ifcfg earlykdump" --mount "/dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device -f /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img 5.14.0-284.11.1.el9_2.x86_64
Nov 28 01:42:54 localhost cloud-init[1468]: 3072 SHA256:DeX7Gx9a/pmi8Yqbt882QwrPSlorkSrd97q5V7PdpFs root@np0005538513.novalocal (RSA)
Nov 28 01:42:54 localhost cloud-init[1472]: -----END SSH HOST KEY FINGERPRINTS-----
Nov 28 01:42:54 localhost cloud-init[1475]: #############################################################
Nov 28 01:42:54 localhost cloud-init[1437]: Cloud-init v. 22.1-9.el9 finished at Fri, 28 Nov 2025 06:42:54 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0]. Up 9.08 seconds
Nov 28 01:42:54 localhost dracut[1435]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Nov 28 01:42:54 localhost dracut[1435]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Nov 28 01:42:54 localhost dracut[1435]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Nov 28 01:42:54 localhost dracut[1435]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 28 01:42:54 localhost dracut[1435]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 28 01:42:54 localhost systemd[1]: Reloading Network Manager...
Nov 28 01:42:54 localhost dracut[1435]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 28 01:42:54 localhost dracut[1435]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 28 01:42:55 localhost NetworkManager[789]: [1764312175.0024] audit: op="reload" arg="0" pid=1589 uid=0 result="success"
Nov 28 01:42:55 localhost NetworkManager[789]: [1764312175.0031] config: signal: SIGHUP (no changes from disk)
Nov 28 01:42:55 localhost systemd[1]: Reloaded Network Manager.
Nov 28 01:42:55 localhost systemd[1]: Finished Execute cloud user/final scripts.
Nov 28 01:42:55 localhost dracut[1435]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 28 01:42:55 localhost systemd[1]: Reached target Cloud-init target.
Nov 28 01:42:55 localhost dracut[1435]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 28 01:42:55 localhost dracut[1435]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'connman' will not be installed, because command 'connmand' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found! Nov 28 01:42:55 localhost dracut[1435]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found! 
Nov 28 01:42:55 localhost dracut[1435]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found! Nov 28 01:42:55 localhost dracut[1435]: memstrack is not available Nov 28 01:42:55 localhost dracut[1435]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng Nov 28 01:42:55 localhost dracut[1435]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found! 
Nov 28 01:42:55 localhost dracut[1435]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'connman' will not be installed, because command 'connmand' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found! Nov 28 01:42:55 localhost dracut[1435]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found! 
Nov 28 01:42:55 localhost dracut[1435]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found! Nov 28 01:42:55 localhost dracut[1435]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found! Nov 28 01:42:55 localhost dracut[1435]: memstrack is not available Nov 28 01:42:55 localhost dracut[1435]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng Nov 28 01:42:55 localhost dracut[1435]: *** Including module: systemd *** Nov 28 01:42:56 localhost dracut[1435]: *** Including module: systemd-initrd *** Nov 28 01:42:56 localhost dracut[1435]: *** Including module: i18n *** Nov 28 01:42:56 localhost dracut[1435]: No KEYMAP configured. 
Nov 28 01:42:56 localhost dracut[1435]: *** Including module: drm *** Nov 28 01:42:56 localhost dracut[1435]: *** Including module: prefixdevname *** Nov 28 01:42:56 localhost dracut[1435]: *** Including module: kernel-modules *** Nov 28 01:42:56 localhost chronyd[766]: Selected source 149.56.19.163 (2.rhel.pool.ntp.org) Nov 28 01:42:56 localhost chronyd[766]: System clock TAI offset set to 37 seconds Nov 28 01:42:57 localhost dracut[1435]: *** Including module: kernel-modules-extra *** Nov 28 01:42:57 localhost dracut[1435]: *** Including module: qemu *** Nov 28 01:42:57 localhost dracut[1435]: *** Including module: fstab-sys *** Nov 28 01:42:57 localhost dracut[1435]: *** Including module: rootfs-block *** Nov 28 01:42:57 localhost dracut[1435]: *** Including module: terminfo *** Nov 28 01:42:57 localhost dracut[1435]: *** Including module: udev-rules *** Nov 28 01:42:57 localhost dracut[1435]: Skipping udev rule: 91-permissions.rules Nov 28 01:42:57 localhost dracut[1435]: Skipping udev rule: 80-drivers-modprobe.rules Nov 28 01:42:57 localhost dracut[1435]: *** Including module: virtiofs *** Nov 28 01:42:57 localhost dracut[1435]: *** Including module: dracut-systemd *** Nov 28 01:42:58 localhost dracut[1435]: *** Including module: usrmount *** Nov 28 01:42:58 localhost dracut[1435]: *** Including module: base *** Nov 28 01:42:58 localhost dracut[1435]: *** Including module: fs-lib *** Nov 28 01:42:58 localhost dracut[1435]: *** Including module: kdumpbase *** Nov 28 01:42:58 localhost chronyd[766]: Selected source 206.108.0.131 (2.rhel.pool.ntp.org) Nov 28 01:42:58 localhost dracut[1435]: *** Including module: microcode_ctl-fw_dir_override *** Nov 28 01:42:58 localhost dracut[1435]: microcode_ctl module: mangling fw_dir Nov 28 01:42:58 localhost dracut[1435]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel"... 
Nov 28 01:42:58 localhost dracut[1435]: microcode_ctl: configuration "intel" is ignored Nov 28 01:42:58 localhost dracut[1435]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"... Nov 28 01:42:58 localhost dracut[1435]: microcode_ctl: configuration "intel-06-2d-07" is ignored Nov 28 01:42:58 localhost dracut[1435]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"... Nov 28 01:42:58 localhost dracut[1435]: microcode_ctl: configuration "intel-06-4e-03" is ignored Nov 28 01:42:58 localhost dracut[1435]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"... Nov 28 01:42:58 localhost dracut[1435]: microcode_ctl: configuration "intel-06-4f-01" is ignored Nov 28 01:42:58 localhost dracut[1435]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"... Nov 28 01:42:58 localhost dracut[1435]: microcode_ctl: configuration "intel-06-55-04" is ignored Nov 28 01:42:58 localhost dracut[1435]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"... Nov 28 01:42:58 localhost dracut[1435]: microcode_ctl: configuration "intel-06-5e-03" is ignored Nov 28 01:42:58 localhost dracut[1435]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"... Nov 28 01:42:58 localhost dracut[1435]: microcode_ctl: configuration "intel-06-8c-01" is ignored Nov 28 01:42:58 localhost dracut[1435]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"... Nov 28 01:42:58 localhost dracut[1435]: microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored Nov 28 01:42:58 localhost dracut[1435]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"... 
Nov 28 01:42:58 localhost dracut[1435]: microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored Nov 28 01:42:58 localhost dracut[1435]: microcode_ctl: final fw_dir: "/lib/firmware/updates/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware/updates /lib/firmware/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware" Nov 28 01:42:58 localhost dracut[1435]: *** Including module: shutdown *** Nov 28 01:42:58 localhost dracut[1435]: *** Including module: squash *** Nov 28 01:42:58 localhost dracut[1435]: *** Including modules done *** Nov 28 01:42:58 localhost dracut[1435]: *** Installing kernel module dependencies *** Nov 28 01:42:59 localhost dracut[1435]: *** Installing kernel module dependencies done *** Nov 28 01:42:59 localhost dracut[1435]: *** Resolving executable dependencies *** Nov 28 01:43:00 localhost dracut[1435]: *** Resolving executable dependencies done *** Nov 28 01:43:00 localhost dracut[1435]: *** Hardlinking files *** Nov 28 01:43:00 localhost dracut[1435]: Mode: real Nov 28 01:43:00 localhost dracut[1435]: Files: 1099 Nov 28 01:43:00 localhost dracut[1435]: Linked: 3 files Nov 28 01:43:00 localhost dracut[1435]: Compared: 0 xattrs Nov 28 01:43:00 localhost dracut[1435]: Compared: 373 files Nov 28 01:43:00 localhost dracut[1435]: Saved: 61.04 KiB Nov 28 01:43:00 localhost dracut[1435]: Duration: 0.024039 seconds Nov 28 01:43:00 localhost dracut[1435]: *** Hardlinking files done *** Nov 28 01:43:00 localhost dracut[1435]: Could not find 'strip'. Not stripping the initramfs. 
Nov 28 01:43:00 localhost dracut[1435]: *** Generating early-microcode cpio image *** Nov 28 01:43:00 localhost dracut[1435]: *** Constructing AuthenticAMD.bin *** Nov 28 01:43:00 localhost dracut[1435]: *** Store current command line parameters *** Nov 28 01:43:00 localhost dracut[1435]: Stored kernel commandline: Nov 28 01:43:00 localhost dracut[1435]: No dracut internal kernel commandline stored in the initramfs Nov 28 01:43:01 localhost dracut[1435]: *** Install squash loader *** Nov 28 01:43:01 localhost dracut[1435]: *** Squashing the files inside the initramfs *** Nov 28 01:43:02 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully. Nov 28 01:43:02 localhost dracut[1435]: *** Squashing the files inside the initramfs done *** Nov 28 01:43:02 localhost dracut[1435]: *** Creating image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' *** Nov 28 01:43:02 localhost dracut[1435]: *** Creating initramfs image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' done *** Nov 28 01:43:03 localhost kdumpctl[1133]: kdump: kexec: loaded kdump kernel Nov 28 01:43:03 localhost kdumpctl[1133]: kdump: Starting kdump: [OK] Nov 28 01:43:03 localhost systemd[1]: Finished Crash recovery kernel arming. Nov 28 01:43:03 localhost systemd[1]: Startup finished in 1.239s (kernel) + 2.034s (initrd) + 14.145s (userspace) = 17.419s. Nov 28 01:43:21 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully. Nov 28 01:43:40 localhost sshd[4173]: main: sshd: ssh-rsa algorithm is disabled Nov 28 01:43:40 localhost systemd[1]: Created slice User Slice of UID 1000. Nov 28 01:43:40 localhost systemd[1]: Starting User Runtime Directory /run/user/1000... Nov 28 01:43:40 localhost systemd-logind[764]: New session 1 of user zuul. Nov 28 01:43:40 localhost systemd[1]: Finished User Runtime Directory /run/user/1000. Nov 28 01:43:40 localhost systemd[1]: Starting User Manager for UID 1000... 
Nov 28 01:43:41 localhost systemd[4177]: Queued start job for default target Main User Target. Nov 28 01:43:41 localhost systemd[4177]: Created slice User Application Slice. Nov 28 01:43:41 localhost systemd[4177]: Started Mark boot as successful after the user session has run 2 minutes. Nov 28 01:43:41 localhost systemd[4177]: Started Daily Cleanup of User's Temporary Directories. Nov 28 01:43:41 localhost systemd[4177]: Reached target Paths. Nov 28 01:43:41 localhost systemd[4177]: Reached target Timers. Nov 28 01:43:41 localhost systemd[4177]: Starting D-Bus User Message Bus Socket... Nov 28 01:43:41 localhost systemd[4177]: Starting Create User's Volatile Files and Directories... Nov 28 01:43:41 localhost systemd[4177]: Listening on D-Bus User Message Bus Socket. Nov 28 01:43:41 localhost systemd[4177]: Reached target Sockets. Nov 28 01:43:41 localhost systemd[4177]: Finished Create User's Volatile Files and Directories. Nov 28 01:43:41 localhost systemd[4177]: Reached target Basic System. Nov 28 01:43:41 localhost systemd[4177]: Reached target Main User Target. Nov 28 01:43:41 localhost systemd[4177]: Startup finished in 115ms. Nov 28 01:43:41 localhost systemd[1]: Started User Manager for UID 1000. Nov 28 01:43:41 localhost systemd[1]: Started Session 1 of User zuul. 
Nov 28 01:43:41 localhost python3[4229]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 28 01:43:49 localhost python3[4247]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 28 01:43:56 localhost python3[4300]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 28 01:43:58 localhost python3[4330]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present Nov 28 01:44:01 localhost python3[4346]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCsnBivukZgTjr1SoC29hE3ofwUMxTaKeXh9gXvDwMJASbvK4q9943cbJ2j47GUf8sEgY38kkU/dxSMQWULl4d2oquIgZpJbJuXMU1WNxwGNSrS74OecQ3Or4VxTiDmu/HV83nIWHqfpDCra4DlrIBPPNwhBK4u0QYy87AJaML6NGEDaubbHgVCg1UpW1ho/sDoXptAehoCEaaeRz5tPHiXRnHpIXu44Sp8fRcyU9rBqdv+/lgachTcMYadsD2WBHIL+pptEDHB5TvQTDpnU58YdGFarn8uuGPP4t8H6xcqXbaJS9/oZa5Fb5Mh3vORBbR65jvlGg4PYGzCuI/xllY5+lGK7eyOleFyRqWKa2uAIaGoRBT4ZLKAssOFwCIaGfOAFFOBMkuylg4+MtbYiMJYRARPSRAufAROqhUDOo73y5lBrXh07aiWuSn8fU4mclWu+Xw382ryxW+XeHPc12d7S46TvGJaRvzsLtlyerRxGI77xOHRexq1Z/SFjOWLOwc= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Nov 28 01:44:01 localhost python3[4360]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 01:44:02 localhost python3[4419]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True 
get_attributes=True Nov 28 01:44:03 localhost python3[4460]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764312242.7142704-389-233679186952224/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=4237e6bc560e46d9aa55417d911b2c55_id_rsa follow=False checksum=47f6a2f8fa426c1f34aad346f88073a22928af4e backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 01:44:04 localhost python3[4533]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 01:44:04 localhost python3[4574]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764312244.3381405-485-201455792218243/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=4237e6bc560e46d9aa55417d911b2c55_id_rsa.pub follow=False checksum=d1f12d852c72cfefab089d88337552962cfbc93d backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 01:44:06 localhost python3[4602]: ansible-ping Invoked with data=pong Nov 28 01:44:09 localhost python3[4616]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 28 01:44:13 localhost python3[4670]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None Nov 28 01:44:15 localhost python3[4692]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 01:44:16 localhost python3[4706]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 01:44:16 localhost python3[4720]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 01:44:17 localhost python3[4734]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 01:44:17 localhost python3[4748]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None 
attributes=None Nov 28 01:44:17 localhost python3[4762]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 01:44:20 localhost python3[4778]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 01:44:22 localhost python3[4826]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 01:44:22 localhost python3[4869]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764312262.0988226-99-122024860704302/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 01:44:30 localhost python3[4897]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa 
AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Nov 28 01:44:30 localhost python3[4911]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Nov 28 01:44:31 localhost python3[4925]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Nov 28 01:44:31 localhost python3[4939]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None 
key_options=None comment=None Nov 28 01:44:31 localhost python3[4953]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Nov 28 01:44:31 localhost python3[4967]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Nov 28 01:44:32 localhost python3[4981]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Nov 28 01:44:32 localhost python3[4995]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Nov 28 01:44:32 localhost python3[5009]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Nov 28 01:44:32 localhost python3[5023]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Nov 28 01:44:33 localhost python3[5037]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Nov 28 01:44:33 localhost python3[5051]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Nov 28 01:44:33 
localhost python3[5065]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 01:44:33 localhost python3[5079]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 01:44:34 localhost python3[5093]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 01:44:34 localhost python3[5107]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 01:44:34 localhost python3[5121]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 01:44:34 localhost python3[5135]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 01:44:35 localhost python3[5149]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 01:44:35 localhost python3[5163]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 01:44:35 localhost python3[5177]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 01:44:35 localhost python3[5191]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 01:44:36 localhost python3[5205]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 01:44:36 localhost python3[5219]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 01:44:36 localhost python3[5233]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 01:44:36 localhost python3[5247]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 01:44:38 localhost python3[5263]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 28 01:44:38 localhost systemd[1]: Starting Time & Date Service...
Nov 28 01:44:38 localhost systemd[1]: Started Time & Date Service.
Nov 28 01:44:39 localhost systemd-timedated[5265]: Changed time zone to 'UTC' (UTC).
Nov 28 01:44:39 localhost python3[5284]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 01:44:40 localhost python3[5330]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 01:44:41 localhost python3[5371]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764312280.7328932-492-37995062897735/source _original_basename=tmp5eyxkp02 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 01:44:42 localhost python3[5431]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 01:44:42 localhost python3[5472]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764312282.215266-581-230248070919413/source _original_basename=tmp586k1ny0 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 01:44:44 localhost python3[5534]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 01:44:44 localhost python3[5577]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764312284.3016577-724-11816403856429/source _original_basename=tmpt5fb9mp7 follow=False checksum=d1fb5b4f9f73b8c84cf3b5af0e2af5367a435780 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 01:44:46 localhost python3[5605]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 01:44:46 localhost python3[5621]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 01:44:47 localhost python3[5671]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 01:44:48 localhost python3[5714]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764312287.381199-852-275465695180306/source _original_basename=tmpfes8jtas follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 01:44:49 localhost python3[5745]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-161e-20ee-000000000023-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 01:45:00 localhost python3[5764]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-161e-20ee-000000000024-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Nov 28 01:45:09 localhost systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 28 01:45:12 localhost python3[5784]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 01:45:30 localhost python3[5800]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 01:46:05 localhost systemd[4177]: Starting Mark boot as successful...
Nov 28 01:46:05 localhost systemd[4177]: Finished Mark boot as successful.
Nov 28 01:46:30 localhost systemd-logind[764]: Session 1 logged out. Waiting for processes to exit.
Nov 28 01:46:52 localhost systemd[1]: Unmounting EFI System Partition Automount...
Nov 28 01:46:52 localhost systemd[1]: efi.mount: Deactivated successfully.
Nov 28 01:46:52 localhost systemd[1]: Unmounted EFI System Partition Automount.
Nov 28 01:47:56 localhost sshd[5806]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 01:48:52 localhost kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000
Nov 28 01:48:52 localhost kernel: pci 0000:00:07.0: reg 0x10: [io 0x0000-0x003f]
Nov 28 01:48:52 localhost kernel: pci 0000:00:07.0: reg 0x14: [mem 0x00000000-0x00000fff]
Nov 28 01:48:52 localhost kernel: pci 0000:00:07.0: reg 0x20: [mem 0x00000000-0x00003fff 64bit pref]
Nov 28 01:48:52 localhost kernel: pci 0000:00:07.0: reg 0x30: [mem 0x00000000-0x0007ffff pref]
Nov 28 01:48:52 localhost kernel: pci 0000:00:07.0: BAR 6: assigned [mem 0xc0000000-0xc007ffff pref]
Nov 28 01:48:52 localhost kernel: pci 0000:00:07.0: BAR 4: assigned [mem 0x440000000-0x440003fff 64bit pref]
Nov 28 01:48:52 localhost kernel: pci 0000:00:07.0: BAR 1: assigned [mem 0xc0080000-0xc0080fff]
Nov 28 01:48:52 localhost kernel: pci 0000:00:07.0: BAR 0: assigned [io 0x1000-0x103f]
Nov 28 01:48:52 localhost kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Nov 28 01:48:52 localhost NetworkManager[789]: [1764312532.1258] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 28 01:48:52 localhost systemd-udevd[5810]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 01:48:52 localhost NetworkManager[789]: [1764312532.1378] device (eth1): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Nov 28 01:48:52 localhost systemd[4177]: Created slice User Background Tasks Slice.
Nov 28 01:48:52 localhost NetworkManager[789]: [1764312532.1411] settings: (eth1): created default wired connection 'Wired connection 1'
Nov 28 01:48:52 localhost NetworkManager[789]: [1764312532.1415] device (eth1): carrier: link connected
Nov 28 01:48:52 localhost NetworkManager[789]: [1764312532.1417] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Nov 28 01:48:52 localhost NetworkManager[789]: [1764312532.1422] policy: auto-activating connection 'Wired connection 1' (5001363f-9ee2-3450-b752-8866e660be65)
Nov 28 01:48:52 localhost NetworkManager[789]: [1764312532.1428] device (eth1): Activation: starting connection 'Wired connection 1' (5001363f-9ee2-3450-b752-8866e660be65)
Nov 28 01:48:52 localhost NetworkManager[789]: [1764312532.1429] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Nov 28 01:48:52 localhost NetworkManager[789]: [1764312532.1432] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Nov 28 01:48:52 localhost NetworkManager[789]: [1764312532.1437] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Nov 28 01:48:52 localhost systemd[4177]: Starting Cleanup of User's Temporary Files and Directories...
Nov 28 01:48:52 localhost NetworkManager[789]: [1764312532.1440] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 28 01:48:52 localhost systemd[4177]: Finished Cleanup of User's Temporary Files and Directories.
Nov 28 01:48:53 localhost sshd[5814]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 01:48:53 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth1: link becomes ready
Nov 28 01:48:53 localhost systemd-logind[764]: New session 3 of user zuul.
Nov 28 01:48:53 localhost systemd[1]: Started Session 3 of User zuul.
Nov 28 01:48:53 localhost python3[5831]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-b1a9-fc65-00000000039b-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 01:49:06 localhost python3[5882]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 01:49:07 localhost python3[5925]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764312546.3786542-435-234083149484200/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=e5e10e3b8898b1550d26d78981826d3ea337ef09 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 01:49:07 localhost python3[5955]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 01:49:07 localhost systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 28 01:49:07 localhost systemd[1]: Stopped Network Manager Wait Online.
Nov 28 01:49:07 localhost systemd[1]: Stopping Network Manager Wait Online...
Nov 28 01:49:07 localhost systemd[1]: Stopping Network Manager...
Nov 28 01:49:07 localhost NetworkManager[789]: [1764312547.6537] caught SIGTERM, shutting down normally.
Nov 28 01:49:07 localhost NetworkManager[789]: [1764312547.6642] dhcp4 (eth0): canceled DHCP transaction
Nov 28 01:49:07 localhost NetworkManager[789]: [1764312547.6643] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 28 01:49:07 localhost NetworkManager[789]: [1764312547.6643] dhcp4 (eth0): state changed no lease
Nov 28 01:49:07 localhost NetworkManager[789]: [1764312547.6647] manager: NetworkManager state is now CONNECTING
Nov 28 01:49:07 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 01:49:07 localhost NetworkManager[789]: [1764312547.6782] dhcp4 (eth1): canceled DHCP transaction
Nov 28 01:49:07 localhost NetworkManager[789]: [1764312547.6783] dhcp4 (eth1): state changed no lease
Nov 28 01:49:07 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 01:49:07 localhost NetworkManager[789]: [1764312547.6860] exiting (success)
Nov 28 01:49:07 localhost systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 28 01:49:07 localhost systemd[1]: Stopped Network Manager.
Nov 28 01:49:07 localhost systemd[1]: NetworkManager.service: Consumed 2.301s CPU time.
Nov 28 01:49:07 localhost systemd[1]: Starting Network Manager...
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.7431] NetworkManager (version 1.42.2-1.el9) is starting... (after a restart, boot:590d17e7-bf7a-4d44-b812-a5de06abfb1f)
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.7435] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Nov 28 01:49:07 localhost systemd[1]: Started Network Manager.
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.7467] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 28 01:49:07 localhost systemd[1]: Starting Network Manager Wait Online...
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.7565] manager[0x55d3ec1f6090]: monitoring kernel firmware directory '/lib/firmware'.
Nov 28 01:49:07 localhost systemd[1]: Starting Hostname Service...
Nov 28 01:49:07 localhost systemd[1]: Started Hostname Service.
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8442] hostname: hostname: using hostnamed
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8443] hostname: static hostname changed from (none) to "np0005538513.novalocal"
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8450] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8458] manager[0x55d3ec1f6090]: rfkill: Wi-Fi hardware radio set enabled
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8458] manager[0x55d3ec1f6090]: rfkill: WWAN hardware radio set enabled
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8504] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8505] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8506] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8506] manager: Networking is enabled by state file
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8518] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8521] settings: Loaded settings plugin: keyfile (internal)
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8569] dhcp: init: Using DHCP client 'internal'
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8573] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8581] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8589] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8601] device (lo): Activation: starting connection 'lo' (dc22fba5-a55e-4101-8dc2-18071340ca35)
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8609] device (eth0): carrier: link connected
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8616] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8623] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8624] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8635] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8649] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8658] device (eth1): carrier: link connected
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8664] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8671] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (5001363f-9ee2-3450-b752-8866e660be65) (indicated)
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8671] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8678] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8689] device (eth1): Activation: starting connection 'Wired connection 1' (5001363f-9ee2-3450-b752-8866e660be65)
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8723] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8731] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8736] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8740] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8746] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8750] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8757] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8804] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8817] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8824] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8835] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8839] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8864] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8874] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8885] device (lo): Activation: successful, device activated.
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8964] dhcp4 (eth0): state changed new lease, address=38.102.83.64
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.8974] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.9079] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.9108] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.9111] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.9116] manager: NetworkManager state is now CONNECTED_SITE
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.9120] device (eth0): Activation: successful, device activated.
Nov 28 01:49:07 localhost NetworkManager[5967]: [1764312547.9127] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 28 01:49:08 localhost python3[6024]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-b1a9-fc65-000000000120-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 01:49:17 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 01:49:37 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 28 01:49:52 localhost NetworkManager[5967]: [1764312592.8293] device (eth1): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Nov 28 01:49:52 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 01:49:52 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 01:49:52 localhost NetworkManager[5967]: [1764312592.8526] device (eth1): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Nov 28 01:49:52 localhost NetworkManager[5967]: [1764312592.8529] device (eth1): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Nov 28 01:49:52 localhost NetworkManager[5967]: [1764312592.8534] device (eth1): Activation: successful, device activated.
Nov 28 01:49:52 localhost NetworkManager[5967]: [1764312592.8539] manager: startup complete
Nov 28 01:49:52 localhost systemd[1]: Finished Network Manager Wait Online.
Nov 28 01:50:02 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 01:50:08 localhost systemd[1]: session-3.scope: Deactivated successfully.
Nov 28 01:50:08 localhost systemd[1]: session-3.scope: Consumed 1.481s CPU time.
Nov 28 01:50:08 localhost systemd-logind[764]: Session 3 logged out. Waiting for processes to exit.
Nov 28 01:50:08 localhost systemd-logind[764]: Removed session 3.
Nov 28 01:51:23 localhost sshd[6056]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 01:51:23 localhost systemd-logind[764]: New session 4 of user zuul.
Nov 28 01:51:23 localhost systemd[1]: Started Session 4 of User zuul.
Nov 28 01:51:23 localhost python3[6107]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 01:51:23 localhost python3[6150]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764312683.3882318-628-62124278769557/source _original_basename=tmpk5fxeurt follow=False checksum=10225105ecbcb8380becb3ed8e03293c5f034347 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 01:51:28 localhost systemd[1]: session-4.scope: Deactivated successfully.
Nov 28 01:51:28 localhost systemd-logind[764]: Session 4 logged out. Waiting for processes to exit.
Nov 28 01:51:28 localhost systemd-logind[764]: Removed session 4.
Nov 28 01:54:49 localhost chronyd[766]: Selected source 149.56.19.163 (2.rhel.pool.ntp.org)
Nov 28 01:58:05 localhost systemd[1]: Starting Cleanup of Temporary Directories...
Nov 28 01:58:05 localhost systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Nov 28 01:58:05 localhost systemd[1]: Finished Cleanup of Temporary Directories.
Nov 28 01:58:05 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Nov 28 01:58:49 localhost sshd[6170]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 01:58:49 localhost systemd-logind[764]: New session 5 of user zuul.
Nov 28 01:58:49 localhost systemd[1]: Started Session 5 of User zuul.
Nov 28 01:58:49 localhost python3[6189]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-0218-9e4d-000000001d10-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 01:59:00 localhost python3[6208]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 01:59:01 localhost python3[6224]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 01:59:01 localhost python3[6240]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 01:59:01 localhost python3[6256]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 01:59:02 localhost python3[6272]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 01:59:03 localhost python3[6320]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 01:59:04 localhost python3[6363]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764313143.4514127-643-59764353429855/source _original_basename=tmpzu2o480j follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 01:59:05 localhost python3[6393]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 01:59:05 localhost systemd[1]: Reloading.
Nov 28 01:59:05 localhost systemd-rc-local-generator[6410]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 01:59:05 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 01:59:07 localhost python3[6439]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None Nov 28 01:59:08 localhost python3[6455]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 01:59:08 localhost python3[6473]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 01:59:09 localhost python3[6491]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 01:59:09 localhost python3[6509]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None 
stdin=None Nov 28 01:59:10 localhost python3[6526]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init"; cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system"; cat /sys/fs/cgroup/system.slice/io.max; echo "user"; cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-0218-9e4d-000000001d17-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 01:59:21 localhost python3[6547]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 01:59:24 localhost systemd[1]: session-5.scope: Deactivated successfully. Nov 28 01:59:24 localhost systemd[1]: session-5.scope: Consumed 4.015s CPU time. Nov 28 01:59:24 localhost systemd-logind[764]: Session 5 logged out. Waiting for processes to exit. Nov 28 01:59:24 localhost systemd-logind[764]: Removed session 5. Nov 28 02:00:38 localhost sshd[6554]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:00:39 localhost systemd-logind[764]: New session 6 of user zuul. Nov 28 02:00:39 localhost systemd[1]: Started Session 6 of User zuul. Nov 28 02:00:39 localhost systemd[1]: Starting RHSM dbus service... Nov 28 02:00:39 localhost systemd[1]: Started RHSM dbus service. 
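The io.max sequence recorded above (echo the limits into each top-level slice, then cat them back) amounts to the following standalone shell sketch. This is a reconstruction, not the playbook itself: it assumes a cgroup-v2 mount at /sys/fs/cgroup and that 252:0 is the MAJ:MIN of the target disk, as the earlier `lsblk -nd -o MAJ:MIN /dev/vda` call reports here.

```shell
#!/bin/sh
# Sketch of the cgroup v2 block-I/O throttling applied in the log above.
# DEV is the MAJ:MIN of the disk to throttle; 252:0 is /dev/vda here
# (value taken from the `lsblk -nd -o MAJ:MIN /dev/vda` entry earlier).
DEV="252:0"
LIMITS="$DEV riops=18000 wiops=18000 rbps=262144000 wbps=262144000"

# Writing io.max requires root and a cgroup-v2 mount, so guard the writes;
# on an unprivileged host the sketch is a no-op.
if [ -w /sys/fs/cgroup/init.scope/io.max ]; then
    for cg in init.scope machine.slice system.slice user.slice; do
        echo "$LIMITS" > "/sys/fs/cgroup/$cg/io.max"
        # Read the limit back, mirroring the verification step in the log.
        echo "$cg"; cat "/sys/fs/cgroup/$cg/io.max"
    done
fi
```

The playbook performs the same loop one slice at a time (four separate `echo ... > .../io.max` tasks), after first using `wait_for` on `/sys/fs/cgroup/system.slice/io.max` to confirm the controller files exist.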
Nov 28 02:00:39 localhost rhsm-service[6578]: INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm' Nov 28 02:00:39 localhost rhsm-service[6578]: INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm' Nov 28 02:00:39 localhost rhsm-service[6578]: INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm' Nov 28 02:00:39 localhost rhsm-service[6578]: INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm' Nov 28 02:00:42 localhost rhsm-service[6578]: INFO [subscription_manager.managerlib:90] Consumer created: np0005538513.novalocal (f7b9b60d-6b81-4721-85a2-48be6d80ec8a) Nov 28 02:00:42 localhost subscription-manager[6578]: Registered system with identity: f7b9b60d-6b81-4721-85a2-48be6d80ec8a Nov 28 02:00:44 localhost rhsm-service[6578]: INFO [subscription_manager.entcertlib:131] certs updated: Nov 28 02:00:44 localhost rhsm-service[6578]: Total updates: 1 Nov 28 02:00:44 localhost rhsm-service[6578]: Found (local) serial# [] Nov 28 02:00:44 localhost rhsm-service[6578]: Expected (UEP) serial# [9132065098899233728] Nov 28 02:00:44 localhost rhsm-service[6578]: Added (new) Nov 28 02:00:44 localhost rhsm-service[6578]: [sn:9132065098899233728 ( Content Access,) @ /etc/pki/entitlement/9132065098899233728.pem] Nov 28 02:00:44 localhost rhsm-service[6578]: Deleted (rogue): Nov 28 02:00:44 localhost rhsm-service[6578]: Nov 28 02:00:44 localhost subscription-manager[6578]: Added subscription for 'Content Access' contract 'None' Nov 28 02:00:44 localhost subscription-manager[6578]: Added subscription for product ' Content Access' Nov 28 02:00:47 localhost rhsm-service[6578]: INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 
'rhsm' Nov 28 02:00:47 localhost rhsm-service[6578]: INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm' Nov 28 02:00:47 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 28 02:00:48 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 28 02:00:48 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 28 02:00:48 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 28 02:00:49 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 28 02:00:59 localhost python3[6669]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163ef9-e89a-cf29-7b10-00000000000d-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 02:01:00 localhost python3[6688]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Nov 28 02:01:31 localhost setsebool[6778]: The virt_use_nfs policy 
boolean was changed to 1 by root Nov 28 02:01:31 localhost setsebool[6778]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root Nov 28 02:01:42 localhost kernel: SELinux: Converting 407 SID table entries... Nov 28 02:01:42 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 28 02:01:42 localhost kernel: SELinux: policy capability open_perms=1 Nov 28 02:01:42 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 28 02:01:42 localhost kernel: SELinux: policy capability always_check_network=0 Nov 28 02:01:42 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 28 02:01:42 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 28 02:01:42 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 28 02:01:54 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=3 res=1 Nov 28 02:01:54 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Nov 28 02:01:54 localhost systemd[1]: Starting man-db-cache-update.service... Nov 28 02:01:54 localhost systemd[1]: Reloading. Nov 28 02:01:55 localhost systemd-rc-local-generator[7654]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 02:01:55 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 02:01:55 localhost systemd[1]: Queuing reload/restart jobs for marked units… Nov 28 02:01:56 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 28 02:01:56 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 28 02:02:04 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Nov 28 02:02:04 localhost systemd[1]: Finished man-db-cache-update.service. 
Nov 28 02:02:04 localhost systemd[1]: man-db-cache-update.service: Consumed 11.174s CPU time. Nov 28 02:02:04 localhost systemd[1]: run-rae0a51aa065742f38ba6caf1fb97a20f.service: Deactivated successfully. Nov 28 02:02:47 localhost podman[18374]: 2025-11-28 07:02:47.936305556 +0000 UTC m=+0.097922582 system refresh Nov 28 02:02:48 localhost systemd[4177]: Starting D-Bus User Message Bus... Nov 28 02:02:48 localhost dbus-broker-launch[18433]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored Nov 28 02:02:48 localhost dbus-broker-launch[18433]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored Nov 28 02:02:48 localhost systemd[4177]: Started D-Bus User Message Bus. Nov 28 02:02:48 localhost journal[18433]: Ready Nov 28 02:02:48 localhost systemd[4177]: selinux: avc: op=load_policy lsm=selinux seqno=3 res=1 Nov 28 02:02:48 localhost systemd[4177]: Created slice Slice /user. Nov 28 02:02:48 localhost systemd[4177]: podman-18416.scope: unit configures an IP firewall, but not running as root. Nov 28 02:02:48 localhost systemd[4177]: (This warning is only shown for the first unit using IP firewalling.) Nov 28 02:02:48 localhost systemd[4177]: Started podman-18416.scope. Nov 28 02:02:48 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Nov 28 02:02:49 localhost systemd[4177]: Started podman-pause-90600334.scope. Nov 28 02:02:51 localhost systemd[1]: session-6.scope: Deactivated successfully. Nov 28 02:02:51 localhost systemd[1]: session-6.scope: Consumed 53.073s CPU time. Nov 28 02:02:51 localhost systemd-logind[764]: Session 6 logged out. Waiting for processes to exit. Nov 28 02:02:51 localhost systemd-logind[764]: Removed session 6. 
Nov 28 02:03:06 localhost sshd[18436]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:03:06 localhost sshd[18437]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:03:06 localhost sshd[18440]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:03:06 localhost sshd[18438]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:03:06 localhost sshd[18439]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:03:11 localhost sshd[18446]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:03:11 localhost systemd-logind[764]: New session 7 of user zuul. Nov 28 02:03:11 localhost systemd[1]: Started Session 7 of User zuul. Nov 28 02:03:11 localhost python3[18463]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCwbdeJV6VDDXadsSf2RG5X7kz/GTOF493/FPhPlXmY8LaEjIgaNVgahbrG06qkZx72vk0TqexyzHBymiNAuWIc= zuul@np0005538507.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Nov 28 02:03:12 localhost python3[18479]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCwbdeJV6VDDXadsSf2RG5X7kz/GTOF493/FPhPlXmY8LaEjIgaNVgahbrG06qkZx72vk0TqexyzHBymiNAuWIc= zuul@np0005538507.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Nov 28 02:03:14 localhost systemd[1]: session-7.scope: Deactivated successfully. Nov 28 02:03:14 localhost systemd-logind[764]: Session 7 logged out. Waiting for processes to exit. Nov 28 02:03:14 localhost systemd-logind[764]: Removed session 7. Nov 28 02:04:48 localhost sshd[18482]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:04:49 localhost systemd-logind[764]: New session 8 of user zuul. Nov 28 02:04:49 localhost systemd[1]: Started Session 8 of User zuul. 
Nov 28 02:04:49 localhost python3[18501]: ansible-authorized_key Invoked with user=root manage_dir=True key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCsnBivukZgTjr1SoC29hE3ofwUMxTaKeXh9gXvDwMJASbvK4q9943cbJ2j47GUf8sEgY38kkU/dxSMQWULl4d2oquIgZpJbJuXMU1WNxwGNSrS74OecQ3Or4VxTiDmu/HV83nIWHqfpDCra4DlrIBPPNwhBK4u0QYy87AJaML6NGEDaubbHgVCg1UpW1ho/sDoXptAehoCEaaeRz5tPHiXRnHpIXu44Sp8fRcyU9rBqdv+/lgachTcMYadsD2WBHIL+pptEDHB5TvQTDpnU58YdGFarn8uuGPP4t8H6xcqXbaJS9/oZa5Fb5Mh3vORBbR65jvlGg4PYGzCuI/xllY5+lGK7eyOleFyRqWKa2uAIaGoRBT4ZLKAssOFwCIaGfOAFFOBMkuylg4+MtbYiMJYRARPSRAufAROqhUDOo73y5lBrXh07aiWuSn8fU4mclWu+Xw382ryxW+XeHPc12d7S46TvGJaRvzsLtlyerRxGI77xOHRexq1Z/SFjOWLOwc= zuul-build-sshkey state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Nov 28 02:04:50 localhost python3[18517]: ansible-user Invoked with name=root state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005538513.novalocal update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None Nov 28 02:04:52 localhost python3[18567]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 02:04:52 localhost python3[18610]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764313491.7259095-133-223867814579/source dest=/root/.ssh/id_rsa mode=384 owner=root force=False _original_basename=4237e6bc560e46d9aa55417d911b2c55_id_rsa follow=False 
checksum=47f6a2f8fa426c1f34aad346f88073a22928af4e backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:04:53 localhost python3[18672]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 02:04:53 localhost python3[18715]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764313493.310652-219-167592154753351/source dest=/root/.ssh/id_rsa.pub mode=420 owner=root force=False _original_basename=4237e6bc560e46d9aa55417d911b2c55_id_rsa.pub follow=False checksum=d1f12d852c72cfefab089d88337552962cfbc93d backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:04:56 localhost python3[18745]: ansible-ansible.builtin.file Invoked with path=/etc/nodepool state=directory mode=0777 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:04:57 localhost python3[18791]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 02:04:57 localhost python3[18807]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes _original_basename=tmpzn48ue4i recurse=False state=file path=/etc/nodepool/sub_nodes force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:04:58 localhost python3[18867]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 02:04:58 localhost python3[18883]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes_private _original_basename=tmpdld9nvxw recurse=False state=file path=/etc/nodepool/sub_nodes_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:05:00 localhost python3[18943]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 02:05:00 localhost python3[18959]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/node_private _original_basename=tmpesfm0j2o recurse=False state=file path=/etc/nodepool/node_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:05:01 localhost systemd-logind[764]: Session 8 logged out. Waiting for processes to exit. Nov 28 02:05:01 localhost systemd[1]: session-8.scope: Deactivated successfully. Nov 28 02:05:01 localhost systemd[1]: session-8.scope: Consumed 3.638s CPU time. Nov 28 02:05:01 localhost systemd-logind[764]: Removed session 8. 
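A reading note on the ansible-ansible.legacy.copy entries above: Ansible logs numeric file modes in decimal, so `mode=384` on /root/.ssh/id_rsa is the familiar octal 0600, and `mode=420` on id_rsa.pub is 0644. A quick conversion check:

```shell
# Ansible's module logging prints numeric modes in decimal;
# convert back to octal to read them as permissions.
printf '%o\n' 384   # id_rsa      -> 600
printf '%o\n' 420   # id_rsa.pub  -> 644
```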
Nov 28 02:07:23 localhost sshd[18975]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:07:23 localhost systemd-logind[764]: New session 9 of user zuul. Nov 28 02:07:23 localhost systemd[1]: Started Session 9 of User zuul. Nov 28 02:07:23 localhost python3[19021]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 02:10:35 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 28 02:10:35 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 28 02:12:08 localhost sshd[19144]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:12:22 localhost sshd[19146]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:12:23 localhost systemd[1]: session-9.scope: Deactivated successfully. Nov 28 02:12:23 localhost systemd-logind[764]: Session 9 logged out. Waiting for processes to exit. Nov 28 02:12:23 localhost systemd-logind[764]: Removed session 9. 
Nov 28 02:12:34 localhost sshd[19151]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:12:46 localhost sshd[19153]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:12:58 localhost sshd[19155]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:13:10 localhost sshd[19157]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:13:21 localhost sshd[19159]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:13:34 localhost sshd[19162]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:13:46 localhost sshd[19164]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:13:58 localhost sshd[19166]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:14:00 localhost sshd[19168]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:14:09 localhost sshd[19170]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:14:21 localhost sshd[19172]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:14:33 localhost sshd[19174]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:14:45 localhost sshd[19176]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:14:57 localhost sshd[19178]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:15:08 localhost sshd[19180]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:15:21 localhost sshd[19182]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:15:33 localhost sshd[19184]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:15:44 localhost sshd[19187]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:15:56 localhost sshd[19189]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:16:08 localhost sshd[19191]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:16:20 localhost sshd[19193]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:16:32 localhost sshd[19195]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:16:44 localhost sshd[19197]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:16:56 localhost sshd[19199]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:17:08 localhost sshd[19201]: main: sshd: 
ssh-rsa algorithm is disabled Nov 28 02:17:20 localhost sshd[19203]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:17:31 localhost sshd[19205]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:17:44 localhost sshd[19207]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:17:55 localhost sshd[19209]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:18:08 localhost sshd[19211]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:18:20 localhost sshd[19214]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:18:32 localhost sshd[19216]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:18:44 localhost sshd[19218]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:18:56 localhost sshd[19220]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:19:08 localhost sshd[19222]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:19:21 localhost sshd[19224]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:19:32 localhost sshd[19226]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:19:39 localhost sshd[19230]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:19:39 localhost systemd-logind[764]: New session 10 of user zuul. Nov 28 02:19:39 localhost systemd[1]: Started Session 10 of User zuul. 
Nov 28 02:19:39 localhost python3[19247]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163ef9-e89a-f34b-e95a-00000000000c-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 02:19:41 localhost python3[19267]: ansible-ansible.legacy.command Invoked with _raw_params=yum clean all zuul_log_id=fa163ef9-e89a-f34b-e95a-00000000000d-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 02:19:44 localhost sshd[19271]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:19:46 localhost python3[19287]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-baseos-eus-rpms'] state=enabled purge=False Nov 28 02:19:48 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 28 02:19:49 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. 
Nov 28 02:19:56 localhost sshd[19481]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:20:08 localhost sshd[19491]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:20:20 localhost sshd[19493]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:20:32 localhost sshd[19495]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:20:43 localhost python3[19513]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-appstream-eus-rpms'] state=enabled purge=False Nov 28 02:20:44 localhost sshd[19515]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:20:46 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 28 02:20:46 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 28 02:20:54 localhost python3[19656]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-highavailability-eus-rpms'] state=enabled purge=False Nov 28 02:20:56 localhost sshd[19659]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:20:57 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 28 02:20:57 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 28 02:21:02 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 28 02:21:02 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. 
Nov 28 02:21:08 localhost sshd[19920]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:21:20 localhost sshd[19922]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:21:25 localhost python3[19939]: ansible-community.general.rhsm_repository Invoked with name=['fast-datapath-for-rhel-9-x86_64-rpms'] state=enabled purge=False Nov 28 02:21:28 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 28 02:21:28 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 28 02:21:32 localhost sshd[20126]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:21:33 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 28 02:21:33 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 28 02:21:44 localhost sshd[20321]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:21:56 localhost sshd[20323]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:21:57 localhost python3[20339]: ansible-community.general.rhsm_repository Invoked with name=['openstack-17.1-for-rhel-9-x86_64-rpms'] state=enabled purge=False Nov 28 02:21:59 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 28 02:22:00 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 28 02:22:04 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. 
Nov 28 02:22:09 localhost sshd[20597]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:22:20 localhost sshd[20605]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:22:30 localhost python3[20623]: ansible-ansible.legacy.command Invoked with _raw_params=yum repolist --enabled#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-f34b-e95a-000000000013-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 02:22:32 localhost sshd[20627]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:22:35 localhost python3[20644]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch', 'os-net-config', 'ansible-core'] state=present update_cache=True allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Nov 28 02:22:44 localhost sshd[20691]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:22:55 localhost kernel: SELinux: Converting 486 SID table entries... 
Nov 28 02:22:55 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 28 02:22:55 localhost kernel: SELinux: policy capability open_perms=1 Nov 28 02:22:55 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 28 02:22:55 localhost kernel: SELinux: policy capability always_check_network=0 Nov 28 02:22:55 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 28 02:22:55 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 28 02:22:55 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 28 02:22:55 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=4 res=1 Nov 28 02:22:55 localhost systemd[1]: Started daily update of the root trust anchor for DNSSEC. Nov 28 02:22:57 localhost sshd[20806]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:22:59 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Nov 28 02:22:59 localhost systemd[1]: Starting man-db-cache-update.service... Nov 28 02:22:59 localhost systemd[1]: Reloading. Nov 28 02:22:59 localhost systemd-sysv-generator[21313]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 02:22:59 localhost systemd-rc-local-generator[21307]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 02:22:59 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 02:22:59 localhost systemd[1]: Queuing reload/restart jobs for marked units… Nov 28 02:23:00 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Nov 28 02:23:00 localhost systemd[1]: Finished man-db-cache-update.service. 
Nov 28 02:23:00 localhost systemd[1]: run-rff007e7a76b34c89b61f49a553509fca.service: Deactivated successfully.
Nov 28 02:23:01 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 02:23:01 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Nov 28 02:23:09 localhost sshd[21929]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:23:20 localhost sshd[21931]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:23:27 localhost python3[21949]: ansible-ansible.legacy.command Invoked with _raw_params=ansible-galaxy collection install ansible.posix#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-f34b-e95a-000000000015-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 02:23:33 localhost sshd[21953]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:23:45 localhost sshd[21956]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:23:57 localhost sshd[21958]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:23:58 localhost python3[21974]: ansible-ansible.builtin.file Invoked with path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:23:59 localhost python3[22022]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/tripleo_config.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:23:59 localhost python3[22065]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764314638.924288-290-148855971992213/source dest=/etc/os-net-config/tripleo_config.yaml mode=None follow=False _original_basename=overcloud_net_config.j2 checksum=3358dfc6c6ce646155135d0cad900026cb34ba08 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:24:01 localhost python3[22096]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Nov 28 02:24:01 localhost systemd-journald[618]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 89.2 (297 of 333 items), suggesting rotation.
Nov 28 02:24:01 localhost systemd-journald[618]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 28 02:24:01 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 02:24:01 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 02:24:01 localhost python3[22117]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-20 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Nov 28 02:24:01 localhost python3[22137]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-21 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Nov 28 02:24:02 localhost python3[22157]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-22 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Nov 28 02:24:03 localhost python3[22177]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-23 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Nov 28 02:24:05 localhost python3[22197]: ansible-ansible.builtin.systemd Invoked with name=network state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 02:24:05 localhost systemd[1]: Starting LSB: Bring up/down networking...
Nov 28 02:24:05 localhost network[22200]: WARN : [network] You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 02:24:05 localhost network[22211]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 02:24:05 localhost network[22200]: WARN : [network] 'network-scripts' will be removed from distribution in near future.
Nov 28 02:24:05 localhost network[22212]: 'network-scripts' will be removed from distribution in near future.
Nov 28 02:24:05 localhost network[22200]: WARN : [network] It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 02:24:05 localhost network[22213]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 02:24:05 localhost NetworkManager[5967]: [1764314645.8113] audit: op="connections-reload" pid=22241 uid=0 result="success"
Nov 28 02:24:05 localhost network[22200]: Bringing up loopback interface: [ OK ]
Nov 28 02:24:06 localhost NetworkManager[5967]: [1764314646.0274] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth0" pid=22329 uid=0 result="success"
Nov 28 02:24:06 localhost network[22200]: Bringing up interface eth0: [ OK ]
Nov 28 02:24:06 localhost systemd[1]: Started LSB: Bring up/down networking.
Nov 28 02:24:06 localhost python3[22370]: ansible-ansible.builtin.systemd Invoked with name=openvswitch state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 02:24:07 localhost systemd[1]: Starting Open vSwitch Database Unit...
Nov 28 02:24:07 localhost chown[22374]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Nov 28 02:24:07 localhost ovs-ctl[22379]: /etc/openvswitch/conf.db does not exist ... (warning).
Nov 28 02:24:07 localhost ovs-ctl[22379]: Creating empty database /etc/openvswitch/conf.db [ OK ]
Nov 28 02:24:07 localhost ovs-ctl[22379]: Starting ovsdb-server [ OK ]
Nov 28 02:24:07 localhost ovs-vsctl[22428]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Nov 28 02:24:07 localhost ovs-vsctl[22448]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.6-141.el9fdp "external-ids:system-id=\"c85299c6-8e38-42c8-8509-2eaaf15c050c\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"rhel\"" "system-version=\"9.2\""
Nov 28 02:24:07 localhost ovs-ctl[22379]: Configuring Open vSwitch system IDs [ OK ]
Nov 28 02:24:07 localhost ovs-ctl[22379]: Enabling remote OVSDB managers [ OK ]
Nov 28 02:24:07 localhost systemd[1]: Started Open vSwitch Database Unit.
Nov 28 02:24:07 localhost ovs-vsctl[22454]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005538513.novalocal
Nov 28 02:24:07 localhost systemd[1]: Starting Open vSwitch Delete Transient Ports...
Nov 28 02:24:07 localhost systemd[1]: Finished Open vSwitch Delete Transient Ports.
Nov 28 02:24:07 localhost systemd[1]: Starting Open vSwitch Forwarding Unit...
Nov 28 02:24:08 localhost kernel: openvswitch: Open vSwitch switching datapath
Nov 28 02:24:08 localhost ovs-ctl[22498]: Inserting openvswitch module [ OK ]
Nov 28 02:24:08 localhost ovs-ctl[22467]: Starting ovs-vswitchd [ OK ]
Nov 28 02:24:08 localhost ovs-vsctl[22515]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005538513.novalocal
Nov 28 02:24:08 localhost ovs-ctl[22467]: Enabling remote OVSDB managers [ OK ]
Nov 28 02:24:08 localhost systemd[1]: Started Open vSwitch Forwarding Unit.
Nov 28 02:24:08 localhost systemd[1]: Starting Open vSwitch...
Nov 28 02:24:08 localhost systemd[1]: Finished Open vSwitch.
Nov 28 02:24:09 localhost sshd[22536]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:24:09 localhost python3[22535]: ansible-ansible.legacy.command Invoked with _raw_params=os-net-config -c /etc/os-net-config/tripleo_config.yaml#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-f34b-e95a-00000000001a-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 02:24:10 localhost NetworkManager[5967]: [1764314650.5745] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22694 uid=0 result="success"
Nov 28 02:24:10 localhost ifup[22695]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 02:24:10 localhost ifup[22696]: 'network-scripts' will be removed from distribution in near future.
Nov 28 02:24:10 localhost ifup[22697]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 02:24:10 localhost NetworkManager[5967]: [1764314650.6071] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22703 uid=0 result="success"
Nov 28 02:24:10 localhost ovs-vsctl[22705]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --may-exist add-br br-ex -- set bridge br-ex other-config:mac-table-size=50000 -- set bridge br-ex other-config:hwaddr=fa:16:3e:11:ad:79 -- set bridge br-ex fail_mode=standalone -- del-controller br-ex
Nov 28 02:24:10 localhost kernel: device ovs-system entered promiscuous mode
Nov 28 02:24:10 localhost systemd-udevd[22439]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 02:24:10 localhost NetworkManager[5967]: [1764314650.6362] manager: (ovs-system): new Generic device (/org/freedesktop/NetworkManager/Devices/4)
Nov 28 02:24:10 localhost kernel: Timeout policy base is empty
Nov 28 02:24:10 localhost kernel: Failed to associated timeout policy `ovs_test_tp'
Nov 28 02:24:10 localhost kernel: device br-ex entered promiscuous mode
Nov 28 02:24:10 localhost NetworkManager[5967]: [1764314650.6852] manager: (br-ex): new Generic device (/org/freedesktop/NetworkManager/Devices/5)
Nov 28 02:24:10 localhost NetworkManager[5967]: [1764314650.7115] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22730 uid=0 result="success"
Nov 28 02:24:10 localhost NetworkManager[5967]: [1764314650.7316] device (br-ex): carrier: link connected
Nov 28 02:24:13 localhost NetworkManager[5967]: [1764314653.7930] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22760 uid=0 result="success"
Nov 28 02:24:13 localhost NetworkManager[5967]: [1764314653.8399] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22775 uid=0 result="success"
Nov 28 02:24:13 localhost NET[22800]: /etc/sysconfig/network-scripts/ifup-post : updated /etc/resolv.conf
Nov 28 02:24:13 localhost NetworkManager[5967]: [1764314653.9280] device (eth1): state change: activated -> unmanaged (reason 'unmanaged', sys-iface-state: 'managed')
Nov 28 02:24:13 localhost NetworkManager[5967]: [1764314653.9422] dhcp4 (eth1): canceled DHCP transaction
Nov 28 02:24:13 localhost NetworkManager[5967]: [1764314653.9423] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 28 02:24:13 localhost NetworkManager[5967]: [1764314653.9423] dhcp4 (eth1): state changed no lease
Nov 28 02:24:13 localhost NetworkManager[5967]: [1764314653.9470] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22809 uid=0 result="success"
Nov 28 02:24:13 localhost ifup[22810]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 02:24:13 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 02:24:13 localhost ifup[22811]: 'network-scripts' will be removed from distribution in near future.
Nov 28 02:24:13 localhost ifup[22813]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 02:24:13 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 02:24:13 localhost NetworkManager[5967]: [1764314653.9860] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22828 uid=0 result="success"
Nov 28 02:24:14 localhost NetworkManager[5967]: [1764314654.0526] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22837 uid=0 result="success"
Nov 28 02:24:14 localhost NetworkManager[5967]: [1764314654.0592] device (eth1): carrier: link connected
Nov 28 02:24:14 localhost NetworkManager[5967]: [1764314654.0815] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22846 uid=0 result="success"
Nov 28 02:24:14 localhost ipv6_wait_tentative[22858]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state
Nov 28 02:24:15 localhost ipv6_wait_tentative[22863]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state
Nov 28 02:24:16 localhost NetworkManager[5967]: [1764314656.1526] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22872 uid=0 result="success"
Nov 28 02:24:16 localhost ovs-vsctl[22887]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex eth1 -- add-port br-ex eth1
Nov 28 02:24:16 localhost kernel: device eth1 entered promiscuous mode
Nov 28 02:24:16 localhost NetworkManager[5967]: [1764314656.2278] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22895 uid=0 result="success"
Nov 28 02:24:16 localhost ifup[22896]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 02:24:16 localhost ifup[22897]: 'network-scripts' will be removed from distribution in near future.
Nov 28 02:24:16 localhost ifup[22898]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 02:24:16 localhost NetworkManager[5967]: [1764314656.2593] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22904 uid=0 result="success"
Nov 28 02:24:16 localhost NetworkManager[5967]: [1764314656.3010] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22914 uid=0 result="success"
Nov 28 02:24:16 localhost ifup[22915]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 02:24:16 localhost ifup[22916]: 'network-scripts' will be removed from distribution in near future.
Nov 28 02:24:16 localhost ifup[22917]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 02:24:16 localhost NetworkManager[5967]: [1764314656.3321] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22923 uid=0 result="success"
Nov 28 02:24:16 localhost ovs-vsctl[22926]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal
Nov 28 02:24:16 localhost kernel: device vlan20 entered promiscuous mode
Nov 28 02:24:16 localhost NetworkManager[5967]: [1764314656.3747] manager: (vlan20): new Generic device (/org/freedesktop/NetworkManager/Devices/6)
Nov 28 02:24:16 localhost systemd-udevd[22928]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 02:24:16 localhost NetworkManager[5967]: [1764314656.3990] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22937 uid=0 result="success"
Nov 28 02:24:16 localhost NetworkManager[5967]: [1764314656.4204] device (vlan20): carrier: link connected
Nov 28 02:24:19 localhost NetworkManager[5967]: [1764314659.4736] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22967 uid=0 result="success"
Nov 28 02:24:19 localhost NetworkManager[5967]: [1764314659.5240] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22982 uid=0 result="success"
Nov 28 02:24:19 localhost NetworkManager[5967]: [1764314659.5836] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23003 uid=0 result="success"
Nov 28 02:24:19 localhost ifup[23004]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 02:24:19 localhost ifup[23005]: 'network-scripts' will be removed from distribution in near future.
Nov 28 02:24:19 localhost ifup[23006]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 02:24:19 localhost NetworkManager[5967]: [1764314659.6146] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23012 uid=0 result="success"
Nov 28 02:24:19 localhost ovs-vsctl[23015]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal
Nov 28 02:24:19 localhost systemd-udevd[23017]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 02:24:19 localhost kernel: device vlan23 entered promiscuous mode
Nov 28 02:24:19 localhost NetworkManager[5967]: [1764314659.6559] manager: (vlan23): new Generic device (/org/freedesktop/NetworkManager/Devices/7)
Nov 28 02:24:19 localhost NetworkManager[5967]: [1764314659.6821] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23027 uid=0 result="success"
Nov 28 02:24:19 localhost NetworkManager[5967]: [1764314659.7039] device (vlan23): carrier: link connected
Nov 28 02:24:22 localhost sshd[23048]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:24:22 localhost NetworkManager[5967]: [1764314662.7644] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23058 uid=0 result="success"
Nov 28 02:24:22 localhost NetworkManager[5967]: [1764314662.8115] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23073 uid=0 result="success"
Nov 28 02:24:22 localhost NetworkManager[5967]: [1764314662.8999] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23094 uid=0 result="success"
Nov 28 02:24:22 localhost ifup[23095]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 02:24:22 localhost ifup[23096]: 'network-scripts' will be removed from distribution in near future.
Nov 28 02:24:22 localhost ifup[23097]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 02:24:22 localhost NetworkManager[5967]: [1764314662.9347] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23103 uid=0 result="success"
Nov 28 02:24:22 localhost ovs-vsctl[23106]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal
Nov 28 02:24:22 localhost kernel: device vlan21 entered promiscuous mode
Nov 28 02:24:22 localhost NetworkManager[5967]: [1764314662.9676] manager: (vlan21): new Generic device (/org/freedesktop/NetworkManager/Devices/8)
Nov 28 02:24:22 localhost systemd-udevd[23108]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 02:24:22 localhost NetworkManager[5967]: [1764314662.9960] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23118 uid=0 result="success"
Nov 28 02:24:23 localhost NetworkManager[5967]: [1764314663.0188] device (vlan21): carrier: link connected
Nov 28 02:24:23 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 02:24:26 localhost NetworkManager[5967]: [1764314666.0709] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23149 uid=0 result="success"
Nov 28 02:24:26 localhost NetworkManager[5967]: [1764314666.1187] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23164 uid=0 result="success"
Nov 28 02:24:26 localhost NetworkManager[5967]: [1764314666.1835] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23185 uid=0 result="success"
Nov 28 02:24:26 localhost ifup[23186]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 02:24:26 localhost ifup[23187]: 'network-scripts' will be removed from distribution in near future.
Nov 28 02:24:26 localhost ifup[23188]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 02:24:26 localhost NetworkManager[5967]: [1764314666.2222] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23194 uid=0 result="success"
Nov 28 02:24:26 localhost ovs-vsctl[23197]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Nov 28 02:24:26 localhost kernel: device vlan22 entered promiscuous mode
Nov 28 02:24:26 localhost NetworkManager[5967]: [1764314666.2636] manager: (vlan22): new Generic device (/org/freedesktop/NetworkManager/Devices/9)
Nov 28 02:24:26 localhost systemd-udevd[23199]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 02:24:26 localhost NetworkManager[5967]: [1764314666.2933] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23209 uid=0 result="success"
Nov 28 02:24:26 localhost NetworkManager[5967]: [1764314666.3141] device (vlan22): carrier: link connected
Nov 28 02:24:29 localhost NetworkManager[5967]: [1764314669.3601] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23239 uid=0 result="success"
Nov 28 02:24:29 localhost NetworkManager[5967]: [1764314669.4064] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23254 uid=0 result="success"
Nov 28 02:24:29 localhost NetworkManager[5967]: [1764314669.4603] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23275 uid=0 result="success"
Nov 28 02:24:29 localhost ifup[23276]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 02:24:29 localhost ifup[23277]: 'network-scripts' will be removed from distribution in near future.
Nov 28 02:24:29 localhost ifup[23278]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 02:24:29 localhost NetworkManager[5967]: [1764314669.4905] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23284 uid=0 result="success"
Nov 28 02:24:29 localhost ovs-vsctl[23287]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal
Nov 28 02:24:29 localhost kernel: device vlan44 entered promiscuous mode
Nov 28 02:24:29 localhost NetworkManager[5967]: [1764314669.5199] manager: (vlan44): new Generic device (/org/freedesktop/NetworkManager/Devices/10)
Nov 28 02:24:29 localhost systemd-udevd[23290]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 02:24:29 localhost NetworkManager[5967]: [1764314669.5390] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23299 uid=0 result="success"
Nov 28 02:24:29 localhost NetworkManager[5967]: [1764314669.5556] device (vlan44): carrier: link connected
Nov 28 02:24:32 localhost NetworkManager[5967]: [1764314672.6068] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23329 uid=0 result="success"
Nov 28 02:24:32 localhost NetworkManager[5967]: [1764314672.6544] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23344 uid=0 result="success"
Nov 28 02:24:32 localhost NetworkManager[5967]: [1764314672.7155] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23365 uid=0 result="success"
Nov 28 02:24:32 localhost ifup[23366]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 02:24:32 localhost ifup[23367]: 'network-scripts' will be removed from distribution in near future.
Nov 28 02:24:32 localhost ifup[23368]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 02:24:32 localhost NetworkManager[5967]: [1764314672.7470] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23374 uid=0 result="success"
Nov 28 02:24:32 localhost ovs-vsctl[23377]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal
Nov 28 02:24:32 localhost NetworkManager[5967]: [1764314672.8010] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23384 uid=0 result="success"
Nov 28 02:24:33 localhost NetworkManager[5967]: [1764314673.8562] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23411 uid=0 result="success"
Nov 28 02:24:33 localhost NetworkManager[5967]: [1764314673.9006] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23426 uid=0 result="success"
Nov 28 02:24:33 localhost NetworkManager[5967]: [1764314673.9591] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23447 uid=0 result="success"
Nov 28 02:24:33 localhost ifup[23448]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 02:24:33 localhost ifup[23449]: 'network-scripts' will be removed from distribution in near future.
Nov 28 02:24:33 localhost ifup[23450]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 02:24:33 localhost NetworkManager[5967]: [1764314673.9906] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23456 uid=0 result="success"
Nov 28 02:24:34 localhost ovs-vsctl[23459]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal
Nov 28 02:24:34 localhost NetworkManager[5967]: [1764314674.0627] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23466 uid=0 result="success"
Nov 28 02:24:34 localhost sshd[23484]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:24:35 localhost NetworkManager[5967]: [1764314675.1236] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23495 uid=0 result="success"
Nov 28 02:24:35 localhost NetworkManager[5967]: [1764314675.1693] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23510 uid=0 result="success"
Nov 28 02:24:35 localhost NetworkManager[5967]: [1764314675.2299] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23531 uid=0 result="success"
Nov 28 02:24:35 localhost ifup[23532]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 02:24:35 localhost ifup[23533]: 'network-scripts' will be removed from distribution in near future.
Nov 28 02:24:35 localhost ifup[23534]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 02:24:35 localhost NetworkManager[5967]: [1764314675.2651] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23540 uid=0 result="success"
Nov 28 02:24:35 localhost ovs-vsctl[23543]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal
Nov 28 02:24:35 localhost NetworkManager[5967]: [1764314675.3239] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23550 uid=0 result="success"
Nov 28 02:24:36 localhost NetworkManager[5967]: [1764314676.3797] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23578 uid=0 result="success"
Nov 28 02:24:36 localhost NetworkManager[5967]: [1764314676.4269] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23593 uid=0 result="success"
Nov 28 02:24:36 localhost NetworkManager[5967]: [1764314676.4950] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23614 uid=0 result="success"
Nov 28 02:24:36 localhost ifup[23616]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 02:24:36 localhost ifup[23617]: 'network-scripts' will be removed from distribution in near future.
Nov 28 02:24:36 localhost ifup[23618]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 02:24:36 localhost NetworkManager[5967]: [1764314676.5306] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23624 uid=0 result="success"
Nov 28 02:24:36 localhost ovs-vsctl[23627]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal
Nov 28 02:24:36 localhost NetworkManager[5967]: [1764314676.5889] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23634 uid=0 result="success"
Nov 28 02:24:37 localhost NetworkManager[5967]: [1764314677.6500] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23662 uid=0 result="success"
Nov 28 02:24:37 localhost NetworkManager[5967]: [1764314677.6979] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23677 uid=0 result="success"
Nov 28 02:24:37 localhost NetworkManager[5967]: [1764314677.7654] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23698 uid=0 result="success"
Nov 28 02:24:37 localhost ifup[23699]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Nov 28 02:24:37 localhost ifup[23700]: 'network-scripts' will be removed from distribution in near future.
Nov 28 02:24:37 localhost ifup[23701]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Nov 28 02:24:37 localhost NetworkManager[5967]: [1764314677.7999] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23707 uid=0 result="success"
Nov 28 02:24:37 localhost ovs-vsctl[23710]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Nov 28 02:24:37 localhost NetworkManager[5967]: [1764314677.8609] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23717 uid=0 result="success"
Nov 28 02:24:38 localhost NetworkManager[5967]: [1764314678.9278] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23745 uid=0 result="success"
Nov 28 02:24:38 localhost NetworkManager[5967]: [1764314678.9792] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23760 uid=0 result="success"
Nov 28 02:24:46 localhost sshd[23778]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:24:58 localhost sshd[23780]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:25:10 localhost sshd[23782]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:25:22 localhost sshd[23784]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:25:32 localhost python3[23800]: ansible-ansible.legacy.command Invoked with _raw_params=ip a#012ping -c 2 -W 2 192.168.122.10#012ping -c 2 -W 2 192.168.122.11#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-f34b-e95a-00000000001b-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 02:25:35 localhost sshd[23806]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:25:37 localhost python3[23822]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCsnBivukZgTjr1SoC29hE3ofwUMxTaKeXh9gXvDwMJASbvK4q9943cbJ2j47GUf8sEgY38kkU/dxSMQWULl4d2oquIgZpJbJuXMU1WNxwGNSrS74OecQ3Or4VxTiDmu/HV83nIWHqfpDCra4DlrIBPPNwhBK4u0QYy87AJaML6NGEDaubbHgVCg1UpW1ho/sDoXptAehoCEaaeRz5tPHiXRnHpIXu44Sp8fRcyU9rBqdv+/lgachTcMYadsD2WBHIL+pptEDHB5TvQTDpnU58YdGFarn8uuGPP4t8H6xcqXbaJS9/oZa5Fb5Mh3vORBbR65jvlGg4PYGzCuI/xllY5+lGK7eyOleFyRqWKa2uAIaGoRBT4ZLKAssOFwCIaGfOAFFOBMkuylg4+MtbYiMJYRARPSRAufAROqhUDOo73y5lBrXh07aiWuSn8fU4mclWu+Xw382ryxW+XeHPc12d7S46TvGJaRvzsLtlyerRxGI77xOHRexq1Z/SFjOWLOwc= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 02:25:38 localhost python3[23838]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCsnBivukZgTjr1SoC29hE3ofwUMxTaKeXh9gXvDwMJASbvK4q9943cbJ2j47GUf8sEgY38kkU/dxSMQWULl4d2oquIgZpJbJuXMU1WNxwGNSrS74OecQ3Or4VxTiDmu/HV83nIWHqfpDCra4DlrIBPPNwhBK4u0QYy87AJaML6NGEDaubbHgVCg1UpW1ho/sDoXptAehoCEaaeRz5tPHiXRnHpIXu44Sp8fRcyU9rBqdv+/lgachTcMYadsD2WBHIL+pptEDHB5TvQTDpnU58YdGFarn8uuGPP4t8H6xcqXbaJS9/oZa5Fb5Mh3vORBbR65jvlGg4PYGzCuI/xllY5+lGK7eyOleFyRqWKa2uAIaGoRBT4ZLKAssOFwCIaGfOAFFOBMkuylg4+MtbYiMJYRARPSRAufAROqhUDOo73y5lBrXh07aiWuSn8fU4mclWu+Xw382ryxW+XeHPc12d7S46TvGJaRvzsLtlyerRxGI77xOHRexq1Z/SFjOWLOwc= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 02:25:39 localhost python3[23852]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCsnBivukZgTjr1SoC29hE3ofwUMxTaKeXh9gXvDwMJASbvK4q9943cbJ2j47GUf8sEgY38kkU/dxSMQWULl4d2oquIgZpJbJuXMU1WNxwGNSrS74OecQ3Or4VxTiDmu/HV83nIWHqfpDCra4DlrIBPPNwhBK4u0QYy87AJaML6NGEDaubbHgVCg1UpW1ho/sDoXptAehoCEaaeRz5tPHiXRnHpIXu44Sp8fRcyU9rBqdv+/lgachTcMYadsD2WBHIL+pptEDHB5TvQTDpnU58YdGFarn8uuGPP4t8H6xcqXbaJS9/oZa5Fb5Mh3vORBbR65jvlGg4PYGzCuI/xllY5+lGK7eyOleFyRqWKa2uAIaGoRBT4ZLKAssOFwCIaGfOAFFOBMkuylg4+MtbYiMJYRARPSRAufAROqhUDOo73y5lBrXh07aiWuSn8fU4mclWu+Xw382ryxW+XeHPc12d7S46TvGJaRvzsLtlyerRxGI77xOHRexq1Z/SFjOWLOwc= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 02:25:40 localhost python3[23868]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCsnBivukZgTjr1SoC29hE3ofwUMxTaKeXh9gXvDwMJASbvK4q9943cbJ2j47GUf8sEgY38kkU/dxSMQWULl4d2oquIgZpJbJuXMU1WNxwGNSrS74OecQ3Or4VxTiDmu/HV83nIWHqfpDCra4DlrIBPPNwhBK4u0QYy87AJaML6NGEDaubbHgVCg1UpW1ho/sDoXptAehoCEaaeRz5tPHiXRnHpIXu44Sp8fRcyU9rBqdv+/lgachTcMYadsD2WBHIL+pptEDHB5TvQTDpnU58YdGFarn8uuGPP4t8H6xcqXbaJS9/oZa5Fb5Mh3vORBbR65jvlGg4PYGzCuI/xllY5+lGK7eyOleFyRqWKa2uAIaGoRBT4ZLKAssOFwCIaGfOAFFOBMkuylg4+MtbYiMJYRARPSRAufAROqhUDOo73y5lBrXh07aiWuSn8fU4mclWu+Xw382ryxW+XeHPc12d7S46TvGJaRvzsLtlyerRxGI77xOHRexq1Z/SFjOWLOwc= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 02:25:41 localhost python3[23882]: ansible-ansible.builtin.slurp Invoked with path=/etc/hostname src=/etc/hostname
Nov 28 02:25:41 localhost python3[23897]: ansible-ansible.legacy.command Invoked with _raw_params=hostname="np0005538513.novalocal"#012hostname_str_array=(${hostname//./ })#012echo ${hostname_str_array[0]} > /home/zuul/ansible_hostname#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-f34b-e95a-000000000022-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 02:25:42 localhost python3[23917]: ansible-ansible.legacy.command Invoked with _raw_params=hostname=$(cat /home/zuul/ansible_hostname)#012hostnamectl hostname "$hostname.localdomain"#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-f34b-e95a-000000000023-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 02:25:42 localhost systemd[1]: Starting Hostname Service...
Nov 28 02:25:42 localhost systemd[1]: Started Hostname Service.
Nov 28 02:25:42 localhost systemd-hostnamed[23921]: Hostname set to (static)
Nov 28 02:25:42 localhost NetworkManager[5967]: [1764314742.8166] hostname: static hostname changed from "np0005538513.novalocal" to "np0005538513.localdomain"
Nov 28 02:25:42 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 02:25:42 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 02:25:44 localhost systemd[1]: session-10.scope: Deactivated successfully.
Nov 28 02:25:44 localhost systemd[1]: session-10.scope: Consumed 1min 44.334s CPU time.
Nov 28 02:25:44 localhost systemd-logind[764]: Session 10 logged out. Waiting for processes to exit.
Nov 28 02:25:44 localhost systemd-logind[764]: Removed session 10.
Nov 28 02:25:46 localhost sshd[23932]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:25:46 localhost systemd-logind[764]: New session 11 of user zuul.
Nov 28 02:25:46 localhost systemd[1]: Started Session 11 of User zuul.
Nov 28 02:25:47 localhost python3[23949]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname
Nov 28 02:25:47 localhost sshd[23950]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:25:49 localhost systemd[1]: session-11.scope: Deactivated successfully.
Nov 28 02:25:49 localhost systemd-logind[764]: Session 11 logged out. Waiting for processes to exit.
Nov 28 02:25:49 localhost systemd-logind[764]: Removed session 11.
Nov 28 02:25:52 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 02:25:59 localhost sshd[23953]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:26:11 localhost sshd[23955]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:26:12 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 28 02:26:23 localhost sshd[23959]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:26:26 localhost sshd[23961]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:26:26 localhost systemd-logind[764]: New session 12 of user zuul.
Nov 28 02:26:26 localhost systemd[1]: Started Session 12 of User zuul.
Nov 28 02:26:26 localhost python3[23980]: ansible-ansible.legacy.dnf Invoked with name=['lvm2', 'jq'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 02:26:28 localhost sshd[23982]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:26:28 localhost sshd[23983]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:26:29 localhost systemd[1]: Reloading.
Nov 28 02:26:30 localhost systemd-sysv-generator[24027]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 02:26:30 localhost systemd-rc-local-generator[24022]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 02:26:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 02:26:30 localhost systemd[1]: Starting dnf makecache...
Nov 28 02:26:30 localhost systemd[1]: Listening on Device-mapper event daemon FIFOs.
Nov 28 02:26:30 localhost systemd[1]: Reloading.
Nov 28 02:26:30 localhost dnf[24037]: Updating Subscription Management repositories.
Nov 28 02:26:30 localhost systemd-sysv-generator[24070]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 02:26:30 localhost systemd-rc-local-generator[24067]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 02:26:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 02:26:30 localhost systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Nov 28 02:26:30 localhost systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Nov 28 02:26:30 localhost systemd[1]: Reloading.
Nov 28 02:26:30 localhost systemd-rc-local-generator[24103]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 02:26:30 localhost systemd-sysv-generator[24106]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 02:26:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 02:26:30 localhost systemd[1]: Listening on LVM2 poll daemon socket.
Nov 28 02:26:31 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 02:26:31 localhost systemd[1]: Starting man-db-cache-update.service...
Nov 28 02:26:31 localhost systemd[1]: Reloading.
Nov 28 02:26:31 localhost systemd-sysv-generator[24171]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 02:26:31 localhost systemd-rc-local-generator[24165]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 02:26:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 02:26:31 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 02:26:31 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 02:26:31 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 02:26:31 localhost systemd[1]: Finished man-db-cache-update.service.
Nov 28 02:26:31 localhost systemd[1]: run-r3770fc8bd18e41ec8a8f6bcf3d2a9a9c.service: Deactivated successfully.
Nov 28 02:26:31 localhost systemd[1]: run-r3f41cd1c2eb74830b43e7fd95f8b8cc3.service: Deactivated successfully.
Nov 28 02:26:32 localhost dnf[24037]: Failed determining last makecache time.
Nov 28 02:26:32 localhost dnf[24037]: Red Hat OpenStack Platform 17.1 for RHEL 9 x86_ 51 kB/s | 4.0 kB 00:00
Nov 28 02:26:32 localhost dnf[24037]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS 39 kB/s | 4.1 kB 00:00
Nov 28 02:26:32 localhost dnf[24037]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS 41 kB/s | 4.1 kB 00:00
Nov 28 02:26:32 localhost dnf[24037]: Red Hat Enterprise Linux 9 for x86_64 - AppStre 46 kB/s | 4.5 kB 00:00
Nov 28 02:26:32 localhost dnf[24037]: Red Hat Enterprise Linux 9 for x86_64 - AppStre 42 kB/s | 4.5 kB 00:00
Nov 28 02:26:33 localhost dnf[24037]: Red Hat Enterprise Linux 9 for x86_64 - High Av 37 kB/s | 4.0 kB 00:00
Nov 28 02:26:33 localhost dnf[24037]: Fast Datapath for RHEL 9 x86_64 (RPMs) 42 kB/s | 4.0 kB 00:00
Nov 28 02:26:33 localhost dnf[24037]: Metadata cache created.
Nov 28 02:26:33 localhost systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 28 02:26:33 localhost systemd[1]: Finished dnf makecache.
Nov 28 02:26:33 localhost systemd[1]: dnf-makecache.service: Consumed 2.705s CPU time.
Nov 28 02:26:35 localhost sshd[24764]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:26:47 localhost sshd[24766]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:26:59 localhost sshd[24768]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:27:32 localhost systemd[1]: session-12.scope: Deactivated successfully.
Nov 28 02:27:32 localhost systemd[1]: session-12.scope: Consumed 4.535s CPU time.
Nov 28 02:27:32 localhost systemd-logind[764]: Session 12 logged out. Waiting for processes to exit.
Nov 28 02:27:32 localhost systemd-logind[764]: Removed session 12.
Nov 28 02:39:19 localhost sshd[24774]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:39:20 localhost sshd[24775]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:39:38 localhost sshd[24776]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:39:39 localhost sshd[24778]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:40:18 localhost sshd[24779]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:43:42 localhost sshd[24784]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:43:42 localhost systemd-logind[764]: New session 13 of user zuul.
Nov 28 02:43:42 localhost systemd[1]: Started Session 13 of User zuul.
Nov 28 02:43:43 localhost python3[24832]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 02:43:45 localhost python3[24918]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 02:43:48 localhost python3[24935]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 02:43:48 localhost python3[24953]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=7G#012losetup /dev/loop3 /var/lib/ceph-osd-0.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 02:43:49 localhost kernel: loop: module loaded
Nov 28 02:43:49 localhost kernel: loop3: detected capacity change from 0 to 14680064
Nov 28 02:43:49 localhost python3[24978]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3#012vgcreate ceph_vg0 /dev/loop3#012lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 02:43:49 localhost lvm[24981]: PV /dev/loop3 not used.
Nov 28 02:43:49 localhost lvm[24983]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 28 02:43:49 localhost systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Nov 28 02:43:49 localhost lvm[24992]: 1 logical volume(s) in volume group "ceph_vg0" now active
Nov 28 02:43:49 localhost systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Nov 28 02:43:50 localhost python3[25041]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:43:50 localhost python3[25084]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764315830.0177555-54708-186905177275905/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:43:51 localhost python3[25114]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 02:43:51 localhost systemd[1]: Reloading.
Nov 28 02:43:51 localhost systemd-rc-local-generator[25143]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 02:43:51 localhost systemd-sysv-generator[25147]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 02:43:51 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 02:43:51 localhost systemd[1]: Starting Ceph OSD losetup...
Nov 28 02:43:51 localhost bash[25155]: /dev/loop3: [64516]:8400144 (/var/lib/ceph-osd-0.img)
Nov 28 02:43:51 localhost systemd[1]: Finished Ceph OSD losetup.
Nov 28 02:43:52 localhost lvm[25156]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 28 02:43:52 localhost lvm[25156]: VG ceph_vg0 finished
Nov 28 02:43:52 localhost python3[25173]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 02:43:55 localhost python3[25190]: ansible-ansible.builtin.stat Invoked with path=/dev/loop4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 02:43:56 localhost python3[25206]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-1.img bs=1 count=0 seek=7G#012losetup /dev/loop4 /var/lib/ceph-osd-1.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 02:43:56 localhost kernel: loop4: detected capacity change from 0 to 14680064
Nov 28 02:43:56 localhost python3[25228]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop4#012vgcreate ceph_vg1 /dev/loop4#012lvcreate -n ceph_lv1 -l +100%FREE ceph_vg1#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 02:43:56 localhost lvm[25231]: PV /dev/loop4 not used.
Nov 28 02:43:56 localhost lvm[25233]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Nov 28 02:43:56 localhost systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg1.
Nov 28 02:43:56 localhost lvm[25239]: 1 logical volume(s) in volume group "ceph_vg1" now active
Nov 28 02:43:56 localhost lvm[25244]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Nov 28 02:43:56 localhost lvm[25244]: VG ceph_vg1 finished
Nov 28 02:43:56 localhost systemd[1]: lvm-activate-ceph_vg1.service: Deactivated successfully.
Nov 28 02:43:57 localhost python3[25292]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-1.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:43:57 localhost python3[25335]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764315837.220584-54792-11099961487404/source dest=/etc/systemd/system/ceph-osd-losetup-1.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=19612168ea279db4171b94ee1f8625de1ec44b58 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:43:58 localhost python3[25365]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-1.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 02:43:58 localhost systemd[1]: Reloading.
Nov 28 02:43:58 localhost systemd-rc-local-generator[25390]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 02:43:58 localhost systemd-sysv-generator[25395]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 02:43:58 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 02:43:58 localhost systemd[1]: Starting Ceph OSD losetup...
Nov 28 02:43:58 localhost bash[25406]: /dev/loop4: [64516]:8401550 (/var/lib/ceph-osd-1.img)
Nov 28 02:43:58 localhost systemd[1]: Finished Ceph OSD losetup.
Nov 28 02:43:58 localhost lvm[25407]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Nov 28 02:43:58 localhost lvm[25407]: VG ceph_vg1 finished
Nov 28 02:44:07 localhost python3[25453]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 02:44:08 localhost python3[25473]: ansible-hostname Invoked with name=np0005538513.localdomain use=None
Nov 28 02:44:08 localhost systemd[1]: Starting Hostname Service...
Nov 28 02:44:08 localhost systemd[1]: Started Hostname Service.
Nov 28 02:44:11 localhost python3[25496]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None
Nov 28 02:44:12 localhost python3[25544]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.l5sb_fgttmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:44:13 localhost python3[25574]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.l5sb_fgttmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:44:13 localhost python3[25590]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.l5sb_fgttmphosts insertbefore=BOF block=192.168.122.106 np0005538513.localdomain np0005538513#012192.168.122.106 np0005538513.ctlplane.localdomain np0005538513.ctlplane#012192.168.122.107 np0005538514.localdomain np0005538514#012192.168.122.107 np0005538514.ctlplane.localdomain np0005538514.ctlplane#012192.168.122.108 np0005538515.localdomain np0005538515#012192.168.122.108 np0005538515.ctlplane.localdomain np0005538515.ctlplane#012192.168.122.103 np0005538510.localdomain np0005538510#012192.168.122.103 np0005538510.ctlplane.localdomain np0005538510.ctlplane#012192.168.122.104 np0005538511.localdomain np0005538511#012192.168.122.104 np0005538511.ctlplane.localdomain np0005538511.ctlplane#012192.168.122.105 np0005538512.localdomain np0005538512#012192.168.122.105 np0005538512.ctlplane.localdomain np0005538512.ctlplane#012#012192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane#012 marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:44:14 localhost python3[25606]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.l5sb_fgttmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 02:44:14 localhost python3[25623]: ansible-file Invoked with path=/tmp/ansible.l5sb_fgttmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:44:16 localhost python3[25639]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 02:44:18 localhost python3[25657]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 02:44:22 localhost python3[25706]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:44:22 localhost python3[25751]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764315861.8813846-55736-173232186215563/source dest=/etc/chrony.conf owner=root group=root mode=420 follow=False _original_basename=chrony.conf.j2 checksum=4fd4fbbb2de00c70a54478b7feb8ef8adf6a3362 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:44:24 localhost python3[25781]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 02:44:24 localhost python3[25799]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 02:44:24 localhost chronyd[766]: chronyd exiting
Nov 28 02:44:24 localhost systemd[1]: Stopping NTP client/server...
Nov 28 02:44:24 localhost systemd[1]: chronyd.service: Deactivated successfully.
Nov 28 02:44:24 localhost systemd[1]: Stopped NTP client/server.
Nov 28 02:44:24 localhost systemd[1]: chronyd.service: Consumed 119ms CPU time, read 1.9M from disk, written 0B to disk.
Nov 28 02:44:24 localhost systemd[1]: Starting NTP client/server...
Nov 28 02:44:24 localhost chronyd[25806]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Nov 28 02:44:24 localhost chronyd[25806]: Frequency -30.600 +/- 0.236 ppm read from /var/lib/chrony/drift
Nov 28 02:44:24 localhost chronyd[25806]: Loaded seccomp filter (level 2)
Nov 28 02:44:24 localhost systemd[1]: Started NTP client/server.
Nov 28 02:44:26 localhost python3[25855]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:44:26 localhost python3[25898]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764315866.2825744-55884-269850070627229/source dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service follow=False checksum=d4d85e046d61f558ac7ec8178c6d529d893e81e1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:44:27 localhost python3[25928]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 02:44:27 localhost systemd[1]: Reloading.
Nov 28 02:44:27 localhost systemd-rc-local-generator[25953]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 02:44:27 localhost systemd-sysv-generator[25958]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 02:44:27 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 02:44:27 localhost systemd[1]: Reloading.
Nov 28 02:44:27 localhost systemd-rc-local-generator[25990]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 02:44:27 localhost systemd-sysv-generator[25995]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 02:44:27 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 02:44:28 localhost systemd[1]: Starting chronyd online sources service...
Nov 28 02:44:28 localhost chronyc[26004]: 200 OK
Nov 28 02:44:28 localhost systemd[1]: chrony-online.service: Deactivated successfully.
Nov 28 02:44:28 localhost systemd[1]: Finished chronyd online sources service.
Nov 28 02:44:28 localhost python3[26021]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 02:44:28 localhost chronyd[25806]: System clock was stepped by 0.000000 seconds
Nov 28 02:44:29 localhost chronyd[25806]: Selected source 162.159.200.1 (pool.ntp.org)
Nov 28 02:44:29 localhost python3[26038]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 02:44:38 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 28 02:44:39 localhost python3[26058]: ansible-timezone Invoked with name=UTC hwclock=None
Nov 28 02:44:39 localhost systemd[1]: Starting Time & Date Service...
Nov 28 02:44:39 localhost systemd[1]: Started Time & Date Service.
Nov 28 02:44:41 localhost python3[26078]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 02:44:41 localhost chronyd[25806]: chronyd exiting
Nov 28 02:44:41 localhost systemd[1]: Stopping NTP client/server...
Nov 28 02:44:41 localhost systemd[1]: chronyd.service: Deactivated successfully.
Nov 28 02:44:41 localhost systemd[1]: Stopped NTP client/server.
Nov 28 02:44:41 localhost systemd[1]: Starting NTP client/server...
Nov 28 02:44:41 localhost chronyd[26085]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Nov 28 02:44:41 localhost chronyd[26085]: Frequency -30.600 +/- 0.249 ppm read from /var/lib/chrony/drift
Nov 28 02:44:41 localhost chronyd[26085]: Loaded seccomp filter (level 2)
Nov 28 02:44:41 localhost systemd[1]: Started NTP client/server.
Nov 28 02:44:46 localhost chronyd[26085]: Selected source 174.138.193.90 (pool.ntp.org)
Nov 28 02:45:09 localhost systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 28 02:46:39 localhost sshd[26282]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:46:39 localhost systemd-logind[764]: New session 14 of user ceph-admin.
Nov 28 02:46:39 localhost systemd[1]: Created slice User Slice of UID 1002.
Nov 28 02:46:39 localhost systemd[1]: Starting User Runtime Directory /run/user/1002...
Nov 28 02:46:40 localhost systemd[1]: Finished User Runtime Directory /run/user/1002.
Nov 28 02:46:40 localhost systemd[1]: Starting User Manager for UID 1002...
Nov 28 02:46:40 localhost systemd[26286]: Queued start job for default target Main User Target.
Nov 28 02:46:40 localhost systemd[26286]: Created slice User Application Slice.
Nov 28 02:46:40 localhost systemd[26286]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 28 02:46:40 localhost systemd[26286]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 02:46:40 localhost systemd[26286]: Reached target Paths.
Nov 28 02:46:40 localhost systemd[26286]: Reached target Timers.
Nov 28 02:46:40 localhost systemd[26286]: Starting D-Bus User Message Bus Socket...
Nov 28 02:46:40 localhost systemd[26286]: Starting Create User's Volatile Files and Directories...
Nov 28 02:46:40 localhost sshd[26299]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:46:40 localhost systemd[26286]: Listening on D-Bus User Message Bus Socket.
Nov 28 02:46:40 localhost systemd[26286]: Reached target Sockets.
Nov 28 02:46:40 localhost systemd[26286]: Finished Create User's Volatile Files and Directories.
Nov 28 02:46:40 localhost systemd[26286]: Reached target Basic System.
Nov 28 02:46:40 localhost systemd[26286]: Reached target Main User Target.
Nov 28 02:46:40 localhost systemd[26286]: Startup finished in 118ms.
Nov 28 02:46:40 localhost systemd[1]: Started User Manager for UID 1002.
Nov 28 02:46:40 localhost systemd[1]: Started Session 14 of User ceph-admin.
Nov 28 02:46:40 localhost systemd-logind[764]: New session 16 of user ceph-admin.
Nov 28 02:46:40 localhost systemd[1]: Started Session 16 of User ceph-admin.
Nov 28 02:46:40 localhost sshd[26321]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:46:40 localhost systemd-logind[764]: New session 17 of user ceph-admin.
Nov 28 02:46:40 localhost systemd[1]: Started Session 17 of User ceph-admin.
Nov 28 02:46:40 localhost sshd[26340]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:46:41 localhost systemd-logind[764]: New session 18 of user ceph-admin.
Nov 28 02:46:41 localhost systemd[1]: Started Session 18 of User ceph-admin.
Nov 28 02:46:41 localhost sshd[26359]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:46:41 localhost systemd-logind[764]: New session 19 of user ceph-admin.
Nov 28 02:46:41 localhost systemd[1]: Started Session 19 of User ceph-admin.
Nov 28 02:46:41 localhost sshd[26378]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:46:41 localhost systemd-logind[764]: New session 20 of user ceph-admin.
Nov 28 02:46:41 localhost systemd[1]: Started Session 20 of User ceph-admin.
Nov 28 02:46:42 localhost sshd[26397]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:46:42 localhost systemd-logind[764]: New session 21 of user ceph-admin.
Nov 28 02:46:42 localhost systemd[1]: Started Session 21 of User ceph-admin.
Nov 28 02:46:42 localhost sshd[26416]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:46:42 localhost systemd-logind[764]: New session 22 of user ceph-admin.
Nov 28 02:46:42 localhost systemd[1]: Started Session 22 of User ceph-admin.
Nov 28 02:46:42 localhost sshd[26435]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:46:42 localhost systemd-logind[764]: New session 23 of user ceph-admin.
Nov 28 02:46:42 localhost systemd[1]: Started Session 23 of User ceph-admin.
Nov 28 02:46:43 localhost sshd[26454]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:46:43 localhost systemd-logind[764]: New session 24 of user ceph-admin.
Nov 28 02:46:43 localhost systemd[1]: Started Session 24 of User ceph-admin.
Nov 28 02:46:43 localhost sshd[26471]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:46:43 localhost systemd-logind[764]: New session 25 of user ceph-admin.
Nov 28 02:46:43 localhost systemd[1]: Started Session 25 of User ceph-admin.
Nov 28 02:46:44 localhost sshd[26490]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 02:46:44 localhost systemd-logind[764]: New session 26 of user ceph-admin.
Nov 28 02:46:44 localhost systemd[1]: Started Session 26 of User ceph-admin.
Nov 28 02:46:44 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 02:47:10 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 02:47:10 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 02:47:11 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 02:47:11 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 02:47:11 localhost systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 26705 (sysctl)
Nov 28 02:47:11 localhost systemd[1]: Mounting Arbitrary Executable File Formats File System...
Nov 28 02:47:11 localhost systemd[1]: Mounted Arbitrary Executable File Formats File System.
Nov 28 02:47:12 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 02:47:12 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 02:47:16 localhost kernel: VFS: idmapped mount is not enabled.
Nov 28 02:47:35 localhost podman[26843]:
Nov 28 02:47:35 localhost podman[26843]: 2025-11-28 07:47:35.544459656 +0000 UTC m=+22.458551621 container create 49d84c226d95e993c15293ea02b7fe971016efdfeadbf8407c7cf2b65927d060 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_proskuriakova, build-date=2025-09-24T08:57:55, version=7, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_CLEAN=True, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, release=553, vcs-type=git, maintainer=Guillaume Abrioux )
Nov 28 02:47:35 localhost systemd[1]: Created slice Slice /machine.
Nov 28 02:47:35 localhost systemd[1]: Started libpod-conmon-49d84c226d95e993c15293ea02b7fe971016efdfeadbf8407c7cf2b65927d060.scope.
Nov 28 02:47:35 localhost podman[26843]: 2025-11-28 07:47:13.132295829 +0000 UTC m=+0.046387814 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 02:47:35 localhost systemd[1]: Started libcrun container.
Nov 28 02:47:35 localhost podman[26843]: 2025-11-28 07:47:35.650666804 +0000 UTC m=+22.564758769 container init 49d84c226d95e993c15293ea02b7fe971016efdfeadbf8407c7cf2b65927d060 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_proskuriakova, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_CLEAN=True, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, RELEASE=main, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, build-date=2025-09-24T08:57:55, vcs-type=git, version=7, io.k8s.description=Red Hat Ceph Storage 7, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 02:47:35 localhost podman[26843]: 2025-11-28 07:47:35.66270331 +0000 UTC m=+22.576795275 container start 49d84c226d95e993c15293ea02b7fe971016efdfeadbf8407c7cf2b65927d060 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_proskuriakova, vcs-type=git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, release=553, architecture=x86_64, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.buildah.version=1.33.12, ceph=True, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., distribution-scope=public, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 02:47:35 localhost podman[26843]: 2025-11-28 07:47:35.662954098 +0000 UTC m=+22.577046063 container attach 49d84c226d95e993c15293ea02b7fe971016efdfeadbf8407c7cf2b65927d060 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_proskuriakova, distribution-scope=public, name=rhceph, CEPH_POINT_RELEASE=, architecture=x86_64, io.openshift.expose-services=, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, com.redhat.component=rhceph-container, RELEASE=main, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, version=7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, release=553, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 02:47:35 localhost intelligent_proskuriakova[26983]: 167 167
Nov 28 02:47:35 localhost systemd[1]: libpod-49d84c226d95e993c15293ea02b7fe971016efdfeadbf8407c7cf2b65927d060.scope: Deactivated successfully.
Nov 28 02:47:35 localhost podman[26843]: 2025-11-28 07:47:35.668791891 +0000 UTC m=+22.582883866 container died 49d84c226d95e993c15293ea02b7fe971016efdfeadbf8407c7cf2b65927d060 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_proskuriakova, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, release=553, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-type=git, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, distribution-scope=public, maintainer=Guillaume Abrioux )
Nov 28 02:47:35 localhost podman[26988]: 2025-11-28 07:47:35.76719957 +0000 UTC m=+0.083021189 container remove 49d84c226d95e993c15293ea02b7fe971016efdfeadbf8407c7cf2b65927d060 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_proskuriakova, RELEASE=main, distribution-scope=public, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, ceph=True, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.buildah.version=1.33.12)
Nov 28 02:47:35 localhost systemd[1]: libpod-conmon-49d84c226d95e993c15293ea02b7fe971016efdfeadbf8407c7cf2b65927d060.scope: Deactivated successfully.
Nov 28 02:47:35 localhost podman[27009]:
Nov 28 02:47:35 localhost podman[27009]: 2025-11-28 07:47:35.99565664 +0000 UTC m=+0.074397750 container create e80abf35ae6ce6ba378eadac13dbb99b64b52ca9e145357efe0ffb735f496fd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_williams, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_BRANCH=main, RELEASE=main, version=7, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, ceph=True, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc.)
Nov 28 02:47:36 localhost systemd[1]: Started libpod-conmon-e80abf35ae6ce6ba378eadac13dbb99b64b52ca9e145357efe0ffb735f496fd6.scope.
Nov 28 02:47:36 localhost systemd[1]: Started libcrun container.
Nov 28 02:47:36 localhost podman[27009]: 2025-11-28 07:47:35.965747478 +0000 UTC m=+0.044488588 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 02:47:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29f259534901738a5f4b97b654de74250a449d775673a37bdc56d10d23dcba1c/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 02:47:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29f259534901738a5f4b97b654de74250a449d775673a37bdc56d10d23dcba1c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 02:47:36 localhost podman[27009]: 2025-11-28 07:47:36.085374039 +0000 UTC m=+0.164115159 container init e80abf35ae6ce6ba378eadac13dbb99b64b52ca9e145357efe0ffb735f496fd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_williams, GIT_BRANCH=main, RELEASE=main, version=7, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vendor=Red Hat, Inc., name=rhceph, io.openshift.tags=rhceph ceph, release=553, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=)
Nov 28 02:47:36 localhost podman[27009]: 2025-11-28 07:47:36.094157412 +0000 UTC m=+0.172898532 container start e80abf35ae6ce6ba378eadac13dbb99b64b52ca9e145357efe0ffb735f496fd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_williams, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, ceph=True, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, name=rhceph, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, architecture=x86_64, version=7, RELEASE=main, CEPH_POINT_RELEASE=, distribution-scope=public, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 02:47:36 localhost podman[27009]: 2025-11-28 07:47:36.094492345 +0000 UTC m=+0.173233455 container attach e80abf35ae6ce6ba378eadac13dbb99b64b52ca9e145357efe0ffb735f496fd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_williams, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, ceph=True, RELEASE=main, vendor=Red Hat, Inc., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vcs-type=git, distribution-scope=public, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, release=553, version=7, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph)
Nov 28 02:47:36 localhost systemd[1]: var-lib-containers-storage-overlay-6422b05a9753c3f1dbb0caf61c729a218996922d76eb2d3d412dd44a585ed3b6-merged.mount: Deactivated successfully.
Nov 28 02:47:36 localhost cranky_williams[27024]: [
Nov 28 02:47:36 localhost cranky_williams[27024]: {
Nov 28 02:47:36 localhost cranky_williams[27024]: "available": false,
Nov 28 02:47:36 localhost cranky_williams[27024]: "ceph_device": false,
Nov 28 02:47:36 localhost cranky_williams[27024]: "device_id": "QEMU_DVD-ROM_QM00001",
Nov 28 02:47:36 localhost cranky_williams[27024]: "lsm_data": {},
Nov 28 02:47:36 localhost cranky_williams[27024]: "lvs": [],
Nov 28 02:47:36 localhost cranky_williams[27024]: "path": "/dev/sr0",
Nov 28 02:47:36 localhost cranky_williams[27024]: "rejected_reasons": [
Nov 28 02:47:36 localhost cranky_williams[27024]: "Has a FileSystem",
Nov 28 02:47:36 localhost cranky_williams[27024]: "Insufficient space (<5GB)"
Nov 28 02:47:36 localhost cranky_williams[27024]: ],
Nov 28 02:47:36 localhost cranky_williams[27024]: "sys_api": {
Nov 28 02:47:36 localhost cranky_williams[27024]: "actuators": null,
Nov 28 02:47:36 localhost cranky_williams[27024]: "device_nodes": "sr0",
Nov 28 02:47:36 localhost cranky_williams[27024]: "human_readable_size": "482.00 KB",
Nov 28 02:47:36 localhost cranky_williams[27024]: "id_bus": "ata",
Nov 28 02:47:36 localhost cranky_williams[27024]: "model": "QEMU DVD-ROM",
Nov 28 02:47:36 localhost cranky_williams[27024]: "nr_requests": "2",
Nov 28 02:47:36 localhost cranky_williams[27024]: "partitions": {},
Nov 28 02:47:36 localhost cranky_williams[27024]: "path": "/dev/sr0",
Nov 28 02:47:36 localhost cranky_williams[27024]: "removable": "1",
Nov 28 02:47:36 localhost cranky_williams[27024]: "rev": "2.5+",
Nov 28 02:47:36 localhost cranky_williams[27024]: "ro": "0",
Nov 28 02:47:36 localhost cranky_williams[27024]: "rotational": "1",
Nov 28 02:47:36 localhost cranky_williams[27024]: "sas_address": "",
Nov 28 02:47:36 localhost cranky_williams[27024]: "sas_device_handle": "",
Nov 28 02:47:36 localhost cranky_williams[27024]: "scheduler_mode": "mq-deadline",
Nov 28 02:47:36 localhost cranky_williams[27024]: "sectors": 0,
Nov 28 02:47:36 localhost cranky_williams[27024]: "sectorsize": "2048",
Nov 28 02:47:36 localhost cranky_williams[27024]: "size": 493568.0,
Nov 28 02:47:36 localhost cranky_williams[27024]: "support_discard": "0",
Nov 28 02:47:36 localhost cranky_williams[27024]: "type": "disk",
Nov 28 02:47:36 localhost cranky_williams[27024]: "vendor": "QEMU"
Nov 28 02:47:36 localhost cranky_williams[27024]: }
Nov 28 02:47:36 localhost cranky_williams[27024]: }
Nov 28 02:47:36 localhost cranky_williams[27024]: ]
Nov 28 02:47:36 localhost systemd[1]: libpod-e80abf35ae6ce6ba378eadac13dbb99b64b52ca9e145357efe0ffb735f496fd6.scope: Deactivated successfully.
Nov 28 02:47:36 localhost podman[27009]: 2025-11-28 07:47:36.889843116 +0000 UTC m=+0.968584216 container died e80abf35ae6ce6ba378eadac13dbb99b64b52ca9e145357efe0ffb735f496fd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_williams, ceph=True, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-type=git, RELEASE=main, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 02:47:36 localhost systemd[1]: var-lib-containers-storage-overlay-29f259534901738a5f4b97b654de74250a449d775673a37bdc56d10d23dcba1c-merged.mount: Deactivated successfully.
Nov 28 02:47:36 localhost podman[28234]: 2025-11-28 07:47:36.989760737 +0000 UTC m=+0.084803410 container remove e80abf35ae6ce6ba378eadac13dbb99b64b52ca9e145357efe0ffb735f496fd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_williams, name=rhceph, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux , GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, release=553, version=7, vendor=Red Hat, Inc.)
Nov 28 02:47:36 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 02:47:36 localhost systemd[1]: libpod-conmon-e80abf35ae6ce6ba378eadac13dbb99b64b52ca9e145357efe0ffb735f496fd6.scope: Deactivated successfully.
Nov 28 02:47:37 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 02:47:37 localhost systemd[1]: systemd-coredump.socket: Deactivated successfully.
Nov 28 02:47:37 localhost systemd[1]: Closed Process Core Dump Socket.
Nov 28 02:47:37 localhost systemd[1]: Stopping Process Core Dump Socket...
Nov 28 02:47:37 localhost systemd[1]: Listening on Process Core Dump Socket.
Nov 28 02:47:37 localhost systemd[1]: Reloading.
Nov 28 02:47:37 localhost systemd-rc-local-generator[28313]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 02:47:37 localhost systemd-sysv-generator[28317]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 02:47:37 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 02:47:37 localhost systemd[1]: Reloading.
Nov 28 02:47:37 localhost systemd-rc-local-generator[28352]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 02:47:37 localhost systemd-sysv-generator[28355]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 02:47:37 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 02:48:01 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 02:48:02 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 02:48:02 localhost podman[28438]:
Nov 28 02:48:02 localhost podman[28438]: 2025-11-28 07:48:02.209461019 +0000 UTC m=+0.038100620 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 02:48:05 localhost podman[28438]: 2025-11-28 07:48:05.376875744 +0000 UTC m=+3.205515285 container create f9fea4cc00b8bd6c2ca22841f14244101f9133134594667c167d29b1894ea569 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_tharp, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, version=7, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, architecture=x86_64, io.buildah.version=1.33.12, ceph=True, distribution-scope=public, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vcs-type=git, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., RELEASE=main, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph)
Nov 28 02:48:05 localhost systemd[1]: Started libpod-conmon-f9fea4cc00b8bd6c2ca22841f14244101f9133134594667c167d29b1894ea569.scope.
Nov 28 02:48:05 localhost systemd[1]: Started libcrun container.
Nov 28 02:48:05 localhost podman[28438]: 2025-11-28 07:48:05.745392148 +0000 UTC m=+3.574031669 container init f9fea4cc00b8bd6c2ca22841f14244101f9133134594667c167d29b1894ea569 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_tharp, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, ceph=True, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.openshift.tags=rhceph ceph, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, RELEASE=main, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, io.k8s.description=Red Hat Ceph Storage 7) Nov 28 02:48:05 localhost podman[28438]: 2025-11-28 07:48:05.753685619 +0000 UTC m=+3.582325170 container start f9fea4cc00b8bd6c2ca22841f14244101f9133134594667c167d29b1894ea569 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_tharp, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, version=7, io.buildah.version=1.33.12, release=553, maintainer=Guillaume Abrioux , architecture=x86_64, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_CLEAN=True, RELEASE=main, 
io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, ceph=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph) Nov 28 02:48:05 localhost podman[28438]: 2025-11-28 07:48:05.753884254 +0000 UTC m=+3.582523805 container attach f9fea4cc00b8bd6c2ca22841f14244101f9133134594667c167d29b1894ea569 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_tharp, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , architecture=x86_64, ceph=True, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, io.buildah.version=1.33.12, vendor=Red Hat, Inc., version=7, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True) Nov 28 02:48:05 localhost youthful_tharp[28543]: 167 167 Nov 28 02:48:05 localhost systemd[1]: libpod-f9fea4cc00b8bd6c2ca22841f14244101f9133134594667c167d29b1894ea569.scope: Deactivated 
successfully. Nov 28 02:48:05 localhost podman[28438]: 2025-11-28 07:48:05.756840529 +0000 UTC m=+3.585480100 container died f9fea4cc00b8bd6c2ca22841f14244101f9133134594667c167d29b1894ea569 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_tharp, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, io.openshift.expose-services=, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., version=7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, RELEASE=main, vcs-type=git, maintainer=Guillaume Abrioux , architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main) Nov 28 02:48:05 localhost systemd[1]: var-lib-containers-storage-overlay-2a8d697ae47ba7fd5fc07a53e3854d38b8540e57427d94fd8afb44ee011fa460-merged.mount: Deactivated successfully. 
Nov 28 02:48:05 localhost podman[28548]: 2025-11-28 07:48:05.861005845 +0000 UTC m=+0.089195998 container remove f9fea4cc00b8bd6c2ca22841f14244101f9133134594667c167d29b1894ea569 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_tharp, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, CEPH_POINT_RELEASE=, version=7, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_CLEAN=True, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, distribution-scope=public, build-date=2025-09-24T08:57:55) Nov 28 02:48:05 localhost systemd[1]: libpod-conmon-f9fea4cc00b8bd6c2ca22841f14244101f9133134594667c167d29b1894ea569.scope: Deactivated successfully. Nov 28 02:48:05 localhost systemd[1]: Reloading. Nov 28 02:48:06 localhost systemd-rc-local-generator[28587]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 02:48:06 localhost systemd-sysv-generator[28594]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 28 02:48:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 02:48:06 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Nov 28 02:48:06 localhost systemd[1]: Reloading. Nov 28 02:48:06 localhost systemd-rc-local-generator[28627]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 02:48:06 localhost systemd-sysv-generator[28632]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 02:48:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 02:48:06 localhost systemd[1]: Reached target All Ceph clusters and services. Nov 28 02:48:06 localhost systemd[1]: Reloading. Nov 28 02:48:06 localhost systemd-rc-local-generator[28665]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 02:48:06 localhost systemd-sysv-generator[28671]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 02:48:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 02:48:06 localhost systemd[1]: Reached target Ceph cluster 2c5417c9-00eb-57d5-a565-ddecbc7995c1. Nov 28 02:48:06 localhost systemd[1]: Reloading. Nov 28 02:48:06 localhost systemd-rc-local-generator[28706]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 28 02:48:06 localhost systemd-sysv-generator[28709]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 02:48:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 02:48:06 localhost systemd[1]: Reloading. Nov 28 02:48:06 localhost systemd-sysv-generator[28750]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 02:48:06 localhost systemd-rc-local-generator[28747]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 02:48:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 02:48:07 localhost systemd[1]: Created slice Slice /system/ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1. Nov 28 02:48:07 localhost systemd[1]: Reached target System Time Set. Nov 28 02:48:07 localhost systemd[1]: Reached target System Time Synchronized. Nov 28 02:48:07 localhost systemd[1]: Starting Ceph crash.np0005538513 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1... Nov 28 02:48:07 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Nov 28 02:48:07 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Nov 28 02:48:07 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. 
Nov 28 02:48:07 localhost podman[28808]: Nov 28 02:48:07 localhost podman[28808]: 2025-11-28 07:48:07.518027622 +0000 UTC m=+0.075893630 container create bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, GIT_BRANCH=main, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, version=7, com.redhat.component=rhceph-container, GIT_CLEAN=True, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, distribution-scope=public, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph) Nov 28 02:48:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86da5d0fe3bd11ba723935f9d08933124c41be8b02c1ee8a86209bcb5d7ec477/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:07 localhost podman[28808]: 2025-11-28 07:48:07.486474469 +0000 UTC m=+0.044340517 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 02:48:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86da5d0fe3bd11ba723935f9d08933124c41be8b02c1ee8a86209bcb5d7ec477/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 28 
02:48:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86da5d0fe3bd11ba723935f9d08933124c41be8b02c1ee8a86209bcb5d7ec477/merged/etc/ceph/ceph.client.crash.np0005538513.keyring supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:07 localhost podman[28808]: 2025-11-28 07:48:07.624286402 +0000 UTC m=+0.182152400 container init bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, CEPH_POINT_RELEASE=, architecture=x86_64, name=rhceph, RELEASE=main, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, distribution-scope=public, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., vcs-type=git, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.tags=rhceph ceph) Nov 28 02:48:07 localhost podman[28808]: 2025-11-28 07:48:07.634977673 +0000 UTC m=+0.192843671 container start bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, release=553, 
maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, version=7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, name=rhceph, RELEASE=main, io.openshift.expose-services=, vendor=Red Hat, Inc.) Nov 28 02:48:07 localhost bash[28808]: bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 Nov 28 02:48:07 localhost systemd[1]: Started Ceph crash.np0005538513 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1. 
Nov 28 02:48:07 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513[28822]: INFO:ceph-crash:pinging cluster to exercise our key Nov 28 02:48:07 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513[28822]: 2025-11-28T07:48:07.813+0000 7fe5f3232640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory Nov 28 02:48:07 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513[28822]: 2025-11-28T07:48:07.813+0000 7fe5f3232640 -1 AuthRegistry(0x7fe5ec0680d0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx Nov 28 02:48:07 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513[28822]: 2025-11-28T07:48:07.814+0000 7fe5f3232640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory Nov 28 02:48:07 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513[28822]: 2025-11-28T07:48:07.814+0000 7fe5f3232640 -1 AuthRegistry(0x7fe5f3231000) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx Nov 28 02:48:07 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513[28822]: 2025-11-28T07:48:07.821+0000 7fe5f0fa7640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1] Nov 28 02:48:07 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513[28822]: 2025-11-28T07:48:07.822+0000 7fe5ebfff640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1] Nov 28 02:48:07 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513[28822]: 2025-11-28T07:48:07.823+0000 7fe5f17a8640 -1 monclient(hunting): 
handle_auth_bad_method server allowed_methods [2] but i only support [1] Nov 28 02:48:07 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513[28822]: 2025-11-28T07:48:07.823+0000 7fe5f3232640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication Nov 28 02:48:07 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513[28822]: [errno 13] RADOS permission denied (error connecting to the cluster) Nov 28 02:48:07 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513[28822]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s Nov 28 02:48:17 localhost podman[28908]: Nov 28 02:48:17 localhost podman[28908]: 2025-11-28 07:48:17.481342182 +0000 UTC m=+0.107784940 container create bf7185081afeac295544e399612ccf401e6b802a7e42f41e752af54a7a116c49 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_merkle, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, vcs-type=git, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, name=rhceph, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, release=553, distribution-scope=public, description=Red Hat Ceph Storage 7, version=7, ceph=True, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 28 02:48:17 localhost podman[28908]: 2025-11-28 07:48:17.417785946 
+0000 UTC m=+0.044228714 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 02:48:17 localhost systemd[1]: Started libpod-conmon-bf7185081afeac295544e399612ccf401e6b802a7e42f41e752af54a7a116c49.scope. Nov 28 02:48:17 localhost systemd[1]: Started libcrun container. Nov 28 02:48:17 localhost podman[28908]: 2025-11-28 07:48:17.557556178 +0000 UTC m=+0.183998946 container init bf7185081afeac295544e399612ccf401e6b802a7e42f41e752af54a7a116c49 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_merkle, ceph=True, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, distribution-scope=public, GIT_CLEAN=True, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, architecture=x86_64, vcs-type=git) Nov 28 02:48:17 localhost podman[28908]: 2025-11-28 07:48:17.569085032 +0000 UTC m=+0.195527790 container start bf7185081afeac295544e399612ccf401e6b802a7e42f41e752af54a7a116c49 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_merkle, architecture=x86_64, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, version=7, vendor=Red Hat, Inc., GIT_CLEAN=True, RELEASE=main, ceph=True, 
io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, release=553, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 28 02:48:17 localhost podman[28908]: 2025-11-28 07:48:17.569382299 +0000 UTC m=+0.195825057 container attach bf7185081afeac295544e399612ccf401e6b802a7e42f41e752af54a7a116c49 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_merkle, distribution-scope=public, com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, build-date=2025-09-24T08:57:55, name=rhceph, io.openshift.tags=rhceph ceph, vcs-type=git, ceph=True, release=553, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, 
description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64) Nov 28 02:48:17 localhost systemd[1]: libpod-bf7185081afeac295544e399612ccf401e6b802a7e42f41e752af54a7a116c49.scope: Deactivated successfully. Nov 28 02:48:17 localhost hungry_merkle[28923]: 167 167 Nov 28 02:48:17 localhost podman[28908]: 2025-11-28 07:48:17.575253658 +0000 UTC m=+0.201696416 container died bf7185081afeac295544e399612ccf401e6b802a7e42f41e752af54a7a116c49 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_merkle, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, ceph=True, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.component=rhceph-container, GIT_CLEAN=True, RELEASE=main, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.openshift.expose-services=, architecture=x86_64) Nov 28 02:48:17 localhost podman[28928]: 2025-11-28 07:48:17.706828222 +0000 UTC m=+0.123324015 container remove bf7185081afeac295544e399612ccf401e6b802a7e42f41e752af54a7a116c49 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_merkle, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, io.buildah.version=1.33.12, io.openshift.expose-services=, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, version=7, GIT_CLEAN=True, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, release=553, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7) Nov 28 02:48:17 localhost systemd[1]: libpod-conmon-bf7185081afeac295544e399612ccf401e6b802a7e42f41e752af54a7a116c49.scope: Deactivated successfully. 
Nov 28 02:48:17 localhost podman[28949]: Nov 28 02:48:17 localhost podman[28949]: 2025-11-28 07:48:17.927381325 +0000 UTC m=+0.076431532 container create 02eb82e9f5e9ef1d122bc7ef8cddaccfec09553409cf5ff1a670cbc20764d88f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_herschel, com.redhat.component=rhceph-container, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, architecture=x86_64, vcs-type=git, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.openshift.expose-services=, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public) Nov 28 02:48:17 localhost systemd[1]: Started libpod-conmon-02eb82e9f5e9ef1d122bc7ef8cddaccfec09553409cf5ff1a670cbc20764d88f.scope. Nov 28 02:48:17 localhost podman[28949]: 2025-11-28 07:48:17.896734427 +0000 UTC m=+0.045784634 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 02:48:18 localhost systemd[1]: Started libcrun container. 
Nov 28 02:48:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50d1703faa0a1860e04ba8ea53f9495f250a5cfa90c05cef5d0e3689954afe1f/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50d1703faa0a1860e04ba8ea53f9495f250a5cfa90c05cef5d0e3689954afe1f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50d1703faa0a1860e04ba8ea53f9495f250a5cfa90c05cef5d0e3689954afe1f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50d1703faa0a1860e04ba8ea53f9495f250a5cfa90c05cef5d0e3689954afe1f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50d1703faa0a1860e04ba8ea53f9495f250a5cfa90c05cef5d0e3689954afe1f/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:18 localhost podman[28949]: 2025-11-28 07:48:18.074886194 +0000 UTC m=+0.223936391 container init 02eb82e9f5e9ef1d122bc7ef8cddaccfec09553409cf5ff1a670cbc20764d88f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_herschel, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, architecture=x86_64, RELEASE=main, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.openshift.expose-services=, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph 
Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, maintainer=Guillaume Abrioux , version=7, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 28 02:48:18 localhost podman[28949]: 2025-11-28 07:48:18.084524478 +0000 UTC m=+0.233574675 container start 02eb82e9f5e9ef1d122bc7ef8cddaccfec09553409cf5ff1a670cbc20764d88f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_herschel, release=553, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, version=7, io.openshift.tags=rhceph ceph, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., architecture=x86_64, RELEASE=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 28 02:48:18 localhost podman[28949]: 2025-11-28 07:48:18.084799396 +0000 UTC m=+0.233849593 container attach 02eb82e9f5e9ef1d122bc7ef8cddaccfec09553409cf5ff1a670cbc20764d88f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_herschel, 
io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, maintainer=Guillaume Abrioux , RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, name=rhceph, architecture=x86_64, vcs-type=git, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, distribution-scope=public)
Nov 28 02:48:18 localhost systemd[1]: var-lib-containers-storage-overlay-73afdf7678a16c8a3a284b3ca5dd53ad89410e36908734ee3b3355b4b2b859bd-merged.mount: Deactivated successfully.
Nov 28 02:48:18 localhost elated_herschel[28964]: --> passed data devices: 0 physical, 2 LVM
Nov 28 02:48:18 localhost elated_herschel[28964]: --> relative data size: 1.0
Nov 28 02:48:18 localhost elated_herschel[28964]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 28 02:48:18 localhost elated_herschel[28964]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new d7af0c01-7a1e-4708-8e50-081c55d3ecd3
Nov 28 02:48:19 localhost elated_herschel[28964]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 28 02:48:19 localhost lvm[29018]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 28 02:48:19 localhost lvm[29018]: VG ceph_vg0 finished
Nov 28 02:48:19 localhost elated_herschel[28964]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-2
Nov 28 02:48:19 localhost elated_herschel[28964]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Nov 28 02:48:19 localhost elated_herschel[28964]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 28 02:48:19 localhost elated_herschel[28964]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Nov 28 02:48:19 localhost elated_herschel[28964]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-2/activate.monmap
Nov 28 02:48:19 localhost elated_herschel[28964]: stderr: got monmap epoch 3
Nov 28 02:48:19 localhost elated_herschel[28964]: --> Creating keyring file for osd.2
Nov 28 02:48:19 localhost elated_herschel[28964]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/keyring
Nov 28 02:48:19 localhost elated_herschel[28964]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/
Nov 28 02:48:19 localhost elated_herschel[28964]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 2 --monmap /var/lib/ceph/osd/ceph-2/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-2/ --osd-uuid d7af0c01-7a1e-4708-8e50-081c55d3ecd3 --setuser ceph --setgroup ceph
Nov 28 02:48:22 localhost elated_herschel[28964]: stderr: 2025-11-28T07:48:19.783+0000 7fe0c21e3a80 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Nov 28 02:48:22 localhost elated_herschel[28964]: stderr: 2025-11-28T07:48:19.783+0000 7fe0c21e3a80 -1 bluestore(/var/lib/ceph/osd/ceph-2/) _read_fsid unparsable uuid
Nov 28 02:48:22 localhost elated_herschel[28964]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Nov 28 02:48:22 localhost elated_herschel[28964]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Nov 28 02:48:22 localhost elated_herschel[28964]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Nov 28 02:48:22 localhost elated_herschel[28964]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Nov 28 02:48:22 localhost elated_herschel[28964]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Nov 28 02:48:22 localhost elated_herschel[28964]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 28 02:48:22 localhost elated_herschel[28964]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Nov 28 02:48:22 localhost elated_herschel[28964]: --> ceph-volume lvm activate successful for osd ID: 2
Nov 28 02:48:22 localhost elated_herschel[28964]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Nov 28 02:48:22 localhost elated_herschel[28964]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 28 02:48:22 localhost elated_herschel[28964]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 59ca4283-ae21-42b7-993b-9e0e69e2fb94
Nov 28 02:48:23 localhost lvm[29963]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Nov 28 02:48:23 localhost lvm[29963]: VG ceph_vg1 finished
Nov 28 02:48:23 localhost elated_herschel[28964]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 28 02:48:23 localhost elated_herschel[28964]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-5
Nov 28 02:48:23 localhost elated_herschel[28964]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg1/ceph_lv1
Nov 28 02:48:23 localhost elated_herschel[28964]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Nov 28 02:48:23 localhost elated_herschel[28964]: Running command: /usr/bin/ln -s /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-5/block
Nov 28 02:48:23 localhost elated_herschel[28964]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-5/activate.monmap
Nov 28 02:48:23 localhost elated_herschel[28964]: stderr: got monmap epoch 3
Nov 28 02:48:23 localhost elated_herschel[28964]: --> Creating keyring file for osd.5
Nov 28 02:48:23 localhost elated_herschel[28964]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5/keyring
Nov 28 02:48:23 localhost elated_herschel[28964]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5/
Nov 28 02:48:23 localhost elated_herschel[28964]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 5 --monmap /var/lib/ceph/osd/ceph-5/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-5/ --osd-uuid 59ca4283-ae21-42b7-993b-9e0e69e2fb94 --setuser ceph --setgroup ceph
Nov 28 02:48:26 localhost elated_herschel[28964]: stderr: 2025-11-28T07:48:23.739+0000 7fc91224da80 -1 bluestore(/var/lib/ceph/osd/ceph-5//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Nov 28 02:48:26 localhost elated_herschel[28964]: stderr: 2025-11-28T07:48:23.739+0000 7fc91224da80 -1 bluestore(/var/lib/ceph/osd/ceph-5/) _read_fsid unparsable uuid
Nov 28 02:48:26 localhost elated_herschel[28964]: --> ceph-volume lvm prepare successful for: ceph_vg1/ceph_lv1
Nov 28 02:48:26 localhost elated_herschel[28964]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5
Nov 28 02:48:26 localhost elated_herschel[28964]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg1/ceph_lv1 --path /var/lib/ceph/osd/ceph-5 --no-mon-config
Nov 28 02:48:26 localhost elated_herschel[28964]: Running command: /usr/bin/ln -snf /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-5/block
Nov 28 02:48:26 localhost elated_herschel[28964]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-5/block
Nov 28 02:48:26 localhost elated_herschel[28964]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Nov 28 02:48:26 localhost elated_herschel[28964]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5
Nov 28 02:48:26 localhost elated_herschel[28964]: --> ceph-volume lvm activate successful for osd ID: 5
Nov 28 02:48:26 localhost elated_herschel[28964]: --> ceph-volume lvm create successful for: ceph_vg1/ceph_lv1
Nov 28 02:48:26 localhost systemd[1]: libpod-02eb82e9f5e9ef1d122bc7ef8cddaccfec09553409cf5ff1a670cbc20764d88f.scope: Deactivated successfully.
Nov 28 02:48:26 localhost systemd[1]: libpod-02eb82e9f5e9ef1d122bc7ef8cddaccfec09553409cf5ff1a670cbc20764d88f.scope: Consumed 3.937s CPU time.
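The ceph-volume trace above interleaves "Running command:" entries with stderr output from the tools it invokes. A minimal sketch of pulling just the executed commands out of such a journal excerpt (the regex and the `extract_commands` helper are illustrative, not part of any Ceph or journald tooling):

```python
import re

# Matches the command path that ceph-volume logs after "Running command: ".
CMD_RE = re.compile(r"Running command: (?P<cmd>/\S+.*)$")

def extract_commands(lines):
    """Return the command strings found in a journal excerpt, in order."""
    cmds = []
    for line in lines:
        m = CMD_RE.search(line)
        if m:
            cmds.append(m.group("cmd").strip())
    return cmds

# Sample lines taken from the trace above.
sample = [
    "Nov 28 02:48:19 localhost elated_herschel[28964]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-2",
    "Nov 28 02:48:19 localhost elated_herschel[28964]: stderr: got monmap epoch 3",
    "Nov 28 02:48:22 localhost elated_herschel[28964]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0",
]
print(extract_commands(sample))
```

Filtering this way separates the prepare/activate command sequence from progress arrows and stderr noise such as the benign _read_bdev_label messages on freshly created LVs.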
Nov 28 02:48:26 localhost podman[30878]: 2025-11-28 07:48:26.465871441 +0000 UTC m=+0.055648145 container died 02eb82e9f5e9ef1d122bc7ef8cddaccfec09553409cf5ff1a670cbc20764d88f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_herschel, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.expose-services=, com.redhat.component=rhceph-container, ceph=True, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, distribution-scope=public, io.buildah.version=1.33.12, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, architecture=x86_64, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 28 02:48:26 localhost systemd[1]: var-lib-containers-storage-overlay-50d1703faa0a1860e04ba8ea53f9495f250a5cfa90c05cef5d0e3689954afe1f-merged.mount: Deactivated successfully. 
Nov 28 02:48:26 localhost podman[30878]: 2025-11-28 07:48:26.505555341 +0000 UTC m=+0.095332015 container remove 02eb82e9f5e9ef1d122bc7ef8cddaccfec09553409cf5ff1a670cbc20764d88f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_herschel, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, name=rhceph, architecture=x86_64, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, distribution-scope=public, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, GIT_CLEAN=True, vcs-type=git, com.redhat.component=rhceph-container) Nov 28 02:48:26 localhost systemd[1]: libpod-conmon-02eb82e9f5e9ef1d122bc7ef8cddaccfec09553409cf5ff1a670cbc20764d88f.scope: Deactivated successfully. 
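The elated_herschel container above runs through podman's full short-lived lifecycle in the journal: create, init, start, attach, died, remove. A hedged sketch of grouping those events per container ID (the pattern and the `lifecycle` helper are illustrative; real parsing might use `podman events --format json` instead):

```python
import re

# Podman journal messages embed "container <event> <64-hex-char id> (labels...)".
EVENT_RE = re.compile(
    r"container (?P<event>create|init|start|attach|died|remove) (?P<cid>[0-9a-f]{64})"
)

def lifecycle(lines):
    """Map each container ID to the ordered list of events seen for it."""
    history = {}
    for line in lines:
        m = EVENT_RE.search(line)
        if m:
            history.setdefault(m.group("cid"), []).append(m.group("event"))
    return history

cid = "02eb82e9f5e9ef1d122bc7ef8cddaccfec09553409cf5ff1a670cbc20764d88f"
sample = [
    f"podman[28949]: ... container start {cid} (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest)",
    f"podman[30878]: ... container died {cid} (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest)",
    f"podman[30878]: ... container remove {cid} (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest)",
]
print(lifecycle(sample))
```

A complete start-to-remove history like this one indicates a one-shot helper container (cephadm's usual pattern), not a crashed daemon.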
Nov 28 02:48:27 localhost podman[30959]: Nov 28 02:48:27 localhost podman[30959]: 2025-11-28 07:48:27.321009271 +0000 UTC m=+0.077496500 container create bd356341998c924174850a354b0a1e9d9201c6b99d6d3d9c6780a4d43aead830 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_feynman, ceph=True, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, name=rhceph, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_CLEAN=True, vendor=Red Hat, Inc., release=553, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, io.openshift.expose-services=) Nov 28 02:48:27 localhost systemd[1]: Started libpod-conmon-bd356341998c924174850a354b0a1e9d9201c6b99d6d3d9c6780a4d43aead830.scope. Nov 28 02:48:27 localhost systemd[1]: Started libcrun container. 
Nov 28 02:48:27 localhost podman[30959]: 2025-11-28 07:48:27.383796116 +0000 UTC m=+0.140283345 container init bd356341998c924174850a354b0a1e9d9201c6b99d6d3d9c6780a4d43aead830 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_feynman, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, ceph=True, CEPH_POINT_RELEASE=, architecture=x86_64, version=7, maintainer=Guillaume Abrioux , GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, vcs-type=git, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 28 02:48:27 localhost podman[30959]: 2025-11-28 07:48:27.291132542 +0000 UTC m=+0.047619771 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 02:48:27 localhost podman[30959]: 2025-11-28 07:48:27.394120789 +0000 UTC m=+0.150607978 container start bd356341998c924174850a354b0a1e9d9201c6b99d6d3d9c6780a4d43aead830 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_feynman, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.k8s.display-name=Red Hat 
Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, vcs-type=git, release=553, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, distribution-scope=public, version=7, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container) Nov 28 02:48:27 localhost podman[30959]: 2025-11-28 07:48:27.394325384 +0000 UTC m=+0.150812623 container attach bd356341998c924174850a354b0a1e9d9201c6b99d6d3d9c6780a4d43aead830 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_feynman, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, distribution-scope=public, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., version=7, release=553) Nov 28 02:48:27 localhost 
vigilant_feynman[30974]: 167 167 Nov 28 02:48:27 localhost systemd[1]: libpod-bd356341998c924174850a354b0a1e9d9201c6b99d6d3d9c6780a4d43aead830.scope: Deactivated successfully. Nov 28 02:48:27 localhost podman[30959]: 2025-11-28 07:48:27.397543346 +0000 UTC m=+0.154030615 container died bd356341998c924174850a354b0a1e9d9201c6b99d6d3d9c6780a4d43aead830 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_feynman, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, ceph=True, version=7, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, distribution-scope=public, com.redhat.component=rhceph-container, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc.) Nov 28 02:48:27 localhost systemd[1]: var-lib-containers-storage-overlay-f71a6cc3ce445fa13a64367be4316221fb9076df268ac8fa76587aa61b9ba1b1-merged.mount: Deactivated successfully. 
Nov 28 02:48:27 localhost podman[30979]: 2025-11-28 07:48:27.497150006 +0000 UTC m=+0.088474288 container remove bd356341998c924174850a354b0a1e9d9201c6b99d6d3d9c6780a4d43aead830 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_feynman, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_BRANCH=main, distribution-scope=public, RELEASE=main, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.buildah.version=1.33.12, version=7, release=553) Nov 28 02:48:27 localhost systemd[1]: libpod-conmon-bd356341998c924174850a354b0a1e9d9201c6b99d6d3d9c6780a4d43aead830.scope: Deactivated successfully. 
Nov 28 02:48:27 localhost podman[31001]: Nov 28 02:48:27 localhost podman[31001]: 2025-11-28 07:48:27.690626314 +0000 UTC m=+0.080293542 container create 14068e5fdcbf2ee0e6fc31c9cf8063474ef34fb81f29fbb4efc5e7860feaabcc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_blackwell, io.k8s.description=Red Hat Ceph Storage 7, version=7, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , GIT_BRANCH=main, RELEASE=main, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., ceph=True, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 28 02:48:27 localhost systemd[1]: Started libpod-conmon-14068e5fdcbf2ee0e6fc31c9cf8063474ef34fb81f29fbb4efc5e7860feaabcc.scope. Nov 28 02:48:27 localhost systemd[1]: Started libcrun container. 
Nov 28 02:48:27 localhost podman[31001]: 2025-11-28 07:48:27.657266976 +0000 UTC m=+0.046934214 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 02:48:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87cfa97b9ed18e3b802e5c3381ad0090efe7f6ccd40f1a65736a9319eba9f3a6/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87cfa97b9ed18e3b802e5c3381ad0090efe7f6ccd40f1a65736a9319eba9f3a6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87cfa97b9ed18e3b802e5c3381ad0090efe7f6ccd40f1a65736a9319eba9f3a6/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:27 localhost podman[31001]: 2025-11-28 07:48:27.830322012 +0000 UTC m=+0.219989230 container init 14068e5fdcbf2ee0e6fc31c9cf8063474ef34fb81f29fbb4efc5e7860feaabcc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_blackwell, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.openshift.tags=rhceph ceph, RELEASE=main, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, vendor=Red Hat, Inc., ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, architecture=x86_64, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, release=553, 
CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7) Nov 28 02:48:27 localhost podman[31001]: 2025-11-28 07:48:27.841436865 +0000 UTC m=+0.231104093 container start 14068e5fdcbf2ee0e6fc31c9cf8063474ef34fb81f29fbb4efc5e7860feaabcc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_blackwell, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.buildah.version=1.33.12, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, name=rhceph, RELEASE=main, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_CLEAN=True) Nov 28 02:48:27 localhost podman[31001]: 2025-11-28 07:48:27.841784974 +0000 UTC m=+0.231452202 container attach 14068e5fdcbf2ee0e6fc31c9cf8063474ef34fb81f29fbb4efc5e7860feaabcc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_blackwell, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, release=553, 
architecture=x86_64, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, vcs-type=git, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, CEPH_POINT_RELEASE=, version=7, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: {
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "2": [
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: {
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "devices": [
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "/dev/loop3"
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: ],
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "lv_name": "ceph_lv0",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "lv_path": "/dev/ceph_vg0/ceph_lv0",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "lv_size": "7511998464",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PBfNQH-dddB-qx5q-D9S0-20Aj-jIGQ-Q1V216,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=2c5417c9-00eb-57d5-a565-ddecbc7995c1,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=d7af0c01-7a1e-4708-8e50-081c55d3ecd3,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "lv_uuid": "PBfNQH-dddB-qx5q-D9S0-20Aj-jIGQ-Q1V216",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "name": "ceph_lv0",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "path": "/dev/ceph_vg0/ceph_lv0",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "tags": {
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "ceph.block_uuid": "PBfNQH-dddB-qx5q-D9S0-20Aj-jIGQ-Q1V216",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "ceph.cephx_lockbox_secret": "",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "ceph.cluster_fsid": "2c5417c9-00eb-57d5-a565-ddecbc7995c1",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "ceph.cluster_name": "ceph",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "ceph.crush_device_class": "",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "ceph.encrypted": "0",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "ceph.osd_fsid": "d7af0c01-7a1e-4708-8e50-081c55d3ecd3",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "ceph.osd_id": "2",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "ceph.osdspec_affinity": "default_drive_group",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "ceph.type": "block",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "ceph.vdo": "0"
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: },
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "type": "block",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "vg_name": "ceph_vg0"
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: }
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: ],
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "5": [
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: {
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "devices": [
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "/dev/loop4"
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: ],
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "lv_name": "ceph_lv1",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "lv_path": "/dev/ceph_vg1/ceph_lv1",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "lv_size": "7511998464",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=L6wfVH-D1iV-h6TJ-Yiyz-6IQS-j2Vx-XDZY4Q,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=2c5417c9-00eb-57d5-a565-ddecbc7995c1,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=59ca4283-ae21-42b7-993b-9e0e69e2fb94,ceph.osd_id=5,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "lv_uuid": "L6wfVH-D1iV-h6TJ-Yiyz-6IQS-j2Vx-XDZY4Q",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "name": "ceph_lv1",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "path": "/dev/ceph_vg1/ceph_lv1",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "tags": {
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "ceph.block_uuid": "L6wfVH-D1iV-h6TJ-Yiyz-6IQS-j2Vx-XDZY4Q",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "ceph.cephx_lockbox_secret": "",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "ceph.cluster_fsid": "2c5417c9-00eb-57d5-a565-ddecbc7995c1",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "ceph.cluster_name": "ceph",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "ceph.crush_device_class": "",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "ceph.encrypted": "0",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "ceph.osd_fsid": "59ca4283-ae21-42b7-993b-9e0e69e2fb94",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "ceph.osd_id": "5",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "ceph.osdspec_affinity": "default_drive_group",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "ceph.type": "block",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "ceph.vdo": "0"
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: },
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "type": "block",
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: "vg_name": "ceph_vg1"
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: }
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: ]
Nov 28 02:48:28 localhost wizardly_blackwell[31017]: }
Nov 28 02:48:28 localhost systemd[1]: libpod-14068e5fdcbf2ee0e6fc31c9cf8063474ef34fb81f29fbb4efc5e7860feaabcc.scope: Deactivated successfully.
Nov 28 02:48:28 localhost podman[31001]: 2025-11-28 07:48:28.188403682 +0000 UTC m=+0.578070940 container died 14068e5fdcbf2ee0e6fc31c9cf8063474ef34fb81f29fbb4efc5e7860feaabcc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_blackwell, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, version=7, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux , distribution-scope=public, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, ceph=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, name=rhceph, vcs-type=git, io.buildah.version=1.33.12, GIT_BRANCH=main, vendor=Red Hat, Inc.)
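The JSON emitted by the wizardly_blackwell container (apparently `ceph-volume lvm list` output in JSON form) maps each OSD id to its logical volume, and `lv_tags` flattens the same metadata that the `tags` object holds into a comma-separated `key=value` string. A small sketch of both views, using an abbreviated excerpt of the payload above (`parse_tags` is a hypothetical helper, not Ceph API):

```python
import json

# Abbreviated excerpt of the ceph-volume lvm list JSON shown in the log.
payload = json.loads("""
{
  "2": [{"lv_path": "/dev/ceph_vg0/ceph_lv0",
         "lv_tags": "ceph.cluster_name=ceph,ceph.osd_fsid=d7af0c01-7a1e-4708-8e50-081c55d3ecd3,ceph.osd_id=2,ceph.type=block"}],
  "5": [{"lv_path": "/dev/ceph_vg1/ceph_lv1",
         "lv_tags": "ceph.cluster_name=ceph,ceph.osd_fsid=59ca4283-ae21-42b7-993b-9e0e69e2fb94,ceph.osd_id=5,ceph.type=block"}]
}
""")

def parse_tags(lv_tags):
    """Split the flat lv_tags string back into a dict, like the "tags" object."""
    return dict(pair.split("=", 1) for pair in lv_tags.split(","))

# OSD id -> logical volume path, matching the osd.2 / osd.5 creation above.
osd_to_lv = {osd_id: vols[0]["lv_path"] for osd_id, vols in payload.items()}
print(osd_to_lv)
print(parse_tags(payload["2"][0]["lv_tags"])["ceph.osd_fsid"])
```

The `ceph.osd_fsid` values recovered this way match the `--osd-uuid` arguments passed to ceph-osd during mkfs earlier in the log.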
Nov 28 02:48:28 localhost podman[31026]: 2025-11-28 07:48:28.270769335 +0000 UTC m=+0.069241190 container remove 14068e5fdcbf2ee0e6fc31c9cf8063474ef34fb81f29fbb4efc5e7860feaabcc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_blackwell, release=553, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., RELEASE=main, description=Red Hat Ceph Storage 7, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , architecture=x86_64, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, distribution-scope=public, io.openshift.expose-services=, GIT_BRANCH=main)
Nov 28 02:48:28 localhost systemd[1]: libpod-conmon-14068e5fdcbf2ee0e6fc31c9cf8063474ef34fb81f29fbb4efc5e7860feaabcc.scope: Deactivated successfully.
Nov 28 02:48:28 localhost systemd[1]: var-lib-containers-storage-overlay-87cfa97b9ed18e3b802e5c3381ad0090efe7f6ccd40f1a65736a9319eba9f3a6-merged.mount: Deactivated successfully.
Nov 28 02:48:28 localhost podman[31113]:
Nov 28 02:48:29 localhost podman[31113]: 2025-11-28 07:48:29.010997485 +0000 UTC m=+0.056484726 container create c938fc14255fc378430f225c5b853dd5df6dc5f431aed9f58bfbb35dfed312e6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_sammet, CEPH_POINT_RELEASE=, ceph=True, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, RELEASE=main, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , version=7, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main)
Nov 28 02:48:29 localhost podman[31113]: 2025-11-28 07:48:28.988843182 +0000 UTC m=+0.034330423 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 02:48:29 localhost systemd[1]: Started libpod-conmon-c938fc14255fc378430f225c5b853dd5df6dc5f431aed9f58bfbb35dfed312e6.scope.
Nov 28 02:48:29 localhost systemd[1]: Started libcrun container.
Nov 28 02:48:29 localhost podman[31113]: 2025-11-28 07:48:29.671637482 +0000 UTC m=+0.717124693 container init c938fc14255fc378430f225c5b853dd5df6dc5f431aed9f58bfbb35dfed312e6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_sammet, RELEASE=main, version=7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_CLEAN=True, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux )
Nov 28 02:48:29 localhost podman[31113]: 2025-11-28 07:48:29.679230784 +0000 UTC m=+0.724717995 container start c938fc14255fc378430f225c5b853dd5df6dc5f431aed9f58bfbb35dfed312e6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_sammet, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, architecture=x86_64, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, name=rhceph, GIT_BRANCH=main, release=553)
Nov 28 02:48:29 localhost podman[31113]: 2025-11-28 07:48:29.680055426 +0000 UTC m=+0.725542657 container attach c938fc14255fc378430f225c5b853dd5df6dc5f431aed9f58bfbb35dfed312e6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_sammet, version=7, io.openshift.expose-services=, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., RELEASE=main, GIT_BRANCH=main, architecture=x86_64, name=rhceph, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=)
Nov 28 02:48:29 localhost determined_sammet[31128]: 167 167
Nov 28 02:48:29 localhost systemd[1]: libpod-c938fc14255fc378430f225c5b853dd5df6dc5f431aed9f58bfbb35dfed312e6.scope: Deactivated successfully.
Nov 28 02:48:29 localhost podman[31113]: 2025-11-28 07:48:29.681448671 +0000 UTC m=+0.726935882 container died c938fc14255fc378430f225c5b853dd5df6dc5f431aed9f58bfbb35dfed312e6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_sammet, RELEASE=main, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, architecture=x86_64, name=rhceph, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, version=7, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, GIT_BRANCH=main, CEPH_POINT_RELEASE=, release=553, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 02:48:29 localhost systemd[1]: var-lib-containers-storage-overlay-b8b7eae0f1e25cf32e37a90502876f4c83c1623cbe43e94f894b54f9e908f8a0-merged.mount: Deactivated successfully.
Nov 28 02:48:29 localhost podman[31133]: 2025-11-28 07:48:29.744519764 +0000 UTC m=+0.055042810 container remove c938fc14255fc378430f225c5b853dd5df6dc5f431aed9f58bfbb35dfed312e6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_sammet, RELEASE=main, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, com.redhat.component=rhceph-container, name=rhceph, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 02:48:29 localhost systemd[1]: libpod-conmon-c938fc14255fc378430f225c5b853dd5df6dc5f431aed9f58bfbb35dfed312e6.scope: Deactivated successfully.
Nov 28 02:48:30 localhost podman[31162]:
Nov 28 02:48:30 localhost podman[31162]: 2025-11-28 07:48:30.014864763 +0000 UTC m=+0.060567619 container create c444739f84a73336309c7822485df59bc49bd77b5e9257ba6d2521cf4495a3d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate-test, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, RELEASE=main, com.redhat.component=rhceph-container, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, name=rhceph, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vcs-type=git, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, distribution-scope=public, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, ceph=True, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc.)
Nov 28 02:48:30 localhost systemd[1]: Started libpod-conmon-c444739f84a73336309c7822485df59bc49bd77b5e9257ba6d2521cf4495a3d9.scope.
Nov 28 02:48:30 localhost systemd[1]: Started libcrun container.
Nov 28 02:48:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7810b54db4be30bf4154937efab47e01189370daaf687a20db001750d36a7744/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 02:48:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7810b54db4be30bf4154937efab47e01189370daaf687a20db001750d36a7744/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 02:48:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7810b54db4be30bf4154937efab47e01189370daaf687a20db001750d36a7744/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 28 02:48:30 localhost podman[31162]: 2025-11-28 07:48:29.992563676 +0000 UTC m=+0.038266552 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 02:48:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7810b54db4be30bf4154937efab47e01189370daaf687a20db001750d36a7744/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 02:48:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7810b54db4be30bf4154937efab47e01189370daaf687a20db001750d36a7744/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Nov 28 02:48:30 localhost podman[31162]: 2025-11-28 07:48:30.108709298 +0000 UTC m=+0.154412154 container init c444739f84a73336309c7822485df59bc49bd77b5e9257ba6d2521cf4495a3d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate-test, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, GIT_BRANCH=main, name=rhceph, RELEASE=main, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, version=7, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.buildah.version=1.33.12, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=)
Nov 28 02:48:30 localhost podman[31162]: 2025-11-28 07:48:30.117666636 +0000 UTC m=+0.163369502 container start c444739f84a73336309c7822485df59bc49bd77b5e9257ba6d2521cf4495a3d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate-test, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, version=7, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, io.openshift.expose-services=, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, distribution-scope=public, ceph=True, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , vcs-type=git)
Nov 28 02:48:30 localhost podman[31162]: 2025-11-28 07:48:30.117900792 +0000 UTC m=+0.163603668 container attach c444739f84a73336309c7822485df59bc49bd77b5e9257ba6d2521cf4495a3d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate-test, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, distribution-scope=public, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, name=rhceph, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , ceph=True, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, release=553, description=Red Hat Ceph Storage 7)
Nov 28 02:48:30 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate-test[31177]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Nov 28 02:48:30 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate-test[31177]: [--no-systemd] [--no-tmpfs]
Nov 28 02:48:30 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate-test[31177]: ceph-volume activate: error: unrecognized arguments: --bad-option
Nov 28 02:48:30 localhost systemd[1]: libpod-c444739f84a73336309c7822485df59bc49bd77b5e9257ba6d2521cf4495a3d9.scope: Deactivated successfully.
Nov 28 02:48:30 localhost podman[31162]: 2025-11-28 07:48:30.385156493 +0000 UTC m=+0.430859429 container died c444739f84a73336309c7822485df59bc49bd77b5e9257ba6d2521cf4495a3d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate-test, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Guillaume Abrioux , name=rhceph, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_CLEAN=True, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, io.openshift.expose-services=, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 02:48:30 localhost podman[31182]: 2025-11-28 07:48:30.468147572 +0000 UTC m=+0.070457482 container remove c444739f84a73336309c7822485df59bc49bd77b5e9257ba6d2521cf4495a3d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate-test, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, version=7, architecture=x86_64, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, maintainer=Guillaume Abrioux , GIT_BRANCH=main, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, vcs-type=git)
Nov 28 02:48:30 localhost systemd-journald[618]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation.
Nov 28 02:48:30 localhost systemd-journald[618]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 28 02:48:30 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 02:48:30 localhost systemd[1]: libpod-conmon-c444739f84a73336309c7822485df59bc49bd77b5e9257ba6d2521cf4495a3d9.scope: Deactivated successfully.
Nov 28 02:48:30 localhost systemd[1]: var-lib-containers-storage-overlay-7810b54db4be30bf4154937efab47e01189370daaf687a20db001750d36a7744-merged.mount: Deactivated successfully.
Nov 28 02:48:30 localhost systemd[1]: Reloading.
Nov 28 02:48:30 localhost systemd-rc-local-generator[31238]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 02:48:30 localhost systemd-sysv-generator[31241]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 02:48:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 02:48:31 localhost systemd[1]: Reloading.
Nov 28 02:48:31 localhost systemd-rc-local-generator[31275]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 02:48:31 localhost systemd-sysv-generator[31282]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 02:48:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 02:48:31 localhost systemd[1]: Starting Ceph osd.2 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1...
Nov 28 02:48:31 localhost podman[31340]:
Nov 28 02:48:31 localhost podman[31340]: 2025-11-28 07:48:31.650900596 +0000 UTC m=+0.076143186 container create efb74051641f9204a30b515a812649c53e6520e5112241529648a49895c7bedd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.buildah.version=1.33.12, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, ceph=True, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, architecture=x86_64, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux )
Nov 28 02:48:31 localhost systemd[1]: tmp-crun.XlD2MG.mount: Deactivated successfully.
Nov 28 02:48:31 localhost systemd[1]: Started libcrun container.
Nov 28 02:48:31 localhost podman[31340]: 2025-11-28 07:48:31.619866057 +0000 UTC m=+0.045108637 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 02:48:31 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9a6cf4bc8f2f92907c473d7bce1a628783db896d4e848a956efe2f4a59b43874/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 02:48:31 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9a6cf4bc8f2f92907c473d7bce1a628783db896d4e848a956efe2f4a59b43874/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 28 02:48:31 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9a6cf4bc8f2f92907c473d7bce1a628783db896d4e848a956efe2f4a59b43874/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 02:48:31 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9a6cf4bc8f2f92907c473d7bce1a628783db896d4e848a956efe2f4a59b43874/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 02:48:31 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9a6cf4bc8f2f92907c473d7bce1a628783db896d4e848a956efe2f4a59b43874/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Nov 28 02:48:31 localhost podman[31340]: 2025-11-28 07:48:31.759328251 +0000 UTC m=+0.184570841 container init efb74051641f9204a30b515a812649c53e6520e5112241529648a49895c7bedd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, vcs-type=git, name=rhceph, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, version=7, RELEASE=main, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, release=553, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, GIT_BRANCH=main)
Nov 28 02:48:31 localhost systemd[1]: tmp-crun.5XYLON.mount: Deactivated successfully.
Nov 28 02:48:31 localhost podman[31340]: 2025-11-28 07:48:31.769355776 +0000 UTC m=+0.194598326 container start efb74051641f9204a30b515a812649c53e6520e5112241529648a49895c7bedd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, version=7, GIT_BRANCH=main, release=553, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, maintainer=Guillaume Abrioux , io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc.) 
Nov 28 02:48:31 localhost podman[31340]: 2025-11-28 07:48:31.769597282 +0000 UTC m=+0.194839922 container attach efb74051641f9204a30b515a812649c53e6520e5112241529648a49895c7bedd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, version=7, com.redhat.component=rhceph-container, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, GIT_CLEAN=True, release=553, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph)
Nov 28 02:48:32 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate[31354]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Nov 28 02:48:32 localhost bash[31340]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Nov 28 02:48:32 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate[31354]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-2 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Nov 28 02:48:32 localhost bash[31340]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-2 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Nov 28 02:48:32 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate[31354]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Nov 28 02:48:32 localhost bash[31340]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Nov 28 02:48:32 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate[31354]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 28 02:48:32 localhost bash[31340]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 28 02:48:32 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate[31354]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Nov 28 02:48:32 localhost bash[31340]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Nov 28 02:48:32 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate[31354]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Nov 28 02:48:32 localhost bash[31340]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Nov 28 02:48:32 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate[31354]: --> ceph-volume raw activate successful for osd ID: 2
Nov 28 02:48:32 localhost bash[31340]: --> ceph-volume raw activate successful for osd ID: 2
Nov 28 02:48:32 localhost systemd[1]: libpod-efb74051641f9204a30b515a812649c53e6520e5112241529648a49895c7bedd.scope: Deactivated successfully.
Nov 28 02:48:32 localhost podman[31340]: 2025-11-28 07:48:32.473802396 +0000 UTC m=+0.899045006 container died efb74051641f9204a30b515a812649c53e6520e5112241529648a49895c7bedd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, architecture=x86_64, maintainer=Guillaume Abrioux , ceph=True, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, RELEASE=main, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, version=7, name=rhceph, release=553, io.openshift.expose-services=)
Nov 28 02:48:32 localhost podman[31479]: 2025-11-28 07:48:32.574404833 +0000 UTC m=+0.085153265 container remove efb74051641f9204a30b515a812649c53e6520e5112241529648a49895c7bedd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2-activate, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.openshift.expose-services=, name=rhceph, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_CLEAN=True, version=7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, release=553, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 02:48:32 localhost systemd[1]: var-lib-containers-storage-overlay-9a6cf4bc8f2f92907c473d7bce1a628783db896d4e848a956efe2f4a59b43874-merged.mount: Deactivated successfully.
Nov 28 02:48:32 localhost podman[31539]:
Nov 28 02:48:32 localhost podman[31539]: 2025-11-28 07:48:32.951047323 +0000 UTC m=+0.117148737 container create 6cb78ea5787c09a22761cdff13c64234dc370dfb29fad754315b7ad61b1065f9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, description=Red Hat Ceph Storage 7, GIT_CLEAN=True,
CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, architecture=x86_64) Nov 28 02:48:33 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0de357ab3f32be41d06d275ed8a715c5e823c6c40791035f46e8d10ce58ea2e/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:33 localhost podman[31539]: 2025-11-28 07:48:32.917388428 +0000 UTC m=+0.083489912 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 02:48:33 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0de357ab3f32be41d06d275ed8a715c5e823c6c40791035f46e8d10ce58ea2e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:33 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0de357ab3f32be41d06d275ed8a715c5e823c6c40791035f46e8d10ce58ea2e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:33 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0de357ab3f32be41d06d275ed8a715c5e823c6c40791035f46e8d10ce58ea2e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:33 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0de357ab3f32be41d06d275ed8a715c5e823c6c40791035f46e8d10ce58ea2e/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:33 localhost podman[31539]: 2025-11-28 07:48:33.070126459 +0000 UTC m=+0.236227883 container init 6cb78ea5787c09a22761cdff13c64234dc370dfb29fad754315b7ad61b1065f9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., 
com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, ceph=True, GIT_BRANCH=main, RELEASE=main, CEPH_POINT_RELEASE=, distribution-scope=public, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7) Nov 28 02:48:33 localhost podman[31539]: 2025-11-28 07:48:33.07840634 +0000 UTC m=+0.244507774 container start 6cb78ea5787c09a22761cdff13c64234dc370dfb29fad754315b7ad61b1065f9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2, version=7, io.openshift.expose-services=, architecture=x86_64, build-date=2025-09-24T08:57:55, RELEASE=main, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, name=rhceph, GIT_CLEAN=True, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, GIT_BRANCH=main, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 
7, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 28 02:48:33 localhost bash[31539]: 6cb78ea5787c09a22761cdff13c64234dc370dfb29fad754315b7ad61b1065f9 Nov 28 02:48:33 localhost systemd[1]: Started Ceph osd.2 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1. Nov 28 02:48:33 localhost ceph-osd[31557]: set uid:gid to 167:167 (ceph:ceph) Nov 28 02:48:33 localhost ceph-osd[31557]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-osd, pid 2 Nov 28 02:48:33 localhost ceph-osd[31557]: pidfile_write: ignore empty --pid-file Nov 28 02:48:33 localhost ceph-osd[31557]: bdev(0x55ff50e32e00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block Nov 28 02:48:33 localhost ceph-osd[31557]: bdev(0x55ff50e32e00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument Nov 28 02:48:33 localhost ceph-osd[31557]: bdev(0x55ff50e32e00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Nov 28 02:48:33 localhost ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Nov 28 02:48:33 localhost ceph-osd[31557]: bdev(0x55ff50e33180 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block Nov 28 02:48:33 localhost ceph-osd[31557]: bdev(0x55ff50e33180 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument Nov 28 02:48:33 localhost ceph-osd[31557]: bdev(0x55ff50e33180 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Nov 28 02:48:33 localhost ceph-osd[31557]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB Nov 28 02:48:33 localhost ceph-osd[31557]: 
bdev(0x55ff50e33180 /var/lib/ceph/osd/ceph-2/block) close Nov 28 02:48:33 localhost ceph-osd[31557]: bdev(0x55ff50e32e00 /var/lib/ceph/osd/ceph-2/block) close Nov 28 02:48:33 localhost ceph-osd[31557]: starting osd.2 osd_data /var/lib/ceph/osd/ceph-2 /var/lib/ceph/osd/ceph-2/journal Nov 28 02:48:33 localhost ceph-osd[31557]: load: jerasure load: lrc Nov 28 02:48:33 localhost ceph-osd[31557]: bdev(0x55ff50e32e00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block Nov 28 02:48:33 localhost ceph-osd[31557]: bdev(0x55ff50e32e00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument Nov 28 02:48:33 localhost ceph-osd[31557]: bdev(0x55ff50e32e00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Nov 28 02:48:33 localhost ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Nov 28 02:48:33 localhost ceph-osd[31557]: bdev(0x55ff50e32e00 /var/lib/ceph/osd/ceph-2/block) close Nov 28 02:48:33 localhost podman[31648]: Nov 28 02:48:33 localhost podman[31648]: 2025-11-28 07:48:33.937098999 +0000 UTC m=+0.066684025 container create 91c281b2d0e97b15e60aeadecd2a9917d2cb03f5eab7a642c551444db475f5b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_ganguly, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_BRANCH=main, CEPH_POINT_RELEASE=, ceph=True, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, distribution-scope=public, com.redhat.component=rhceph-container, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully 
featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, release=553, architecture=x86_64, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 28 02:48:33 localhost ceph-osd[31557]: bdev(0x55ff50e32e00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block Nov 28 02:48:33 localhost ceph-osd[31557]: bdev(0x55ff50e32e00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument Nov 28 02:48:33 localhost ceph-osd[31557]: bdev(0x55ff50e32e00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Nov 28 02:48:33 localhost ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Nov 28 02:48:33 localhost ceph-osd[31557]: bdev(0x55ff50e32e00 /var/lib/ceph/osd/ceph-2/block) close Nov 28 02:48:33 localhost systemd[1]: Started libpod-conmon-91c281b2d0e97b15e60aeadecd2a9917d2cb03f5eab7a642c551444db475f5b0.scope. Nov 28 02:48:33 localhost systemd[1]: Started libcrun container. 
Nov 28 02:48:34 localhost podman[31648]: 2025-11-28 07:48:33.907684952 +0000 UTC m=+0.037269978 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 02:48:34 localhost podman[31648]: 2025-11-28 07:48:34.013653195 +0000 UTC m=+0.143238231 container init 91c281b2d0e97b15e60aeadecd2a9917d2cb03f5eab7a642c551444db475f5b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_ganguly, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, architecture=x86_64, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, release=553, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, maintainer=Guillaume Abrioux , GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, distribution-scope=public, name=rhceph) Nov 28 02:48:34 localhost podman[31648]: 2025-11-28 07:48:34.022358956 +0000 UTC m=+0.151943992 container start 91c281b2d0e97b15e60aeadecd2a9917d2cb03f5eab7a642c551444db475f5b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_ganguly, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, summary=Provides the latest 
Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, io.buildah.version=1.33.12, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , vcs-type=git, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, RELEASE=main, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 28 02:48:34 localhost podman[31648]: 2025-11-28 07:48:34.022562431 +0000 UTC m=+0.152147467 container attach 91c281b2d0e97b15e60aeadecd2a9917d2cb03f5eab7a642c551444db475f5b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_ganguly, io.openshift.expose-services=, ceph=True, name=rhceph, architecture=x86_64, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , GIT_CLEAN=True, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vcs-type=git, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc.) 
Nov 28 02:48:34 localhost relaxed_ganguly[31668]: 167 167 Nov 28 02:48:34 localhost systemd[1]: libpod-91c281b2d0e97b15e60aeadecd2a9917d2cb03f5eab7a642c551444db475f5b0.scope: Deactivated successfully. Nov 28 02:48:34 localhost podman[31648]: 2025-11-28 07:48:34.026599633 +0000 UTC m=+0.156184689 container died 91c281b2d0e97b15e60aeadecd2a9917d2cb03f5eab7a642c551444db475f5b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_ganguly, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.expose-services=, name=rhceph, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, distribution-scope=public, release=553, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , architecture=x86_64, ceph=True, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7) Nov 28 02:48:34 localhost podman[31673]: 2025-11-28 07:48:34.127160838 +0000 UTC m=+0.090467639 container remove 91c281b2d0e97b15e60aeadecd2a9917d2cb03f5eab7a642c551444db475f5b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_ganguly, io.buildah.version=1.33.12, version=7, distribution-scope=public, io.openshift.expose-services=, name=rhceph, CEPH_POINT_RELEASE=, ceph=True, vcs-type=git, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 
7 on RHEL 9 in a fully featured and supported base image., release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 28 02:48:34 localhost systemd[1]: libpod-conmon-91c281b2d0e97b15e60aeadecd2a9917d2cb03f5eab7a642c551444db475f5b0.scope: Deactivated successfully. Nov 28 02:48:34 localhost ceph-osd[31557]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second Nov 28 02:48:34 localhost ceph-osd[31557]: osd.2:0.OSDShard using op scheduler mclock_scheduler, cutoff=196 Nov 28 02:48:34 localhost ceph-osd[31557]: bdev(0x55ff50e32e00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block Nov 28 02:48:34 localhost ceph-osd[31557]: bdev(0x55ff50e32e00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument Nov 28 02:48:34 localhost ceph-osd[31557]: bdev(0x55ff50e32e00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Nov 28 02:48:34 localhost ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Nov 28 02:48:34 localhost ceph-osd[31557]: bdev(0x55ff50e33180 /var/lib/ceph/osd/ceph-2/block) open path 
/var/lib/ceph/osd/ceph-2/block Nov 28 02:48:34 localhost ceph-osd[31557]: bdev(0x55ff50e33180 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument Nov 28 02:48:34 localhost ceph-osd[31557]: bdev(0x55ff50e33180 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Nov 28 02:48:34 localhost ceph-osd[31557]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB Nov 28 02:48:34 localhost ceph-osd[31557]: bluefs mount Nov 28 02:48:34 localhost ceph-osd[31557]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000 Nov 28 02:48:34 localhost ceph-osd[31557]: bluefs mount shared_bdev_used = 0 Nov 28 02:48:34 localhost ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: RocksDB version: 7.9.2 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Git sha 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Compile date 2025-09-23 00:00:00 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: DB SUMMARY Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: DB Session ID: QQ88DS7L6ZLLRUDA5NZ1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: CURRENT file: CURRENT Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: IDENTITY file: IDENTITY Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: MANIFEST file: MANIFEST-000032 size: 1007 Bytes Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: SST files in db.slow dir, Total Num: 0, files: Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.error_if_exists: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: 
rocksdb: Options.create_if_missing: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.paranoid_checks: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.flush_verify_memtable_count: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.env: 0x55ff510c6bd0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.fs: LegacyFileSystem Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.info_log: 0x55ff51dba400 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_file_opening_threads: 16 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.statistics: (nil) Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.use_fsync: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_log_file_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_manifest_file_size: 1073741824 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.log_file_time_to_roll: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.keep_log_file_num: 1000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.recycle_log_file_num: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.allow_fallocate: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.allow_mmap_reads: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.allow_mmap_writes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.use_direct_reads: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.create_missing_column_families: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.db_log_dir: Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.wal_dir: db.wal Nov 28 02:48:34 
localhost ceph-osd[31557]: rocksdb: Options.table_cache_numshardbits: 6 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.WAL_ttl_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.WAL_size_limit_MB: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.manifest_preallocation_size: 4194304 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.is_fd_close_on_exec: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.advise_random_on_open: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.db_write_buffer_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.write_buffer_manager: 0x55ff50e1c140 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.access_hint_on_compaction_start: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.random_access_max_buffer_size: 1048576 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.use_adaptive_mutex: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.rate_limiter: (nil) Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.wal_recovery_mode: 2 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_thread_tracking: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_pipelined_write: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.unordered_write: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.allow_concurrent_memtable_write: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.write_thread_max_yield_usec: 100 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.write_thread_slow_yield_usec: 3 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: 
Options.row_cache: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.wal_filter: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.avoid_flush_during_recovery: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.allow_ingest_behind: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.two_write_queues: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.manual_wal_flush: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.wal_compression: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.atomic_flush: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.persist_stats_to_disk: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.write_dbid_to_manifest: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.log_readahead_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.file_checksum_gen_factory: Unknown Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.best_efforts_recovery: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.allow_data_in_errors: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.db_host_id: __hostname__ Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enforce_single_del_contracts: true Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_background_jobs: 4 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_background_compactions: -1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_subcompactions: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.avoid_flush_during_shutdown: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: 
Options.writable_file_max_buffer_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.delayed_write_rate : 16777216 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_total_wal_size: 1073741824 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.stats_dump_period_sec: 600 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.stats_persist_period_sec: 600 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.stats_history_buffer_size: 1048576 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_open_files: -1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bytes_per_sync: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.wal_bytes_per_sync: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.strict_bytes_per_sync: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_readahead_size: 2097152 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_background_flushes: -1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Compression algorithms supported: Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: #011kZSTD supported: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: #011kXpressCompression supported: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: #011kBZip2Compression supported: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: #011kLZ4Compression supported: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: #011kZlibCompression supported: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: #011kLZ4HCCompression supported: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: #011kSnappyCompression supported: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Fast CRC32 supported: Supported on x86 Nov 28 02:48:34 
localhost ceph-osd[31557]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.merge_operator: .T:int64_array.b:bitwise_xor
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter_factory: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.sst_partitioner_factory: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_factory: BlockBasedTable
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51dba5c0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55ff50e0a850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.write_buffer_size: 16777216
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number: 64
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression: LZ4
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression: Disabled
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.prefix_extractor: nullptr
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.num_levels: 7
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.window_bits: -14
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.level: 32767
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.strategy: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.enabled: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_base: 67108864
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_multiplier: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.arena_block_size: 1048576
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.disable_auto_compactions: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_properties_collectors:
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_support: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_huge_page_size: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bloom_locality: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_successive_merges: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.paranoid_file_checks: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.force_consistency_checks: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.report_bg_io_stats: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ttl: 2592000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_files: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_blob_size: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_size: 268435456
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compression_type: NoCompression
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_starting_level: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.merge_operator: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter_factory: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.sst_partitioner_factory: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_factory: BlockBasedTable
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51dba5c0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55ff50e0a850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.write_buffer_size: 16777216
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number: 64
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression: LZ4
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression: Disabled
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.prefix_extractor: nullptr
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.num_levels: 7
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.window_bits: -14
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.level: 32767
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.strategy: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.enabled: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_base: 67108864
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_multiplier: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.arena_block_size: 1048576
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.disable_auto_compactions: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_support: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_huge_page_size: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bloom_locality: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_successive_merges: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.paranoid_file_checks: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.force_consistency_checks: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.report_bg_io_stats: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ttl: 2592000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_files: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_blob_size: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_size: 268435456
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compression_type: NoCompression
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_starting_level: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.merge_operator: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter_factory: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.sst_partitioner_factory: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_factory: BlockBasedTable
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51dba5c0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55ff50e0a850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.write_buffer_size: 16777216
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number: 64
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression: LZ4
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression: Disabled
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.prefix_extractor: nullptr
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.num_levels: 7
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.window_bits: -14
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.level: 32767
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.strategy: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.enabled: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_base: 67108864
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_multiplier: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.arena_block_size: 1048576
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.disable_auto_compactions: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_support: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_huge_page_size: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bloom_locality: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_successive_merges: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.paranoid_file_checks: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.force_consistency_checks: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.report_bg_io_stats: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ttl: 2592000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_files: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_blob_size: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_size: 268435456
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compression_type: NoCompression
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_starting_level: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.merge_operator: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter_factory: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.sst_partitioner_factory: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_factory: BlockBasedTable
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51dba5c0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55ff50e0a850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.write_buffer_size: 16777216
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number: 64
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression: LZ4
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression: Disabled
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.prefix_extractor: nullptr
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.num_levels: 7
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.window_bits: -14
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.level: 32767
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.strategy: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.enabled: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_base: 67108864
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_multiplier: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.arena_block_size: 1048576
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.disable_auto_compactions: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 02:48:34 localhost
ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_support: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_huge_page_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bloom_locality: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_successive_merges: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.paranoid_file_checks: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.force_consistency_checks: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.report_bg_io_stats: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ttl: 2592000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: 
Options.periodic_compaction_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_files: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_blob_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_size: 268435456 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compression_type: NoCompression Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_garbage_collection: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_starting_level: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0) Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]: Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.merge_operator: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter_factory: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.sst_partitioner_factory: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_factory: SkipListFactory Nov 28 02:48:34 
localhost ceph-osd[31557]: rocksdb: Options.table_factory: BlockBasedTable Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51dba5c0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55ff50e0a850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.write_buffer_size: 16777216 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number: 64 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression: LZ4 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression: Disabled Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.prefix_extractor: nullptr Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.num_levels: 7 Nov 28 02:48:34 localhost 
ceph-osd[31557]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.window_bits: -14 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.level: 32767 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.strategy: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.enabled: false Nov 28 
02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_base: 67108864 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_multiplier: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.arena_block_size: 1048576 Nov 28 02:48:34 localhost 
ceph-osd[31557]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.disable_auto_compactions: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_support: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 28 02:48:34 localhost 
ceph-osd[31557]: rocksdb: Options.memtable_huge_page_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bloom_locality: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_successive_merges: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.paranoid_file_checks: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.force_consistency_checks: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.report_bg_io_stats: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ttl: 2592000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_files: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_blob_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_size: 268435456 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compression_type: NoCompression Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_garbage_collection: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_starting_level: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, 
name: p-1) Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]: Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.merge_operator: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter_factory: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.sst_partitioner_factory: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_factory: SkipListFactory Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_factory: BlockBasedTable Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51dba5c0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55ff50e0a850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 
initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.write_buffer_size: 16777216 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number: 64 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression: LZ4 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression: Disabled Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.prefix_extractor: nullptr Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.num_levels: 7 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:34 localhost ceph-osd[31557]: 
rocksdb: Options.compression_opts.window_bits: -14 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.level: 32767 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.strategy: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.enabled: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_base: 67108864 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_multiplier: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.arena_block_size: 1048576 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.disable_auto_compactions: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: 
Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_support: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_huge_page_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bloom_locality: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_successive_merges: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.paranoid_file_checks: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.force_consistency_checks: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.report_bg_io_stats: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ttl: 2592000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_files: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_blob_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_size: 268435456 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: 
Options.blob_compression_type: NoCompression Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_garbage_collection: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_starting_level: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2) Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]: Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.merge_operator: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter_factory: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.sst_partitioner_factory: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_factory: SkipListFactory Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_factory: BlockBasedTable Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51dba5c0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 
checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55ff50e0a850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.write_buffer_size: 16777216 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number: 64 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression: LZ4 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression: Disabled Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.prefix_extractor: nullptr Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.num_levels: 7 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: 
Options.bottommost_compression_opts.strategy: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.window_bits: -14
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.level: 32767
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.strategy: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.enabled: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_base: 67108864
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_multiplier: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.arena_block_size: 1048576
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.disable_auto_compactions: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_support: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_huge_page_size: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bloom_locality: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_successive_merges: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.paranoid_file_checks: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.force_consistency_checks: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.report_bg_io_stats: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ttl: 2592000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_files: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_blob_size: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_size: 268435456
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compression_type: NoCompression
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_starting_level: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.merge_operator: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter_factory: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.sst_partitioner_factory: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_factory: BlockBasedTable
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51dba7e0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55ff50e0a2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.write_buffer_size: 16777216
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number: 64
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression: LZ4
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression: Disabled
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.prefix_extractor: nullptr
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.num_levels: 7
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.window_bits: -14
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.level: 32767
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.strategy: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.enabled: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_base: 67108864
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_multiplier: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.arena_block_size: 1048576
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.disable_auto_compactions: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_support: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_huge_page_size: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bloom_locality: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_successive_merges: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.paranoid_file_checks: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.force_consistency_checks: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.report_bg_io_stats: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ttl: 2592000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_files: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_blob_size: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_size: 268435456
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compression_type: NoCompression
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_starting_level: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.merge_operator: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter_factory: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.sst_partitioner_factory: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_factory: BlockBasedTable
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51dba7e0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55ff50e0a2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.write_buffer_size: 16777216
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number: 64
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression: LZ4
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression: Disabled
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.prefix_extractor: nullptr
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.num_levels: 7
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.window_bits: -14
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.level: 32767
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.strategy: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.enabled: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_base: 67108864
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_multiplier: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.arena_block_size: 1048576
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.disable_auto_compactions: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_support: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_huge_page_size: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bloom_locality: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_successive_merges: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.paranoid_file_checks: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.force_consistency_checks: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.report_bg_io_stats: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ttl: 2592000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_files: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_blob_size: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_size: 268435456
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compression_type: NoCompression
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_starting_level: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.merge_operator: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter_factory: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.sst_partitioner_factory: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_factory: BlockBasedTable
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51dba7e0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55ff50e0a2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.write_buffer_size: 16777216
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number: 64
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression: LZ4
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression: Disabled
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.prefix_extractor: nullptr
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.num_levels: 7
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.window_bits: -14
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.level: 32767
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.strategy: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.enabled: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_base: 67108864
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_multiplier: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.arena_block_size: 1048576
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.disable_auto_compactions: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_support: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_huge_page_size: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bloom_locality: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_successive_merges: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.paranoid_file_checks: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.force_consistency_checks: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.report_bg_io_stats: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ttl: 2592000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_files: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_blob_size: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_size: 268435456
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compression_type: NoCompression
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_starting_level: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11,
name: P) Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 525bbb97-839b-44e9-be72-9b5ac24ac615 
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316114267517, "job": 1, "event": "recovery_started", "wal_files": [31]} Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316114267809, "job": 1, "event": "recovery_finished"} Nov 28 02:48:34 localhost ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 Nov 28 02:48:34 localhost ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old nid_max 1025 Nov 28 02:48:34 localhost ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old blobid_max 10240 Nov 28 02:48:34 localhost ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta ondisk_format 4 compat_ondisk_format 3 Nov 28 02:48:34 localhost ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta min_alloc_size 0x1000 Nov 28 02:48:34 localhost ceph-osd[31557]: freelist init Nov 28 02:48:34 localhost ceph-osd[31557]: freelist _read_cfg Nov 28 02:48:34 localhost ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete Nov 28 02:48:34 localhost ceph-osd[31557]: bluefs 
umount Nov 28 02:48:34 localhost ceph-osd[31557]: bdev(0x55ff50e33180 /var/lib/ceph/osd/ceph-2/block) close Nov 28 02:48:34 localhost podman[31897]: Nov 28 02:48:34 localhost podman[31897]: 2025-11-28 07:48:34.483484343 +0000 UTC m=+0.087869984 container create 3cc0785e7c356c27f22f4de7b8ec16938d24fb01111693f2bb5ee88b61ccf62a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate-test, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, architecture=x86_64, build-date=2025-09-24T08:57:55, ceph=True, name=rhceph, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, GIT_CLEAN=True) Nov 28 02:48:34 localhost ceph-osd[31557]: bdev(0x55ff50e33180 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block Nov 28 02:48:34 localhost ceph-osd[31557]: bdev(0x55ff50e33180 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument Nov 28 02:48:34 localhost ceph-osd[31557]: bdev(0x55ff50e33180 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Nov 28 
02:48:34 localhost ceph-osd[31557]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB Nov 28 02:48:34 localhost ceph-osd[31557]: bluefs mount Nov 28 02:48:34 localhost ceph-osd[31557]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000 Nov 28 02:48:34 localhost ceph-osd[31557]: bluefs mount shared_bdev_used = 4718592 Nov 28 02:48:34 localhost ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: RocksDB version: 7.9.2 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Git sha 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Compile date 2025-09-23 00:00:00 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: DB SUMMARY Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: DB Session ID: QQ88DS7L6ZLLRUDA5NZ0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: CURRENT file: CURRENT Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: IDENTITY file: IDENTITY Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: MANIFEST file: MANIFEST-000032 size: 1007 Bytes Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: SST files in db.slow dir, Total Num: 0, files: Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.error_if_exists: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.create_if_missing: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.paranoid_checks: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.flush_verify_memtable_count: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 Nov 28 02:48:34 
localhost ceph-osd[31557]: rocksdb: Options.env: 0x55ff510c7c00 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.fs: LegacyFileSystem Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.info_log: 0x55ff51f6e820 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_file_opening_threads: 16 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.statistics: (nil) Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.use_fsync: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_log_file_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_manifest_file_size: 1073741824 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.log_file_time_to_roll: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.keep_log_file_num: 1000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.recycle_log_file_num: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.allow_fallocate: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.allow_mmap_reads: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.allow_mmap_writes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.use_direct_reads: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.create_missing_column_families: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.db_log_dir: Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.wal_dir: db.wal Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_cache_numshardbits: 6 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.WAL_ttl_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.WAL_size_limit_MB: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: 
Options.manifest_preallocation_size: 4194304 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.is_fd_close_on_exec: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.advise_random_on_open: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.db_write_buffer_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.write_buffer_manager: 0x55ff50e1d540 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.access_hint_on_compaction_start: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.random_access_max_buffer_size: 1048576 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.use_adaptive_mutex: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.rate_limiter: (nil) Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.wal_recovery_mode: 2 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_thread_tracking: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_pipelined_write: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.unordered_write: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.allow_concurrent_memtable_write: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.write_thread_max_yield_usec: 100 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.write_thread_slow_yield_usec: 3 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.row_cache: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.wal_filter: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.avoid_flush_during_recovery: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.allow_ingest_behind: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.two_write_queues: 0 Nov 28 02:48:34 localhost 
ceph-osd[31557]: rocksdb: Options.manual_wal_flush: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.wal_compression: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.atomic_flush: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.persist_stats_to_disk: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.write_dbid_to_manifest: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.log_readahead_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.file_checksum_gen_factory: Unknown Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.best_efforts_recovery: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.allow_data_in_errors: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.db_host_id: __hostname__ Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enforce_single_del_contracts: true Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_background_jobs: 4 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_background_compactions: -1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_subcompactions: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.avoid_flush_during_shutdown: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.writable_file_max_buffer_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.delayed_write_rate : 16777216 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_total_wal_size: 1073741824 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: 
Options.stats_dump_period_sec: 600 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.stats_persist_period_sec: 600 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.stats_history_buffer_size: 1048576 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_open_files: -1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bytes_per_sync: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.wal_bytes_per_sync: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.strict_bytes_per_sync: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_readahead_size: 2097152 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_background_flushes: -1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Compression algorithms supported: Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: #011kZSTD supported: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: #011kXpressCompression supported: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: #011kBZip2Compression supported: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: #011kLZ4Compression supported: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: #011kZlibCompression supported: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: #011kLZ4HCCompression supported: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: #011kSnappyCompression supported: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Fast CRC32 supported: Supported on x86 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: DMutex implementation: pthread_mutex_t Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default) Nov 28 02:48:34 localhost 
ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.merge_operator: .T:int64_array.b:bitwise_xor Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter_factory: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.sst_partitioner_factory: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_factory: SkipListFactory Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_factory: BlockBasedTable Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51f6ea40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55ff50e0b610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 
8192#012 num_file_reads_for_auto_readahead: 2 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.write_buffer_size: 16777216 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number: 64 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression: LZ4 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression: Disabled Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.prefix_extractor: nullptr Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.num_levels: 7 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: 
Options.compression_opts.window_bits: -14 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.level: 32767 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.strategy: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.enabled: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_base: 67108864 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_multiplier: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 28 
02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.arena_block_size: 1048576 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.disable_auto_compactions: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 
1073741824 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_properties_collectors: Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_support: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_huge_page_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bloom_locality: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_successive_merges: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.paranoid_file_checks: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.force_consistency_checks: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.report_bg_io_stats: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ttl: 2592000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_files: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_blob_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_size: 268435456 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compression_type: NoCompression Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_garbage_collection: false Nov 28 02:48:34 localhost 
ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_starting_level: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0) Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]: Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.merge_operator: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter_factory: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.sst_partitioner_factory: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_factory: SkipListFactory Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_factory: BlockBasedTable Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51f6ea40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55ff50e0b610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 
4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.write_buffer_size: 16777216
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number: 64
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression: LZ4
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression: Disabled
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.prefix_extractor: nullptr
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.num_levels: 7
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.window_bits: -14
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.level: 32767
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.strategy: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.enabled: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_base: 67108864
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_multiplier: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.arena_block_size: 1048576
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.disable_auto_compactions: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_support: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_huge_page_size: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bloom_locality: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_successive_merges: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.paranoid_file_checks: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.force_consistency_checks: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.report_bg_io_stats: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ttl: 2592000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_files: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_blob_size: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_size: 268435456
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compression_type: NoCompression
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_starting_level: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.merge_operator: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter_factory: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.sst_partitioner_factory: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_factory: BlockBasedTable
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51f6ea40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55ff50e0b610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.write_buffer_size: 16777216
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number: 64
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression: LZ4
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression: Disabled
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.prefix_extractor: nullptr
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.num_levels: 7
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.window_bits: -14
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.level: 32767
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.strategy: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.enabled: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_base: 67108864
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_multiplier: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.arena_block_size: 1048576
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.disable_auto_compactions: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 02:48:34 localhost systemd[1]: Started libpod-conmon-3cc0785e7c356c27f22f4de7b8ec16938d24fb01111693f2bb5ee88b61ccf62a.scope.
Nov 28 02:48:34 localhost podman[31897]: 2025-11-28 07:48:34.447608392 +0000 UTC m=+0.051994033 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_support: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_huge_page_size: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bloom_locality: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_successive_merges: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.paranoid_file_checks: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.force_consistency_checks: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.report_bg_io_stats: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ttl: 2592000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_files: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_blob_size: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_size: 268435456
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compression_type: NoCompression
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_starting_level: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.merge_operator: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter_factory: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.sst_partitioner_factory: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 28 02:48:34 localhost systemd[1]: Started libcrun container.
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_factory: BlockBasedTable
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51f6ea40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55ff50e0b610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.write_buffer_size: 16777216
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number: 64
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression: LZ4
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression: Disabled
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.prefix_extractor: nullptr
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.num_levels: 7
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.window_bits: -14
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.level: 32767
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.strategy: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.enabled: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_base: 67108864
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_multiplier: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.arena_block_size: 1048576
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.disable_auto_compactions: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_support: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_huge_page_size: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bloom_locality: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_successive_merges: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.paranoid_file_checks: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.force_consistency_checks: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.report_bg_io_stats: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ttl: 2592000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_files: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_blob_size: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_size: 268435456
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compression_type: NoCompression
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_starting_level: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.merge_operator: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter_factory: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.sst_partitioner_factory: None
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_factory: BlockBasedTable
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51f6ea40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55ff50e0b610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.write_buffer_size: 16777216
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number: 64
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression: LZ4
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression: Disabled
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.prefix_extractor: nullptr
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.num_levels: 7
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.window_bits: -14
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.level: 32767
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.strategy: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.enabled: false
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_base: 67108864
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_multiplier: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.arena_block_size: 1048576 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.disable_auto_compactions: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: 
Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_support: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_huge_page_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bloom_locality: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_successive_merges: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.paranoid_file_checks: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.force_consistency_checks: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.report_bg_io_stats: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ttl: 2592000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_files: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_blob_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_size: 268435456 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: 
Options.blob_compression_type: NoCompression Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_garbage_collection: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_starting_level: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1) Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]: Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.merge_operator: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter_factory: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.sst_partitioner_factory: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_factory: SkipListFactory Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_factory: BlockBasedTable Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51f6ea40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 
checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55ff50e0b610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.write_buffer_size: 16777216 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number: 64 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression: LZ4 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression: Disabled Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.prefix_extractor: nullptr Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.num_levels: 7 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: 
Options.bottommost_compression_opts.strategy: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.window_bits: -14 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.level: 32767 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.strategy: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.enabled: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_base: 67108864 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: 
Options.target_file_size_multiplier: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.arena_block_size: 1048576 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.disable_auto_compactions: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: 
Options.compaction_options_universal.size_ratio: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_support: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_huge_page_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bloom_locality: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_successive_merges: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.paranoid_file_checks: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.force_consistency_checks: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: 
Options.report_bg_io_stats: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ttl: 2592000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_files: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_blob_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_size: 268435456 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compression_type: NoCompression Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_garbage_collection: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_starting_level: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2) Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]: Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.merge_operator: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter_factory: None Nov 28 02:48:34 localhost ceph-osd[31557]: 
rocksdb: Options.sst_partitioner_factory: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_factory: SkipListFactory Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_factory: BlockBasedTable Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51f6ea40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55ff50e0b610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.write_buffer_size: 16777216 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number: 64 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression: LZ4 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression: Disabled Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.prefix_extractor: nullptr Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: 
Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.num_levels: 7 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.window_bits: -14 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.level: 32767 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.strategy: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:34 localhost 
ceph-osd[31557]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.enabled: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_base: 67108864 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_multiplier: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: 
Options.ignore_max_compaction_bytes_for_input: true Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.arena_block_size: 1048576 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.disable_auto_compactions: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_support: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: 
Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_huge_page_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bloom_locality: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_successive_merges: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.paranoid_file_checks: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.force_consistency_checks: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.report_bg_io_stats: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ttl: 2592000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_files: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_blob_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_size: 268435456 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compression_type: NoCompression Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_garbage_collection: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_starting_level: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: 
Options.experimental_mempurge_threshold: 0.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0) Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]: Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.merge_operator: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter_factory: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.sst_partitioner_factory: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_factory: SkipListFactory Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_factory: BlockBasedTable Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51dbab40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55ff50e0a2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 
read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.write_buffer_size: 16777216 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number: 64 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression: LZ4 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression: Disabled Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.prefix_extractor: nullptr Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.num_levels: 7 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: 
Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.window_bits: -14 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.level: 32767 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.strategy: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.enabled: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_base: 67108864 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_multiplier: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.arena_block_size: 1048576 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.disable_auto_compactions: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: 
Options.compaction_options_universal.compression_size_percent: -1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_support: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_huge_page_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bloom_locality: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_successive_merges: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.paranoid_file_checks: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.force_consistency_checks: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.report_bg_io_stats: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ttl: 2592000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: 
Options.enable_blob_files: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_blob_size: 0 Nov 28 02:48:34 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a4e74e855d90d20904e991bf78eea8c7ad44a3a513f664d58ca3a32b1e45158/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_size: 268435456 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compression_type: NoCompression Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_garbage_collection: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_starting_level: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1) Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]: Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.merge_operator: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter_factory: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.sst_partitioner_factory: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_factory: SkipListFactory Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: 
Options.table_factory: BlockBasedTable Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51dbab40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55ff50e0a2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.write_buffer_size: 16777216 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number: 64 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression: LZ4 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression: Disabled Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.prefix_extractor: nullptr Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.num_levels: 7 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: 
Options.min_write_buffer_number_to_merge: 6 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.window_bits: -14 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.level: 32767 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.strategy: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.enabled: false Nov 28 02:48:34 localhost 
ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_base: 67108864 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_multiplier: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.arena_block_size: 1048576 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: 
Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.disable_auto_compactions: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_support: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: 
Options.memtable_huge_page_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bloom_locality: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_successive_merges: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.paranoid_file_checks: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.force_consistency_checks: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.report_bg_io_stats: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ttl: 2592000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_files: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_blob_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_size: 268435456 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compression_type: NoCompression Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_garbage_collection: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_starting_level: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2) Nov 28 02:48:34 
localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]: Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.merge_operator: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_filter_factory: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.sst_partitioner_factory: None Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_factory: SkipListFactory Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_factory: BlockBasedTable Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55ff51dbab40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55ff50e0a2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 
num_file_reads_for_auto_readahead: 2 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.write_buffer_size: 16777216 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number: 64 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression: LZ4 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression: Disabled Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.prefix_extractor: nullptr Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.num_levels: 7 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: 
Options.compression_opts.window_bits: -14 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.level: 32767 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.strategy: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.enabled: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_base: 67108864 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.target_file_size_multiplier: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 28 
02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.arena_block_size: 1048576 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.disable_auto_compactions: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 
1073741824 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_support: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.memtable_huge_page_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.bloom_locality: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.max_successive_merges: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.paranoid_file_checks: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.force_consistency_checks: 1 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.report_bg_io_stats: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.ttl: 2592000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.enable_blob_files: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.min_blob_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_size: 268435456 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compression_type: NoCompression Nov 28 02:48:34 localhost 
ceph-osd[31557]: rocksdb: Options.enable_blob_garbage_collection: false Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.blob_file_starting_level: 0 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L) Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P) Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5 Nov 28 02:48:34 
localhost ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 525bbb97-839b-44e9-be72-9b5ac24ac615 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316114539056, "job": 1, "event": "recovery_started", "wal_files": [31]} Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316114544544, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, 
"filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764316114, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "525bbb97-839b-44e9-be72-9b5ac24ac615", "db_session_id": "QQ88DS7L6ZLLRUDA5NZ0", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}} Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316114548180, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1609, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764316114, 
"oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "525bbb97-839b-44e9-be72-9b5ac24ac615", "db_session_id": "QQ88DS7L6ZLLRUDA5NZ0", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}} Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316114552059, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764316114, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "525bbb97-839b-44e9-be72-9b5ac24ac615", "db_session_id": "QQ88DS7L6ZLLRUDA5NZ0", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}} Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: 
No such file or directory Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316114555822, "job": 1, "event": "recovery_finished"} Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/version_set.cc:5047] Creating manifest 40 Nov 28 02:48:34 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a4e74e855d90d20904e991bf78eea8c7ad44a3a513f664d58ca3a32b1e45158/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55ff51f82380 Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: DB pointer 0x55ff51d11a00 Nov 28 02:48:34 localhost ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 Nov 28 02:48:34 localhost ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super from 4, latest 4 Nov 28 02:48:34 localhost ceph-osd[31557]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super done Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 28 02:48:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.1 total, 0.1 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, 
ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.005 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.005 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.005 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.005 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 
level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 5.1e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, 
interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 5.1e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 
total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 460.80 MB usag Nov 28 02:48:34 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a4e74e855d90d20904e991bf78eea8c7ad44a3a513f664d58ca3a32b1e45158/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:34 localhost ceph-osd[31557]: /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs Nov 28 02:48:34 localhost ceph-osd[31557]: /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello Nov 28 02:48:34 localhost ceph-osd[31557]: _get_class not permitted to load lua Nov 28 02:48:34 localhost ceph-osd[31557]: _get_class not permitted to load sdk Nov 28 02:48:34 localhost ceph-osd[31557]: _get_class not permitted to load test_remote_reads Nov 28 02:48:34 localhost ceph-osd[31557]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for clients Nov 28 02:48:34 localhost ceph-osd[31557]: osd.2 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons Nov 28 02:48:34 localhost ceph-osd[31557]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for osds Nov 28 02:48:34 localhost ceph-osd[31557]: osd.2 0 
check_osdmap_features enabling on-disk ERASURE CODES compat feature Nov 28 02:48:34 localhost ceph-osd[31557]: osd.2 0 load_pgs Nov 28 02:48:34 localhost ceph-osd[31557]: osd.2 0 load_pgs opened 0 pgs Nov 28 02:48:34 localhost ceph-osd[31557]: osd.2 0 log_to_monitors true Nov 28 02:48:34 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2[31553]: 2025-11-28T07:48:34.613+0000 7fd094bd2a80 -1 osd.2 0 log_to_monitors true Nov 28 02:48:34 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a4e74e855d90d20904e991bf78eea8c7ad44a3a513f664d58ca3a32b1e45158/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:34 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a4e74e855d90d20904e991bf78eea8c7ad44a3a513f664d58ca3a32b1e45158/merged/var/lib/ceph/osd/ceph-5 supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:34 localhost podman[31897]: 2025-11-28 07:48:34.636321607 +0000 UTC m=+0.240707248 container init 3cc0785e7c356c27f22f4de7b8ec16938d24fb01111693f2bb5ee88b61ccf62a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate-test, GIT_BRANCH=main, GIT_CLEAN=True, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, RELEASE=main, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, version=7, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=) Nov 28 02:48:34 localhost podman[31897]: 2025-11-28 07:48:34.647532372 +0000 UTC m=+0.251917993 container start 3cc0785e7c356c27f22f4de7b8ec16938d24fb01111693f2bb5ee88b61ccf62a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate-test, version=7, GIT_CLEAN=True, io.openshift.expose-services=, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, maintainer=Guillaume Abrioux , name=rhceph, ceph=True, vcs-type=git, distribution-scope=public, io.openshift.tags=rhceph ceph) Nov 28 02:48:34 localhost podman[31897]: 2025-11-28 07:48:34.647696936 +0000 UTC m=+0.252082597 container attach 3cc0785e7c356c27f22f4de7b8ec16938d24fb01111693f2bb5ee88b61ccf62a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate-test, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, distribution-scope=public, build-date=2025-09-24T08:57:55, 
io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, maintainer=Guillaume Abrioux , GIT_BRANCH=main, GIT_CLEAN=True, vcs-type=git, com.redhat.component=rhceph-container, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, release=553, ceph=True, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 28 02:48:34 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate-test[32062]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID] Nov 28 02:48:34 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate-test[32062]: [--no-systemd] [--no-tmpfs] Nov 28 02:48:34 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate-test[32062]: ceph-volume activate: error: unrecognized arguments: --bad-option Nov 28 02:48:34 localhost systemd[1]: libpod-3cc0785e7c356c27f22f4de7b8ec16938d24fb01111693f2bb5ee88b61ccf62a.scope: Deactivated successfully. 
Nov 28 02:48:34 localhost podman[31897]: 2025-11-28 07:48:34.873587966 +0000 UTC m=+0.477973597 container died 3cc0785e7c356c27f22f4de7b8ec16938d24fb01111693f2bb5ee88b61ccf62a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate-test, maintainer=Guillaume Abrioux , architecture=x86_64, name=rhceph, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, com.redhat.component=rhceph-container, release=553, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7) Nov 28 02:48:34 localhost systemd[1]: tmp-crun.9ZWNwq.mount: Deactivated successfully. Nov 28 02:48:34 localhost systemd[1]: var-lib-containers-storage-overlay-e1c35257d5e551a4d40add5184be9cafdc10e96519277913cde544a854cc14fa-merged.mount: Deactivated successfully. Nov 28 02:48:34 localhost systemd[1]: var-lib-containers-storage-overlay-7a4e74e855d90d20904e991bf78eea8c7ad44a3a513f664d58ca3a32b1e45158-merged.mount: Deactivated successfully. 
Nov 28 02:48:34 localhost podman[32132]: 2025-11-28 07:48:34.978266476 +0000 UTC m=+0.094066072 container remove 3cc0785e7c356c27f22f4de7b8ec16938d24fb01111693f2bb5ee88b61ccf62a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate-test, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.expose-services=, release=553, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, vcs-type=git, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux ) Nov 28 02:48:34 localhost systemd[1]: libpod-conmon-3cc0785e7c356c27f22f4de7b8ec16938d24fb01111693f2bb5ee88b61ccf62a.scope: Deactivated successfully. Nov 28 02:48:35 localhost systemd[1]: Reloading. Nov 28 02:48:35 localhost systemd-sysv-generator[32193]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 02:48:35 localhost systemd-rc-local-generator[32190]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 28 02:48:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 02:48:35 localhost systemd[1]: Reloading. Nov 28 02:48:35 localhost systemd-rc-local-generator[32228]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 02:48:35 localhost systemd-sysv-generator[32233]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 02:48:35 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : purged_snaps scrub starts Nov 28 02:48:35 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : purged_snaps scrub ok Nov 28 02:48:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 02:48:35 localhost systemd[1]: Starting Ceph osd.5 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1... 
Nov 28 02:48:36 localhost podman[32294]: Nov 28 02:48:36 localhost podman[32294]: 2025-11-28 07:48:36.203881459 +0000 UTC m=+0.074460523 container create 6a080119ce945ea7739d53197fc2390b0cc9e3714c2b432aee35a9d3aa355013 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, com.redhat.component=rhceph-container, version=7, io.openshift.expose-services=, io.buildah.version=1.33.12, name=rhceph, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, RELEASE=main, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, ceph=True, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55) Nov 28 02:48:36 localhost systemd[1]: tmp-crun.BFDco7.mount: Deactivated successfully. Nov 28 02:48:36 localhost systemd[1]: Started libcrun container. 
Nov 28 02:48:36 localhost podman[32294]: 2025-11-28 07:48:36.171493986 +0000 UTC m=+0.042073100 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 02:48:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8edaf488f4faecc9741d4f49b8ba64db92db3c1345008684fdb91ab4b8869b56/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8edaf488f4faecc9741d4f49b8ba64db92db3c1345008684fdb91ab4b8869b56/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8edaf488f4faecc9741d4f49b8ba64db92db3c1345008684fdb91ab4b8869b56/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8edaf488f4faecc9741d4f49b8ba64db92db3c1345008684fdb91ab4b8869b56/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8edaf488f4faecc9741d4f49b8ba64db92db3c1345008684fdb91ab4b8869b56/merged/var/lib/ceph/osd/ceph-5 supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:36 localhost podman[32294]: 2025-11-28 07:48:36.334562209 +0000 UTC m=+0.205141273 container init 6a080119ce945ea7739d53197fc2390b0cc9e3714c2b432aee35a9d3aa355013 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, release=553, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, 
io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vcs-type=git, ceph=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., RELEASE=main, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, version=7, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 28 02:48:36 localhost podman[32294]: 2025-11-28 07:48:36.347529239 +0000 UTC m=+0.218108333 container start 6a080119ce945ea7739d53197fc2390b0cc9e3714c2b432aee35a9d3aa355013 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_CLEAN=True, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, vcs-type=git, RELEASE=main, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , version=7, io.openshift.expose-services=, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc.) 
Nov 28 02:48:36 localhost podman[32294]: 2025-11-28 07:48:36.347841367 +0000 UTC m=+0.218420441 container attach 6a080119ce945ea7739d53197fc2390b0cc9e3714c2b432aee35a9d3aa355013 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, ceph=True, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, architecture=x86_64, RELEASE=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux ) Nov 28 02:48:36 localhost ceph-osd[31557]: osd.2 0 done with init, starting boot process Nov 28 02:48:36 localhost ceph-osd[31557]: osd.2 0 start_boot Nov 28 02:48:36 localhost ceph-osd[31557]: osd.2 0 maybe_override_options_for_qos osd_max_backfills set to 1 Nov 28 02:48:36 localhost ceph-osd[31557]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active set to 0 Nov 28 02:48:36 localhost ceph-osd[31557]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3 Nov 28 02:48:36 localhost ceph-osd[31557]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10 Nov 28 02:48:36 localhost ceph-osd[31557]: osd.2 0 bench count 12288000 
bsize 4 KiB Nov 28 02:48:37 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate[32308]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 Nov 28 02:48:37 localhost bash[32294]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 Nov 28 02:48:37 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate[32308]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-5 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1 Nov 28 02:48:37 localhost bash[32294]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-5 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1 Nov 28 02:48:37 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate[32308]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1 Nov 28 02:48:37 localhost bash[32294]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1 Nov 28 02:48:37 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate[32308]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 Nov 28 02:48:37 localhost bash[32294]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 Nov 28 02:48:37 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate[32308]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-5/block Nov 28 02:48:37 localhost bash[32294]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-5/block Nov 28 02:48:37 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate[32308]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 Nov 28 02:48:37 localhost bash[32294]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 Nov 28 02:48:37 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate[32308]: --> ceph-volume raw activate successful for osd ID: 5 Nov 28 02:48:37 localhost bash[32294]: --> 
ceph-volume raw activate successful for osd ID: 5 Nov 28 02:48:37 localhost systemd[1]: libpod-6a080119ce945ea7739d53197fc2390b0cc9e3714c2b432aee35a9d3aa355013.scope: Deactivated successfully. Nov 28 02:48:37 localhost podman[32294]: 2025-11-28 07:48:37.167772322 +0000 UTC m=+1.038351396 container died 6a080119ce945ea7739d53197fc2390b0cc9e3714c2b432aee35a9d3aa355013 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.expose-services=, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.buildah.version=1.33.12, vcs-type=git, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 28 02:48:37 localhost systemd[1]: var-lib-containers-storage-overlay-8edaf488f4faecc9741d4f49b8ba64db92db3c1345008684fdb91ab4b8869b56-merged.mount: Deactivated successfully. 
Nov 28 02:48:37 localhost podman[32429]: 2025-11-28 07:48:37.292386158 +0000 UTC m=+0.110337504 container remove 6a080119ce945ea7739d53197fc2390b0cc9e3714c2b432aee35a9d3aa355013 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5-activate, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, release=553, io.openshift.expose-services=, ceph=True, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, distribution-scope=public, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_CLEAN=True, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7) Nov 28 02:48:37 localhost podman[32488]: Nov 28 02:48:37 localhost podman[32488]: 2025-11-28 07:48:37.669800569 +0000 UTC m=+0.087395282 container create 4f3622bdb38481233795ae3180a50069a5087f8f4c9c12f799952a3c52cc3fc6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5, ceph=True, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, 
description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux , GIT_BRANCH=main, GIT_CLEAN=True, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 28 02:48:37 localhost podman[32488]: 2025-11-28 07:48:37.620839054 +0000 UTC m=+0.038433817 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 02:48:37 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e5e613a02c56da4c93d1e8a8188914bb612ea02364afe0c8ca7263373eb0ad5/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:37 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e5e613a02c56da4c93d1e8a8188914bb612ea02364afe0c8ca7263373eb0ad5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:37 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e5e613a02c56da4c93d1e8a8188914bb612ea02364afe0c8ca7263373eb0ad5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:37 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e5e613a02c56da4c93d1e8a8188914bb612ea02364afe0c8ca7263373eb0ad5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:37 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e5e613a02c56da4c93d1e8a8188914bb612ea02364afe0c8ca7263373eb0ad5/merged/var/lib/ceph/osd/ceph-5 supports timestamps until 2038 (0x7fffffff) Nov 
28 02:48:37 localhost podman[32488]: 2025-11-28 07:48:37.815111681 +0000 UTC m=+0.232706414 container init 4f3622bdb38481233795ae3180a50069a5087f8f4c9c12f799952a3c52cc3fc6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5, distribution-scope=public, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, release=553, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, name=rhceph, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, architecture=x86_64) Nov 28 02:48:37 localhost podman[32488]: 2025-11-28 07:48:37.848016117 +0000 UTC m=+0.265610860 container start 4f3622bdb38481233795ae3180a50069a5087f8f4c9c12f799952a3c52cc3fc6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, 
io.k8s.description=Red Hat Ceph Storage 7, ceph=True, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, release=553, distribution-scope=public, RELEASE=main, io.openshift.tags=rhceph ceph) Nov 28 02:48:37 localhost bash[32488]: 4f3622bdb38481233795ae3180a50069a5087f8f4c9c12f799952a3c52cc3fc6 Nov 28 02:48:37 localhost systemd[1]: Started Ceph osd.5 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1. Nov 28 02:48:37 localhost ceph-osd[32506]: set uid:gid to 167:167 (ceph:ceph) Nov 28 02:48:37 localhost ceph-osd[32506]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-osd, pid 2 Nov 28 02:48:37 localhost ceph-osd[32506]: pidfile_write: ignore empty --pid-file Nov 28 02:48:37 localhost ceph-osd[32506]: bdev(0x558439170e00 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block Nov 28 02:48:37 localhost ceph-osd[32506]: bdev(0x558439170e00 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument Nov 28 02:48:37 localhost ceph-osd[32506]: bdev(0x558439170e00 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Nov 28 02:48:37 localhost ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Nov 28 02:48:37 localhost ceph-osd[32506]: bdev(0x558439171180 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block Nov 28 02:48:37 localhost ceph-osd[32506]: bdev(0x558439171180 
/var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument Nov 28 02:48:37 localhost ceph-osd[32506]: bdev(0x558439171180 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Nov 28 02:48:37 localhost ceph-osd[32506]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-5/block size 7.0 GiB Nov 28 02:48:37 localhost ceph-osd[32506]: bdev(0x558439171180 /var/lib/ceph/osd/ceph-5/block) close Nov 28 02:48:38 localhost ceph-osd[32506]: bdev(0x558439170e00 /var/lib/ceph/osd/ceph-5/block) close Nov 28 02:48:38 localhost ceph-osd[32506]: starting osd.5 osd_data /var/lib/ceph/osd/ceph-5 /var/lib/ceph/osd/ceph-5/journal Nov 28 02:48:38 localhost ceph-osd[32506]: load: jerasure load: lrc Nov 28 02:48:38 localhost ceph-osd[32506]: bdev(0x558439170e00 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block Nov 28 02:48:38 localhost ceph-osd[32506]: bdev(0x558439170e00 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument Nov 28 02:48:38 localhost ceph-osd[32506]: bdev(0x558439170e00 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Nov 28 02:48:38 localhost ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Nov 28 02:48:38 localhost ceph-osd[32506]: bdev(0x558439170e00 /var/lib/ceph/osd/ceph-5/block) close Nov 28 02:48:38 localhost podman[32597]: Nov 28 02:48:38 localhost ceph-osd[32506]: bdev(0x558439170e00 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block Nov 28 02:48:38 localhost ceph-osd[32506]: bdev(0x558439170e00 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument Nov 28 02:48:38 localhost 
ceph-osd[32506]: bdev(0x558439170e00 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Nov 28 02:48:38 localhost ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Nov 28 02:48:38 localhost ceph-osd[32506]: bdev(0x558439170e00 /var/lib/ceph/osd/ceph-5/block) close Nov 28 02:48:38 localhost podman[32597]: 2025-11-28 07:48:38.718857025 +0000 UTC m=+0.097902989 container create 60208db6d0ac2f25afd9c2f81eb8933eb34be3c31a6a8ec5fe39fb46c3405918 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_gauss, distribution-scope=public, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, io.openshift.tags=rhceph ceph, version=7, io.buildah.version=1.33.12, GIT_CLEAN=True, vcs-type=git, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=) Nov 28 02:48:38 localhost podman[32597]: 2025-11-28 07:48:38.666616998 +0000 UTC m=+0.045662962 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 02:48:38 localhost systemd[1]: Started libpod-conmon-60208db6d0ac2f25afd9c2f81eb8933eb34be3c31a6a8ec5fe39fb46c3405918.scope. 
Nov 28 02:48:38 localhost systemd[1]: Started libcrun container. Nov 28 02:48:38 localhost podman[32597]: 2025-11-28 07:48:38.849757902 +0000 UTC m=+0.228803846 container init 60208db6d0ac2f25afd9c2f81eb8933eb34be3c31a6a8ec5fe39fb46c3405918 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_gauss, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, CEPH_POINT_RELEASE=, release=553, GIT_CLEAN=True, GIT_BRANCH=main, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 28 02:48:38 localhost podman[32597]: 2025-11-28 07:48:38.861084989 +0000 UTC m=+0.240130933 container start 60208db6d0ac2f25afd9c2f81eb8933eb34be3c31a6a8ec5fe39fb46c3405918 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_gauss, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, RELEASE=main, CEPH_POINT_RELEASE=, 
io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, architecture=x86_64, name=rhceph, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, distribution-scope=public, release=553) Nov 28 02:48:38 localhost podman[32597]: 2025-11-28 07:48:38.861393087 +0000 UTC m=+0.240439101 container attach 60208db6d0ac2f25afd9c2f81eb8933eb34be3c31a6a8ec5fe39fb46c3405918 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_gauss, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, release=553, architecture=x86_64, io.openshift.expose-services=, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, vendor=Red Hat, Inc., version=7, io.openshift.tags=rhceph ceph, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container) Nov 28 02:48:38 localhost gracious_gauss[32616]: 167 167 Nov 28 02:48:38 localhost systemd[1]: 
libpod-60208db6d0ac2f25afd9c2f81eb8933eb34be3c31a6a8ec5fe39fb46c3405918.scope: Deactivated successfully. Nov 28 02:48:38 localhost podman[32597]: 2025-11-28 07:48:38.870119779 +0000 UTC m=+0.249165783 container died 60208db6d0ac2f25afd9c2f81eb8933eb34be3c31a6a8ec5fe39fb46c3405918 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_gauss, ceph=True, GIT_BRANCH=main, RELEASE=main, build-date=2025-09-24T08:57:55, release=553, GIT_CLEAN=True, vcs-type=git, version=7, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux ) Nov 28 02:48:38 localhost ceph-osd[32506]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second Nov 28 02:48:38 localhost ceph-osd[32506]: osd.5:0.OSDShard using op scheduler mclock_scheduler, cutoff=196 Nov 28 02:48:38 localhost ceph-osd[32506]: bdev(0x558439170e00 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block Nov 28 02:48:38 localhost ceph-osd[32506]: bdev(0x558439170e00 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid 
argument Nov 28 02:48:38 localhost ceph-osd[32506]: bdev(0x558439170e00 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Nov 28 02:48:38 localhost ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Nov 28 02:48:38 localhost ceph-osd[32506]: bdev(0x558439171180 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block Nov 28 02:48:38 localhost ceph-osd[32506]: bdev(0x558439171180 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument Nov 28 02:48:38 localhost ceph-osd[32506]: bdev(0x558439171180 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Nov 28 02:48:38 localhost ceph-osd[32506]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-5/block size 7.0 GiB Nov 28 02:48:38 localhost ceph-osd[32506]: bluefs mount Nov 28 02:48:38 localhost ceph-osd[32506]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000 Nov 28 02:48:38 localhost ceph-osd[32506]: bluefs mount shared_bdev_used = 0 Nov 28 02:48:38 localhost ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: RocksDB version: 7.9.2 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Git sha 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Compile date 2025-09-23 00:00:00 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: DB SUMMARY Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: DB Session ID: QIG9JK7FL3F9WYN4LANW Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: CURRENT file: CURRENT Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: IDENTITY file: IDENTITY Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: MANIFEST file: 
MANIFEST-000032 size: 1007 Bytes Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: SST files in db.slow dir, Total Num: 0, files: Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.error_if_exists: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.create_if_missing: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.paranoid_checks: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.flush_verify_memtable_count: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.env: 0x558439404cb0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.fs: LegacyFileSystem Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.info_log: 0x55843a102740 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_file_opening_threads: 16 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.statistics: (nil) Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.use_fsync: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_log_file_size: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_manifest_file_size: 1073741824 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.log_file_time_to_roll: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.keep_log_file_num: 1000 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.recycle_log_file_num: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.allow_fallocate: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.allow_mmap_reads: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: 
Options.allow_mmap_writes: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.use_direct_reads: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.create_missing_column_families: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.db_log_dir: Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.wal_dir: db.wal Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.table_cache_numshardbits: 6 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.WAL_ttl_seconds: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.WAL_size_limit_MB: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.manifest_preallocation_size: 4194304 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.is_fd_close_on_exec: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.advise_random_on_open: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.db_write_buffer_size: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.write_buffer_manager: 0x55843915a140 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.access_hint_on_compaction_start: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.random_access_max_buffer_size: 1048576 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.use_adaptive_mutex: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.rate_limiter: (nil) Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.wal_recovery_mode: 2 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.enable_thread_tracking: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.enable_pipelined_write: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: 
rocksdb: Options.unordered_write: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.allow_concurrent_memtable_write: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.write_thread_max_yield_usec: 100 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.write_thread_slow_yield_usec: 3 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.row_cache: None Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.wal_filter: None Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.avoid_flush_during_recovery: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.allow_ingest_behind: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.two_write_queues: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.manual_wal_flush: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.wal_compression: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.atomic_flush: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.persist_stats_to_disk: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.write_dbid_to_manifest: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.log_readahead_size: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.file_checksum_gen_factory: Unknown Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.best_efforts_recovery: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.allow_data_in_errors: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.db_host_id: __hostname__ Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: 
Options.enforce_single_del_contracts: true Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_background_jobs: 4 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_background_compactions: -1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_subcompactions: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.avoid_flush_during_shutdown: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.writable_file_max_buffer_size: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.delayed_write_rate : 16777216 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_total_wal_size: 1073741824 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.stats_dump_period_sec: 600 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.stats_persist_period_sec: 600 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.stats_history_buffer_size: 1048576 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_open_files: -1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.bytes_per_sync: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.wal_bytes_per_sync: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.strict_bytes_per_sync: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compaction_readahead_size: 2097152 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_background_flushes: -1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Compression algorithms supported: Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: #011kZSTD supported: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: #011kXpressCompression supported: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: #011kBZip2Compression supported: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Nov 28 02:48:38 
localhost ceph-osd[32506]: rocksdb: #011kLZ4Compression supported: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: #011kZlibCompression supported: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: #011kLZ4HCCompression supported: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: #011kSnappyCompression supported: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Fast CRC32 supported: Supported on x86 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: DMutex implementation: pthread_mutex_t Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default) Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.merge_operator: .T:int64_array.b:bitwise_xor Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter: None Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter_factory: None Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.sst_partitioner_factory: None Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.memtable_factory: SkipListFactory Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.table_factory: BlockBasedTable Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55843a102900)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 
pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x558439148850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.write_buffer_size: 16777216 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number: 64 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compression: LZ4 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression: Disabled Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.prefix_extractor: nullptr Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.num_levels: 7 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.window_bits: 
-14 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.window_bits: -14 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.level: 32767 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.strategy: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.enabled: false Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: 
Options.level0_stop_writes_trigger: 36 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_base: 67108864 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_multiplier: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.arena_block_size: 1048576 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.disable_auto_compactions: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: 
Options.compaction_style: kCompactionStyleLevel Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.table_properties_collectors: Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_support: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.memtable_huge_page_size: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.bloom_locality: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_successive_merges: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.paranoid_file_checks: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: 
rocksdb: Options.force_consistency_checks: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.report_bg_io_stats: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.ttl: 2592000 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_files: false Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.min_blob_size: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.blob_file_size: 268435456 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.blob_compression_type: NoCompression Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_garbage_collection: false Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.blob_file_starting_level: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0) Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]: Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.merge_operator: None Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter: None Nov 28 02:48:38 localhost 
ceph-osd[32506]: rocksdb: Options.compaction_filter_factory: None Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.sst_partitioner_factory: None Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.memtable_factory: SkipListFactory Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.table_factory: BlockBasedTable Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55843a102900)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x558439148850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.write_buffer_size: 16777216 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number: 64 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compression: LZ4 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression: Disabled Nov 28 02:48:38 localhost 
ceph-osd[32506]: rocksdb: Options.prefix_extractor: nullptr Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.num_levels: 7 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.window_bits: -14 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.level: 32767 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.strategy: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:38 localhost 
ceph-osd[32506]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.enabled: false Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_base: 67108864 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_multiplier: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 28 02:48:38 localhost ceph-osd[32506]: 
rocksdb: Options.max_compaction_bytes: 1677721600 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.arena_block_size: 1048576 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 28 02:48:38 localhost ceph-osd[32506]: rocksdb: Options.disable_auto_compactions: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_support: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.inplace_update_num_locks: 10000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_huge_page_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bloom_locality: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_successive_merges: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.paranoid_file_checks: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.force_consistency_checks: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.report_bg_io_stats: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ttl: 2592000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_files: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_blob_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_size: 268435456 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compression_type: NoCompression Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_garbage_collection: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.blob_file_starting_level: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1) Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]: Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.merge_operator: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter_factory: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.sst_partitioner_factory: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_factory: SkipListFactory Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_factory: BlockBasedTable Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55843a102900)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x558439148850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 
filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.write_buffer_size: 16777216 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number: 64 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression: LZ4 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression: Disabled Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.prefix_extractor: nullptr Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.num_levels: 7 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 28 02:48:39 localhost 
ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.window_bits: -14 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.level: 32767 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.strategy: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.enabled: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_base: 67108864 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_multiplier: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.arena_block_size: 1048576 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.disable_auto_compactions: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.compaction_options_universal.compression_size_percent: -1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_support: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_huge_page_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bloom_locality: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_successive_merges: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.paranoid_file_checks: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.force_consistency_checks: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.report_bg_io_stats: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ttl: 2592000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.enable_blob_files: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_blob_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_size: 268435456 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compression_type: NoCompression Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_garbage_collection: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_starting_level: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2) Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]: Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.merge_operator: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter_factory: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.sst_partitioner_factory: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_factory: SkipListFactory Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_factory: BlockBasedTable Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55843a102900)#012 cache_index_and_filter_blocks: 1#012 
cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x558439148850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.write_buffer_size: 16777216 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number: 64 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression: LZ4 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression: Disabled Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.prefix_extractor: nullptr Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.num_levels: 7 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.window_bits: -14 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.level: 32767 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.strategy: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.enabled: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.level0_slowdown_writes_trigger: 20 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_base: 67108864 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_multiplier: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.arena_block_size: 1048576 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.disable_auto_compactions: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_support: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_huge_page_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bloom_locality: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_successive_merges: 0 Nov 28 02:48:39 localhost 
ceph-osd[32506]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.paranoid_file_checks: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.force_consistency_checks: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.report_bg_io_stats: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ttl: 2592000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_files: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_blob_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_size: 268435456 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compression_type: NoCompression Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_garbage_collection: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_starting_level: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0) Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]: Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 28 02:48:39 
localhost ceph-osd[32506]: rocksdb: Options.merge_operator: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter_factory: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.sst_partitioner_factory: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_factory: SkipListFactory Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_factory: BlockBasedTable Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55843a102900)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x558439148850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.write_buffer_size: 16777216 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number: 64 Nov 28 02:48:39 localhost 
ceph-osd[32506]: rocksdb: Options.compression: LZ4 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression: Disabled Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.prefix_extractor: nullptr Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.num_levels: 7 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.window_bits: -14 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.level: 32767 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.strategy: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.compression_opts.max_dict_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.enabled: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_base: 67108864 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_multiplier: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.arena_block_size: 1048576 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.disable_auto_compactions: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 
32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_support: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_huge_page_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bloom_locality: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_successive_merges: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.paranoid_file_checks: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.force_consistency_checks: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.report_bg_io_stats: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ttl: 2592000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_files: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_blob_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_size: 268435456 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compression_type: NoCompression Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_garbage_collection: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.blob_garbage_collection_force_threshold: 1.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_starting_level: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1) Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]: Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.merge_operator: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter_factory: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.sst_partitioner_factory: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_factory: SkipListFactory Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_factory: BlockBasedTable Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55843a102900)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x558439148850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 
block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.write_buffer_size: 16777216 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number: 64 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression: LZ4 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression: Disabled Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.prefix_extractor: nullptr Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.num_levels: 7 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:39 localhost 
ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.window_bits: -14 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.level: 32767 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.strategy: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.enabled: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_base: 67108864 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_multiplier: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.max_bytes_for_level_multiplier: 8.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.arena_block_size: 1048576 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.disable_auto_compactions: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 28 02:48:39 
localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_support: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_huge_page_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bloom_locality: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_successive_merges: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.paranoid_file_checks: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.force_consistency_checks: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.report_bg_io_stats: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ttl: 2592000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 
28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_files: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_blob_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_size: 268435456 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compression_type: NoCompression Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_garbage_collection: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_starting_level: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2) Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]: Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.merge_operator: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter_factory: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.sst_partitioner_factory: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_factory: SkipListFactory Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_factory: BlockBasedTable Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: table_factory 
options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55843a102900)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x558439148850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.write_buffer_size: 16777216 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number: 64 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression: LZ4 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression: Disabled Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.prefix_extractor: nullptr Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.num_levels: 7 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.max_write_buffer_number_to_maintain: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.window_bits: -14 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.level: 32767 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.strategy: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.enabled: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:39 localhost 
ceph-osd[32506]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_base: 67108864 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_multiplier: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.arena_block_size: 1048576 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 28 02:48:39 localhost ceph-osd[32506]: 
rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.disable_auto_compactions: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_support: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_huge_page_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.bloom_locality: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_successive_merges: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.paranoid_file_checks: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.force_consistency_checks: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.report_bg_io_stats: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ttl: 2592000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_files: false
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_blob_size: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_size: 268435456
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compression_type: NoCompression
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_starting_level: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.merge_operator: None
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter: None
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter_factory: None
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.sst_partitioner_factory: None
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_factory: BlockBasedTable
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55843a102b20)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5584391482d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.write_buffer_size: 16777216
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number: 64
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression: LZ4
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression: Disabled
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.prefix_extractor: nullptr
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.num_levels: 7
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.window_bits: -14
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.level: 32767
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.strategy: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.enabled: false
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_base: 67108864
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_multiplier: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.arena_block_size: 1048576
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.disable_auto_compactions: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_support: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_huge_page_size: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bloom_locality: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_successive_merges: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.paranoid_file_checks: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.force_consistency_checks: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.report_bg_io_stats: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ttl: 2592000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_files: false
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_blob_size: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_size: 268435456
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compression_type: NoCompression
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_starting_level: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.merge_operator: None
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter: None
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter_factory: None
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.sst_partitioner_factory: None
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_factory: BlockBasedTable
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55843a102b20)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5584391482d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.write_buffer_size: 16777216
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number: 64
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression: LZ4
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression: Disabled
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.prefix_extractor: nullptr
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.num_levels: 7
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.window_bits: -14
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.level: 32767
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.strategy: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.enabled: false
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_base: 67108864
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_multiplier: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.arena_block_size: 1048576
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.disable_auto_compactions: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_support: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_huge_page_size: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bloom_locality: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_successive_merges: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.paranoid_file_checks: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.force_consistency_checks: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.report_bg_io_stats: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ttl: 2592000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_files: false
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_blob_size: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_size: 268435456
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compression_type: NoCompression
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_starting_level: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.merge_operator: None
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter: None
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter_factory: None
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.sst_partitioner_factory: None
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_factory: BlockBasedTable
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55843a102b20)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5584391482d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.write_buffer_size: 16777216
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number: 64
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression: LZ4
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression: Disabled
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.prefix_extractor: nullptr
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.num_levels: 7
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.window_bits: -14
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.level: 32767
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.strategy: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.enabled: false
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_base: 67108864
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_multiplier: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.arena_block_size: 1048576
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.disable_auto_compactions: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_support: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_huge_page_size: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bloom_locality: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_successive_merges: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.paranoid_file_checks: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.force_consistency_checks: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.report_bg_io_stats: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ttl: 2592000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_files: false
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_blob_size: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_size: 268435456
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compression_type: NoCompression
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_starting_level: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 2c33d069-07f2-43c6-8b70-8b37a70b2431
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316119009345, "job": 1, "event": "recovery_started", "wal_files": [31]}
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316119009673, "job": 1, "event": "recovery_finished"}
Nov 28 02:48:39 localhost ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Nov 28 02:48:39 localhost ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _open_super_meta old nid_max 1025
Nov 28 02:48:39 localhost ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _open_super_meta old blobid_max 10240
Nov 28 02:48:39 localhost ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Nov 28 02:48:39 localhost ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _open_super_meta min_alloc_size 0x1000
Nov 28 02:48:39 localhost ceph-osd[32506]: freelist init
Nov 28 02:48:39 localhost ceph-osd[32506]: freelist _read_cfg
Nov 28 02:48:39 localhost ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Nov 28 02:48:39 localhost ceph-osd[32506]: bluefs umount
Nov 28 02:48:39 localhost ceph-osd[32506]: bdev(0x558439171180 /var/lib/ceph/osd/ceph-5/block) close
Nov 28 02:48:39 localhost podman[32621]: 2025-11-28 07:48:39.026529603 +0000 UTC m=+0.138978302 container remove 60208db6d0ac2f25afd9c2f81eb8933eb34be3c31a6a8ec5fe39fb46c3405918 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_gauss, vendor=Red Hat, Inc., version=7, name=rhceph, vcs-type=git, io.buildah.version=1.33.12, RELEASE=main, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, distribution-scope=public, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, io.openshift.tags=rhceph ceph, architecture=x86_64, ceph=True, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 02:48:39 localhost systemd[1]: libpod-conmon-60208db6d0ac2f25afd9c2f81eb8933eb34be3c31a6a8ec5fe39fb46c3405918.scope: Deactivated successfully.
Nov 28 02:48:39 localhost podman[32834]:
Nov 28 02:48:39 localhost systemd[1]: var-lib-containers-storage-overlay-f369eaa6673c2c86c889c1ddac2e0d5d6a7af4d65af83ecf2e6fce15337e7f5e-merged.mount: Deactivated successfully.
Nov 28 02:48:39 localhost podman[32834]: 2025-11-28 07:48:39.257337268 +0000 UTC m=+0.101004498 container create e18c31b9983404861865f964abccc1c24d5e067572f133d39dbb490052eeb2c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_lederberg, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, architecture=x86_64, vendor=Red Hat, Inc., version=7, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, vcs-type=git, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 28 02:48:39 localhost ceph-osd[32506]: bdev(0x558439171180 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block Nov 28 02:48:39 localhost ceph-osd[32506]: bdev(0x558439171180 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument Nov 28 02:48:39 localhost ceph-osd[32506]: bdev(0x558439171180 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Nov 28 02:48:39 localhost ceph-osd[32506]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-5/block size 7.0 GiB Nov 28 02:48:39 localhost ceph-osd[32506]: bluefs mount Nov 28 02:48:39 localhost 
ceph-osd[32506]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000 Nov 28 02:48:39 localhost ceph-osd[32506]: bluefs mount shared_bdev_used = 4718592 Nov 28 02:48:39 localhost ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: RocksDB version: 7.9.2 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Git sha 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Compile date 2025-09-23 00:00:00 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: DB SUMMARY Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: DB Session ID: QIG9JK7FL3F9WYN4LANX Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: CURRENT file: CURRENT Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: IDENTITY file: IDENTITY Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: MANIFEST file: MANIFEST-000032 size: 1007 Bytes Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: SST files in db.slow dir, Total Num: 0, files: Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.error_if_exists: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.create_if_missing: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.paranoid_checks: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.flush_verify_memtable_count: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.env: 0x558439405ea0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.fs: LegacyFileSystem Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.info_log: 0x558439210a40 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_file_opening_threads: 16 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.statistics: (nil) Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.use_fsync: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_log_file_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_manifest_file_size: 1073741824 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.log_file_time_to_roll: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.keep_log_file_num: 1000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.recycle_log_file_num: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.allow_fallocate: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.allow_mmap_reads: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.allow_mmap_writes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.use_direct_reads: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.create_missing_column_families: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.db_log_dir: Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.wal_dir: db.wal Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_cache_numshardbits: 6 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.WAL_ttl_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.WAL_size_limit_MB: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.manifest_preallocation_size: 4194304 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.is_fd_close_on_exec: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.advise_random_on_open: 1 Nov 28 02:48:39 
localhost ceph-osd[32506]: rocksdb: Options.db_write_buffer_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.write_buffer_manager: 0x55843915b540 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.access_hint_on_compaction_start: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.random_access_max_buffer_size: 1048576 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.use_adaptive_mutex: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.rate_limiter: (nil) Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.wal_recovery_mode: 2 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_thread_tracking: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_pipelined_write: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.unordered_write: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.allow_concurrent_memtable_write: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.write_thread_max_yield_usec: 100 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.write_thread_slow_yield_usec: 3 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.row_cache: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.wal_filter: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.avoid_flush_during_recovery: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.allow_ingest_behind: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.two_write_queues: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.manual_wal_flush: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.wal_compression: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.atomic_flush: 0 Nov 28 02:48:39 localhost 
ceph-osd[32506]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.persist_stats_to_disk: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.write_dbid_to_manifest: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.log_readahead_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.file_checksum_gen_factory: Unknown Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.best_efforts_recovery: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.allow_data_in_errors: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.db_host_id: __hostname__ Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enforce_single_del_contracts: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_background_jobs: 4 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_background_compactions: -1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_subcompactions: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.avoid_flush_during_shutdown: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.writable_file_max_buffer_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.delayed_write_rate : 16777216 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_total_wal_size: 1073741824 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.stats_dump_period_sec: 600 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.stats_persist_period_sec: 600 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.stats_history_buffer_size: 1048576 Nov 28 02:48:39 localhost 
ceph-osd[32506]: rocksdb: Options.max_open_files: -1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bytes_per_sync: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.wal_bytes_per_sync: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.strict_bytes_per_sync: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_readahead_size: 2097152 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_background_flushes: -1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Compression algorithms supported: Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: #011kZSTD supported: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: #011kXpressCompression supported: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: #011kBZip2Compression supported: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: #011kLZ4Compression supported: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: #011kZlibCompression supported: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: #011kLZ4HCCompression supported: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: #011kSnappyCompression supported: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Fast CRC32 supported: Supported on x86 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: DMutex implementation: pthread_mutex_t Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default) Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.merge_operator: .T:int64_array.b:bitwise_xor Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter_factory: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.sst_partitioner_factory: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_factory: SkipListFactory Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_factory: BlockBasedTable Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5584392117e0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x558439149610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.write_buffer_size: 16777216 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number: 64 Nov 28 02:48:39 localhost 
ceph-osd[32506]: rocksdb: Options.compression: LZ4 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression: Disabled Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.prefix_extractor: nullptr Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.num_levels: 7 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.window_bits: -14 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.level: 32767 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.strategy: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.compression_opts.max_dict_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.enabled: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_base: 67108864 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_multiplier: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.arena_block_size: 1048576 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.disable_auto_compactions: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_properties_collectors: Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.inplace_update_support: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_huge_page_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bloom_locality: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_successive_merges: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.paranoid_file_checks: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.force_consistency_checks: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.report_bg_io_stats: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ttl: 2592000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_files: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_blob_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_size: 268435456 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compression_type: NoCompression Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_garbage_collection: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.blob_compaction_readahead_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_starting_level: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0) Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]: Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.merge_operator: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter_factory: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.sst_partitioner_factory: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_factory: SkipListFactory Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_factory: BlockBasedTable Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5584392117e0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x558439149610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 
1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.write_buffer_size: 16777216 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number: 64 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression: LZ4 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression: Disabled Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.prefix_extractor: nullptr Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.num_levels: 7 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 28 02:48:39 localhost 
ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.window_bits: -14 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.level: 32767 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.strategy: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.enabled: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_base: 67108864 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_multiplier: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.arena_block_size: 1048576
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.disable_auto_compactions: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_support: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_huge_page_size: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bloom_locality: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_successive_merges: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.paranoid_file_checks: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.force_consistency_checks: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.report_bg_io_stats: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ttl: 2592000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_files: false
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_blob_size: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_size: 268435456
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compression_type: NoCompression
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_starting_level: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.merge_operator: None
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter: None
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter_factory: None
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.sst_partitioner_factory: None
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_factory: BlockBasedTable
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5584392117e0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x558439149610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.write_buffer_size: 16777216
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number: 64
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression: LZ4
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression: Disabled
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.prefix_extractor: nullptr
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.num_levels: 7
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.window_bits: -14
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.level: 32767
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.strategy: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.enabled: false
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_base: 67108864
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_multiplier: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.arena_block_size: 1048576
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.disable_auto_compactions: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_support: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_huge_page_size: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bloom_locality: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_successive_merges: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.paranoid_file_checks: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.force_consistency_checks: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.report_bg_io_stats: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ttl: 2592000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_files: false
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_blob_size: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_size: 268435456
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compression_type: NoCompression
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_starting_level: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.merge_operator: None
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter: None
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter_factory: None
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.sst_partitioner_factory: None
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_factory: BlockBasedTable
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5584392117e0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x558439149610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.write_buffer_size: 16777216
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number: 64
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression: LZ4
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression: Disabled
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.prefix_extractor: nullptr
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.num_levels: 7
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.window_bits: -14
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.level: 32767
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.strategy: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.enabled: false
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_base: 67108864
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_multiplier: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.arena_block_size: 1048576
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.disable_auto_compactions: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_support: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_huge_page_size: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bloom_locality: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_successive_merges: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.paranoid_file_checks: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.force_consistency_checks: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.report_bg_io_stats: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ttl: 2592000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_files: false
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_blob_size: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_size: 268435456
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compression_type: NoCompression
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_starting_level: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.merge_operator: None
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter: None
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter_factory: None
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.sst_partitioner_factory: None
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_factory: BlockBasedTable
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5584392117e0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x558439149610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.write_buffer_size: 16777216
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number: 64
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression: LZ4
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression: Disabled
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.prefix_extractor: nullptr
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.num_levels: 7
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.window_bits: -14
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.level: 32767
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.strategy: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.enabled: false
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_base: 67108864
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_multiplier: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.arena_block_size: 1048576
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.disable_auto_compactions: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_support: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_huge_page_size: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bloom_locality: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_successive_merges: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.paranoid_file_checks: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.force_consistency_checks: 1
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.report_bg_io_stats: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ttl: 2592000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_files: false
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_blob_size: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_size: 268435456
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compression_type: NoCompression
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_starting_level: 0
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.merge_operator: None
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter: None
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter_factory: None
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.sst_partitioner_factory: None
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_factory: BlockBasedTable
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5584392117e0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x558439149610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.write_buffer_size: 16777216
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number: 64
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression: LZ4
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression: Disabled
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.prefix_extractor: nullptr
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.num_levels: 7
Nov 28 02:48:39 localhost
ceph-osd[32506]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.window_bits: -14 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.level: 32767 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.strategy: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.enabled: false Nov 28 
02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_base: 67108864 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_multiplier: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.arena_block_size: 1048576 Nov 28 02:48:39 localhost 
ceph-osd[32506]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.disable_auto_compactions: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_support: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 28 02:48:39 localhost 
ceph-osd[32506]: rocksdb: Options.memtable_huge_page_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bloom_locality: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_successive_merges: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.paranoid_file_checks: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.force_consistency_checks: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.report_bg_io_stats: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ttl: 2592000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_files: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_blob_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_size: 268435456 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compression_type: NoCompression Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_garbage_collection: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_starting_level: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, 
name: p-2) Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]: Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.merge_operator: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter_factory: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.sst_partitioner_factory: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_factory: SkipListFactory Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_factory: BlockBasedTable Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5584392117e0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x558439149610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 
initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.write_buffer_size: 16777216 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number: 64 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression: LZ4 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression: Disabled Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.prefix_extractor: nullptr Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.num_levels: 7 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:39 localhost ceph-osd[32506]: 
rocksdb: Options.compression_opts.window_bits: -14 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.level: 32767 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.strategy: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.enabled: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_base: 67108864 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_multiplier: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 
Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.arena_block_size: 1048576 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.disable_auto_compactions: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_support: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_huge_page_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bloom_locality: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_successive_merges: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.paranoid_file_checks: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.force_consistency_checks: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.report_bg_io_stats: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ttl: 2592000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_files: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_blob_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_size: 268435456 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.blob_compression_type: NoCompression Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_garbage_collection: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_starting_level: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0) Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]: Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.merge_operator: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter_factory: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.sst_partitioner_factory: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_factory: SkipListFactory Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_factory: BlockBasedTable Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x558439210f60)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 
checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5584391482d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.write_buffer_size: 16777216 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number: 64 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression: LZ4 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression: Disabled Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.prefix_extractor: nullptr Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.num_levels: 7 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.bottommost_compression_opts.strategy: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.window_bits: -14 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.level: 32767 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.strategy: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.enabled: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_base: 67108864 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.target_file_size_multiplier: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.arena_block_size: 1048576 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.disable_auto_compactions: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.compaction_options_universal.size_ratio: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_support: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_huge_page_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bloom_locality: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_successive_merges: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.paranoid_file_checks: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.force_consistency_checks: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.report_bg_io_stats: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ttl: 2592000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_files: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_blob_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_size: 268435456 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compression_type: NoCompression Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_garbage_collection: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_starting_level: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1) Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]: Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.merge_operator: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter_factory: None Nov 28 02:48:39 localhost ceph-osd[32506]: 
rocksdb: Options.sst_partitioner_factory: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_factory: SkipListFactory Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_factory: BlockBasedTable Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x558439210f60)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5584391482d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.write_buffer_size: 16777216 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number: 64 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression: LZ4 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression: Disabled Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.prefix_extractor: nullptr Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.num_levels: 7 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.window_bits: -14 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.level: 32767 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.strategy: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:39 localhost 
ceph-osd[32506]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.enabled: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 28 02:48:39 localhost podman[32834]: 2025-11-28 07:48:39.204096655 +0000 UTC m=+0.047763895 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_base: 67108864 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_multiplier: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.max_sequential_skip_in_iterations: 8 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.arena_block_size: 1048576 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.disable_auto_compactions: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.inplace_update_support: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_huge_page_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bloom_locality: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_successive_merges: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.paranoid_file_checks: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.force_consistency_checks: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.report_bg_io_stats: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ttl: 2592000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_files: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_blob_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_size: 268435456 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compression_type: NoCompression Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_garbage_collection: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.blob_compaction_readahead_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_starting_level: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2) Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]: Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.merge_operator: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_filter_factory: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.sst_partitioner_factory: None Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_factory: SkipListFactory Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_factory: BlockBasedTable Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x558439210f60)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5584391482d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 
1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.write_buffer_size: 16777216 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number: 64 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression: LZ4 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression: Disabled Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.prefix_extractor: nullptr Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.num_levels: 7 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 28 02:48:39 localhost 
ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.window_bits: -14 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.level: 32767 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.strategy: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.enabled: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_base: 67108864 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.target_file_size_multiplier: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.arena_block_size: 1048576 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.disable_auto_compactions: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_support: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.memtable_huge_page_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.bloom_locality: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.max_successive_merges: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.paranoid_file_checks: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.force_consistency_checks: 1 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.report_bg_io_stats: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.ttl: 2592000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 28 02:48:39 localhost 
ceph-osd[32506]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_files: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.min_blob_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_size: 268435456 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compression_type: NoCompression Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.enable_blob_garbage_collection: false Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.blob_file_starting_level: 0 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L) Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P) Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: 
[db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 2c33d069-07f2-43c6-8b70-8b37a70b2431 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316119306088, "job": 1, "event": "recovery_started", "wal_files": [31]} Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316119314181, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": 
{"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764316119, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2c33d069-07f2-43c6-8b70-8b37a70b2431", "db_session_id": "QIG9JK7FL3F9WYN4LANX", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}} Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316119319536, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1609, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", 
"column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764316119, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2c33d069-07f2-43c6-8b70-8b37a70b2431", "db_session_id": "QIG9JK7FL3F9WYN4LANX", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}} Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316119323907, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764316119, "oldest_key_time": 0, "file_creation_time": 
0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2c33d069-07f2-43c6-8b70-8b37a70b2431", "db_session_id": "QIG9JK7FL3F9WYN4LANX", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}} Nov 28 02:48:39 localhost systemd[1]: Started libpod-conmon-e18c31b9983404861865f964abccc1c24d5e067572f133d39dbb490052eeb2c6.scope. Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764316119356289, "job": 1, "event": "recovery_finished"} Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/version_set.cc:5047] Creating manifest 40 Nov 28 02:48:39 localhost systemd[1]: Started libcrun container. Nov 28 02:48:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ae70a0968f7f760f7aaf3c7e95cf793447e5cc122a7a8d1f8c9cbc3e8197f70/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ae70a0968f7f760f7aaf3c7e95cf793447e5cc122a7a8d1f8c9cbc3e8197f70/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ae70a0968f7f760f7aaf3c7e95cf793447e5cc122a7a8d1f8c9cbc3e8197f70/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:39 localhost podman[32834]: 2025-11-28 07:48:39.41206646 +0000 UTC m=+0.255733700 container init e18c31b9983404861865f964abccc1c24d5e067572f133d39dbb490052eeb2c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_lederberg, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, 
version=7, distribution-scope=public, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, maintainer=Guillaume Abrioux , release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, ceph=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, name=rhceph) Nov 28 02:48:39 localhost systemd[1]: tmp-crun.Jmx6dQ.mount: Deactivated successfully. Nov 28 02:48:39 localhost podman[32834]: 2025-11-28 07:48:39.436608773 +0000 UTC m=+0.280275973 container start e18c31b9983404861865f964abccc1c24d5e067572f133d39dbb490052eeb2c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_lederberg, maintainer=Guillaume Abrioux , release=553, io.openshift.expose-services=, ceph=True, CEPH_POINT_RELEASE=, distribution-scope=public, vcs-type=git, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, 
RELEASE=main, name=rhceph, com.redhat.component=rhceph-container, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7) Nov 28 02:48:39 localhost podman[32834]: 2025-11-28 07:48:39.436959353 +0000 UTC m=+0.280626583 container attach e18c31b9983404861865f964abccc1c24d5e067572f133d39dbb490052eeb2c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_lederberg, version=7, RELEASE=main, name=rhceph, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, ceph=True, description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.33.12, release=553, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55843a2c6380 Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: DB pointer 0x55843a059a00 Nov 28 02:48:39 localhost ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _open_db opened rocksdb path db options 
compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 Nov 28 02:48:39 localhost ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _upgrade_super from 4, latest 4 Nov 28 02:48:39 localhost ceph-osd[32506]: bluestore(/var/lib/ceph/osd/ceph-5) _upgrade_super done Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 28 02:48:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.3 total, 0.3 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.008 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.008 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 
0.01 0.00 1 0.008 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.1 0.01 0.00 1 0.008 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.3 total, 0.3 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x558439149610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.8e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn 
KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.3 total, 0.3 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x558439149610#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.8e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 
KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.3 total, 0.3 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012 Nov 28 02:48:39 localhost ceph-osd[32506]: 
/builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs Nov 28 02:48:39 localhost ceph-osd[32506]: /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello Nov 28 02:48:39 localhost ceph-osd[32506]: _get_class not permitted to load lua Nov 28 02:48:39 localhost ceph-osd[32506]: _get_class not permitted to load sdk Nov 28 02:48:39 localhost ceph-osd[32506]: _get_class not permitted to load test_remote_reads Nov 28 02:48:39 localhost ceph-osd[32506]: osd.5 0 crush map has features 288232575208783872, adjusting msgr requires for clients Nov 28 02:48:39 localhost ceph-osd[32506]: osd.5 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons Nov 28 02:48:39 localhost ceph-osd[32506]: osd.5 0 crush map has features 288232575208783872, adjusting msgr requires for osds Nov 28 02:48:39 localhost ceph-osd[32506]: osd.5 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature Nov 28 02:48:39 localhost ceph-osd[32506]: osd.5 0 load_pgs Nov 28 02:48:39 localhost ceph-osd[32506]: osd.5 0 load_pgs opened 0 pgs Nov 28 02:48:39 localhost ceph-osd[32506]: osd.5 0 log_to_monitors true Nov 28 02:48:39 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5[32502]: 2025-11-28T07:48:39.615+0000 7f777c525a80 -1 osd.5 0 log_to_monitors true Nov 28 02:48:39 localhost hungry_lederberg[33031]: { Nov 28 02:48:39 localhost hungry_lederberg[33031]: "59ca4283-ae21-42b7-993b-9e0e69e2fb94": { Nov 28 02:48:39 localhost hungry_lederberg[33031]: "ceph_fsid": "2c5417c9-00eb-57d5-a565-ddecbc7995c1", Nov 28 02:48:39 localhost hungry_lederberg[33031]: "device": "/dev/mapper/ceph_vg1-ceph_lv1", Nov 28 02:48:39 localhost hungry_lederberg[33031]: "osd_id": 5, Nov 28 02:48:39 localhost hungry_lederberg[33031]: "osd_uuid": "59ca4283-ae21-42b7-993b-9e0e69e2fb94", Nov 28 02:48:39 localhost hungry_lederberg[33031]: "type": "bluestore" Nov 28 02:48:39 localhost hungry_lederberg[33031]: }, Nov 28 02:48:39 localhost 
hungry_lederberg[33031]: "d7af0c01-7a1e-4708-8e50-081c55d3ecd3": { Nov 28 02:48:39 localhost hungry_lederberg[33031]: "ceph_fsid": "2c5417c9-00eb-57d5-a565-ddecbc7995c1", Nov 28 02:48:39 localhost hungry_lederberg[33031]: "device": "/dev/mapper/ceph_vg0-ceph_lv0", Nov 28 02:48:39 localhost hungry_lederberg[33031]: "osd_id": 2, Nov 28 02:48:39 localhost hungry_lederberg[33031]: "osd_uuid": "d7af0c01-7a1e-4708-8e50-081c55d3ecd3", Nov 28 02:48:39 localhost hungry_lederberg[33031]: "type": "bluestore" Nov 28 02:48:39 localhost hungry_lederberg[33031]: } Nov 28 02:48:39 localhost hungry_lederberg[33031]: } Nov 28 02:48:40 localhost systemd[1]: libpod-e18c31b9983404861865f964abccc1c24d5e067572f133d39dbb490052eeb2c6.scope: Deactivated successfully. Nov 28 02:48:40 localhost podman[32834]: 2025-11-28 07:48:40.020145202 +0000 UTC m=+0.863812422 container died e18c31b9983404861865f964abccc1c24d5e067572f133d39dbb490052eeb2c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_lederberg, GIT_CLEAN=True, architecture=x86_64, io.openshift.expose-services=, version=7, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, distribution-scope=public, name=rhceph, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 
7, CEPH_POINT_RELEASE=) Nov 28 02:48:40 localhost podman[33100]: 2025-11-28 07:48:40.11690095 +0000 UTC m=+0.088740285 container remove e18c31b9983404861865f964abccc1c24d5e067572f133d39dbb490052eeb2c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_lederberg, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64, maintainer=Guillaume Abrioux , GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, ceph=True, io.buildah.version=1.33.12, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, release=553, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, version=7, vendor=Red Hat, Inc., name=rhceph, GIT_BRANCH=main) Nov 28 02:48:40 localhost systemd[1]: libpod-conmon-e18c31b9983404861865f964abccc1c24d5e067572f133d39dbb490052eeb2c6.scope: Deactivated successfully. Nov 28 02:48:40 localhost systemd[1]: var-lib-containers-storage-overlay-7ae70a0968f7f760f7aaf3c7e95cf793447e5cc122a7a8d1f8c9cbc3e8197f70-merged.mount: Deactivated successfully. Nov 28 02:48:40 localhost systemd[26286]: Starting Mark boot as successful... Nov 28 02:48:40 localhost systemd[26286]: Finished Mark boot as successful. 
Nov 28 02:48:40 localhost ceph-osd[31557]: osd.2 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 19.279 iops: 4935.372 elapsed_sec: 0.608 Nov 28 02:48:40 localhost ceph-osd[31557]: log_channel(cluster) log [WRN] : OSD bench result of 4935.372076 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd]. Nov 28 02:48:40 localhost ceph-osd[31557]: osd.2 0 waiting for initial osdmap Nov 28 02:48:40 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2[31553]: 2025-11-28T07:48:40.255+0000 7fd091366640 -1 osd.2 0 waiting for initial osdmap Nov 28 02:48:40 localhost ceph-osd[31557]: osd.2 13 crush map has features 288514050185494528, adjusting msgr requires for clients Nov 28 02:48:40 localhost ceph-osd[31557]: osd.2 13 crush map has features 288514050185494528 was 288232575208792577, adjusting msgr requires for mons Nov 28 02:48:40 localhost ceph-osd[31557]: osd.2 13 crush map has features 3314932999778484224, adjusting msgr requires for osds Nov 28 02:48:40 localhost ceph-osd[31557]: osd.2 13 check_osdmap_features require_osd_release unknown -> reef Nov 28 02:48:40 localhost ceph-osd[31557]: osd.2 13 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory Nov 28 02:48:40 localhost ceph-osd[31557]: osd.2 13 set_numa_affinity not setting numa affinity Nov 28 02:48:40 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-2[31553]: 2025-11-28T07:48:40.273+0000 7fd08c17b640 -1 osd.2 13 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory Nov 28 02:48:40 localhost ceph-osd[31557]: osd.2 13 _collect_metadata loop3: no unique device id for loop3: fallback method has no model nor serial Nov 28 02:48:40 localhost 
ceph-osd[32506]: log_channel(cluster) log [DBG] : purged_snaps scrub starts Nov 28 02:48:40 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : purged_snaps scrub ok Nov 28 02:48:41 localhost ceph-osd[32506]: osd.5 0 done with init, starting boot process Nov 28 02:48:41 localhost ceph-osd[32506]: osd.5 0 start_boot Nov 28 02:48:41 localhost ceph-osd[32506]: osd.5 0 maybe_override_options_for_qos osd_max_backfills set to 1 Nov 28 02:48:41 localhost ceph-osd[32506]: osd.5 0 maybe_override_options_for_qos osd_recovery_max_active set to 0 Nov 28 02:48:41 localhost ceph-osd[32506]: osd.5 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3 Nov 28 02:48:41 localhost ceph-osd[32506]: osd.5 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10 Nov 28 02:48:41 localhost ceph-osd[32506]: osd.5 0 bench count 12288000 bsize 4 KiB Nov 28 02:48:41 localhost ceph-osd[31557]: osd.2 14 state: booting -> active Nov 28 02:48:41 localhost systemd[1]: tmp-crun.E1rhRd.mount: Deactivated successfully. 
Nov 28 02:48:41 localhost podman[33228]: 2025-11-28 07:48:41.703583889 +0000 UTC m=+0.119283962 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, vcs-type=git, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.buildah.version=1.33.12, architecture=x86_64, GIT_CLEAN=True, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.expose-services=, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., version=7, io.openshift.tags=rhceph ceph) Nov 28 02:48:41 localhost podman[33228]: 2025-11-28 07:48:41.837646746 +0000 UTC m=+0.253346859 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, build-date=2025-09-24T08:57:55, ceph=True, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., version=7, GIT_BRANCH=main, name=rhceph, RELEASE=main, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7) Nov 28 02:48:43 localhost ceph-osd[31557]: osd.2 16 crush map has features 288514051259236352, adjusting msgr requires for clients Nov 28 02:48:43 localhost ceph-osd[31557]: osd.2 16 crush map has features 288514051259236352 was 288514050185503233, adjusting msgr requires for mons Nov 28 02:48:43 localhost ceph-osd[31557]: osd.2 16 crush map has features 3314933000852226048, adjusting msgr requires for osds Nov 28 02:48:43 localhost podman[33426]: Nov 28 02:48:43 localhost podman[33426]: 2025-11-28 07:48:43.888439757 +0000 UTC m=+0.100480954 container create 718f3beb942534a450e919b24a5f04bd5ecdefbb9a2c26f840fa8075b2c38c60 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_sutherland, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, ceph=True, release=553, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, RELEASE=main, name=rhceph, io.openshift.tags=rhceph ceph, version=7, distribution-scope=public, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_CLEAN=True, io.openshift.expose-services=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, 
com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements) Nov 28 02:48:43 localhost systemd[1]: Started libpod-conmon-718f3beb942534a450e919b24a5f04bd5ecdefbb9a2c26f840fa8075b2c38c60.scope. Nov 28 02:48:43 localhost podman[33426]: 2025-11-28 07:48:43.838476957 +0000 UTC m=+0.050518184 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 02:48:43 localhost systemd[1]: Started libcrun container. Nov 28 02:48:43 localhost podman[33426]: 2025-11-28 07:48:43.962768605 +0000 UTC m=+0.174809812 container init 718f3beb942534a450e919b24a5f04bd5ecdefbb9a2c26f840fa8075b2c38c60 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_sutherland, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, com.redhat.component=rhceph-container, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_CLEAN=True, io.buildah.version=1.33.12, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , name=rhceph, io.openshift.tags=rhceph ceph, architecture=x86_64, distribution-scope=public, 
build-date=2025-09-24T08:57:55) Nov 28 02:48:43 localhost agitated_sutherland[33442]: 167 167 Nov 28 02:48:43 localhost systemd[1]: libpod-718f3beb942534a450e919b24a5f04bd5ecdefbb9a2c26f840fa8075b2c38c60.scope: Deactivated successfully. Nov 28 02:48:43 localhost podman[33426]: 2025-11-28 07:48:43.990723416 +0000 UTC m=+0.202764613 container start 718f3beb942534a450e919b24a5f04bd5ecdefbb9a2c26f840fa8075b2c38c60 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_sutherland, GIT_BRANCH=main, io.openshift.expose-services=, com.redhat.component=rhceph-container, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, architecture=x86_64, release=553, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Nov 28 02:48:43 localhost podman[33426]: 2025-11-28 07:48:43.991129256 +0000 UTC m=+0.203170463 container attach 718f3beb942534a450e919b24a5f04bd5ecdefbb9a2c26f840fa8075b2c38c60 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_sutherland, vendor=Red Hat, Inc., version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , release=553, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, distribution-scope=public, name=rhceph, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 28 02:48:43 localhost podman[33426]: 2025-11-28 07:48:43.996109373 +0000 UTC m=+0.208150610 container died 718f3beb942534a450e919b24a5f04bd5ecdefbb9a2c26f840fa8075b2c38c60 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_sutherland, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, RELEASE=main, vendor=Red Hat, Inc., name=rhceph, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, 
com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, io.openshift.expose-services=, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux ) Nov 28 02:48:44 localhost podman[33447]: 2025-11-28 07:48:44.135112145 +0000 UTC m=+0.134910069 container remove 718f3beb942534a450e919b24a5f04bd5ecdefbb9a2c26f840fa8075b2c38c60 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_sutherland, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, ceph=True, name=rhceph, vcs-type=git, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, version=7, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, distribution-scope=public, description=Red Hat Ceph Storage 7) Nov 28 02:48:44 localhost systemd[1]: 
libpod-conmon-718f3beb942534a450e919b24a5f04bd5ecdefbb9a2c26f840fa8075b2c38c60.scope: Deactivated successfully. Nov 28 02:48:44 localhost podman[33468]: Nov 28 02:48:44 localhost podman[33468]: 2025-11-28 07:48:44.367667044 +0000 UTC m=+0.089801462 container create e59ab344e978e18de80bd7453fcebe05fb670bb6e6972895651ded8c4e76f0db (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_dewdney, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.12, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, version=7, name=rhceph, ceph=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, architecture=x86_64, RELEASE=main, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph) Nov 28 02:48:44 localhost systemd[1]: Started libpod-conmon-e59ab344e978e18de80bd7453fcebe05fb670bb6e6972895651ded8c4e76f0db.scope. Nov 28 02:48:44 localhost systemd[1]: Started libcrun container. 
Nov 28 02:48:44 localhost podman[33468]: 2025-11-28 07:48:44.324187059 +0000 UTC m=+0.046321477 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 02:48:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66939a59852f714884024eaba25714c37ad2789e08b1f642da2b3619778100e4/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66939a59852f714884024eaba25714c37ad2789e08b1f642da2b3619778100e4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66939a59852f714884024eaba25714c37ad2789e08b1f642da2b3619778100e4/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 28 02:48:44 localhost podman[33468]: 2025-11-28 07:48:44.466555167 +0000 UTC m=+0.188689575 container init e59ab344e978e18de80bd7453fcebe05fb670bb6e6972895651ded8c4e76f0db (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_dewdney, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, distribution-scope=public, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, com.redhat.component=rhceph-container, io.openshift.expose-services=, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , name=rhceph, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, 
com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.openshift.tags=rhceph ceph, vcs-type=git, build-date=2025-09-24T08:57:55) Nov 28 02:48:44 localhost podman[33468]: 2025-11-28 07:48:44.478934051 +0000 UTC m=+0.201068459 container start e59ab344e978e18de80bd7453fcebe05fb670bb6e6972895651ded8c4e76f0db (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_dewdney, version=7, vcs-type=git, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, distribution-scope=public, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, name=rhceph, RELEASE=main, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12) Nov 28 02:48:44 localhost podman[33468]: 2025-11-28 07:48:44.47923619 +0000 UTC m=+0.201370598 container attach e59ab344e978e18de80bd7453fcebe05fb670bb6e6972895651ded8c4e76f0db (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_dewdney, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_BRANCH=main, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, vcs-type=git, distribution-scope=public, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc.) Nov 28 02:48:44 localhost systemd[1]: tmp-crun.Dx4AKv.mount: Deactivated successfully. Nov 28 02:48:44 localhost systemd[1]: var-lib-containers-storage-overlay-4e4fef968abce8499352901c38db19c10e77328d673d6da31eb225f2f4312b58-merged.mount: Deactivated successfully. Nov 28 02:48:45 localhost ceph-osd[32506]: osd.5 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 19.941 iops: 5104.910 elapsed_sec: 0.588 Nov 28 02:48:45 localhost ceph-osd[32506]: log_channel(cluster) log [WRN] : OSD bench result of 5104.909958 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.5. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd]. 
Nov 28 02:48:45 localhost ceph-osd[32506]: osd.5 0 waiting for initial osdmap Nov 28 02:48:45 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5[32502]: 2025-11-28T07:48:45.016+0000 7f7778cb9640 -1 osd.5 0 waiting for initial osdmap Nov 28 02:48:45 localhost ceph-osd[32506]: osd.5 17 crush map has features 288514051259236352, adjusting msgr requires for clients Nov 28 02:48:45 localhost ceph-osd[32506]: osd.5 17 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons Nov 28 02:48:45 localhost ceph-osd[32506]: osd.5 17 crush map has features 3314933000852226048, adjusting msgr requires for osds Nov 28 02:48:45 localhost ceph-osd[32506]: osd.5 17 check_osdmap_features require_osd_release unknown -> reef Nov 28 02:48:45 localhost ceph-osd[32506]: osd.5 17 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory Nov 28 02:48:45 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-osd-5[32502]: 2025-11-28T07:48:45.039+0000 7f7773ace640 -1 osd.5 17 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory Nov 28 02:48:45 localhost ceph-osd[32506]: osd.5 17 set_numa_affinity not setting numa affinity Nov 28 02:48:45 localhost ceph-osd[32506]: osd.5 17 _collect_metadata loop4: no unique device id for loop4: fallback method has no model nor serial Nov 28 02:48:45 localhost ceph-osd[32506]: osd.5 18 state: booting -> active Nov 28 02:48:45 localhost eloquent_dewdney[33484]: [ Nov 28 02:48:45 localhost eloquent_dewdney[33484]: { Nov 28 02:48:45 localhost eloquent_dewdney[33484]: "available": false, Nov 28 02:48:45 localhost eloquent_dewdney[33484]: "ceph_device": false, Nov 28 02:48:45 localhost eloquent_dewdney[33484]: "device_id": "QEMU_DVD-ROM_QM00001", Nov 28 02:48:45 localhost eloquent_dewdney[33484]: "lsm_data": {}, Nov 28 02:48:45 localhost eloquent_dewdney[33484]: "lvs": [], Nov 28 02:48:45 localhost eloquent_dewdney[33484]: "path": 
"/dev/sr0", Nov 28 02:48:45 localhost eloquent_dewdney[33484]: "rejected_reasons": [ Nov 28 02:48:45 localhost eloquent_dewdney[33484]: "Has a FileSystem", Nov 28 02:48:45 localhost eloquent_dewdney[33484]: "Insufficient space (<5GB)" Nov 28 02:48:45 localhost eloquent_dewdney[33484]: ], Nov 28 02:48:45 localhost eloquent_dewdney[33484]: "sys_api": { Nov 28 02:48:45 localhost eloquent_dewdney[33484]: "actuators": null, Nov 28 02:48:45 localhost eloquent_dewdney[33484]: "device_nodes": "sr0", Nov 28 02:48:45 localhost eloquent_dewdney[33484]: "human_readable_size": "482.00 KB", Nov 28 02:48:45 localhost eloquent_dewdney[33484]: "id_bus": "ata", Nov 28 02:48:45 localhost eloquent_dewdney[33484]: "model": "QEMU DVD-ROM", Nov 28 02:48:45 localhost eloquent_dewdney[33484]: "nr_requests": "2", Nov 28 02:48:45 localhost eloquent_dewdney[33484]: "partitions": {}, Nov 28 02:48:45 localhost eloquent_dewdney[33484]: "path": "/dev/sr0", Nov 28 02:48:45 localhost eloquent_dewdney[33484]: "removable": "1", Nov 28 02:48:45 localhost eloquent_dewdney[33484]: "rev": "2.5+", Nov 28 02:48:45 localhost eloquent_dewdney[33484]: "ro": "0", Nov 28 02:48:45 localhost eloquent_dewdney[33484]: "rotational": "1", Nov 28 02:48:45 localhost eloquent_dewdney[33484]: "sas_address": "", Nov 28 02:48:45 localhost eloquent_dewdney[33484]: "sas_device_handle": "", Nov 28 02:48:45 localhost eloquent_dewdney[33484]: "scheduler_mode": "mq-deadline", Nov 28 02:48:45 localhost eloquent_dewdney[33484]: "sectors": 0, Nov 28 02:48:45 localhost eloquent_dewdney[33484]: "sectorsize": "2048", Nov 28 02:48:45 localhost eloquent_dewdney[33484]: "size": 493568.0, Nov 28 02:48:45 localhost eloquent_dewdney[33484]: "support_discard": "0", Nov 28 02:48:45 localhost eloquent_dewdney[33484]: "type": "disk", Nov 28 02:48:45 localhost eloquent_dewdney[33484]: "vendor": "QEMU" Nov 28 02:48:45 localhost eloquent_dewdney[33484]: } Nov 28 02:48:45 localhost eloquent_dewdney[33484]: } Nov 28 02:48:45 localhost 
eloquent_dewdney[33484]: ] Nov 28 02:48:45 localhost systemd[1]: libpod-e59ab344e978e18de80bd7453fcebe05fb670bb6e6972895651ded8c4e76f0db.scope: Deactivated successfully. Nov 28 02:48:45 localhost podman[33468]: 2025-11-28 07:48:45.382683626 +0000 UTC m=+1.104818014 container died e59ab344e978e18de80bd7453fcebe05fb670bb6e6972895651ded8c4e76f0db (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_dewdney, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, build-date=2025-09-24T08:57:55, ceph=True, GIT_CLEAN=True, architecture=x86_64, version=7, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12) Nov 28 02:48:45 localhost systemd[1]: var-lib-containers-storage-overlay-66939a59852f714884024eaba25714c37ad2789e08b1f642da2b3619778100e4-merged.mount: Deactivated successfully. 
Nov 28 02:48:45 localhost podman[34703]: 2025-11-28 07:48:45.520372765 +0000 UTC m=+0.125188623 container remove e59ab344e978e18de80bd7453fcebe05fb670bb6e6972895651ded8c4e76f0db (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_dewdney, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, distribution-scope=public, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, architecture=x86_64, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , GIT_BRANCH=main, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7) Nov 28 02:48:45 localhost systemd[1]: libpod-conmon-e59ab344e978e18de80bd7453fcebe05fb670bb6e6972895651ded8c4e76f0db.scope: Deactivated successfully. 
Nov 28 02:48:46 localhost ceph-osd[32506]: osd.5 pg_epoch: 18 pg[1.0( empty local-lis/les=0/0 n=0 ec=15/15 lis/c=0/0 les/c/f=0/0/0 sis=18) [1,5,3] r=1 lpr=18 pi=[15,18)/0 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 02:48:55 localhost podman[34829]: 2025-11-28 07:48:55.235066178 +0000 UTC m=+0.088917000 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, ceph=True, io.openshift.tags=rhceph ceph, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Nov 28 02:48:55 localhost podman[34829]: 2025-11-28 07:48:55.368897149 +0000 UTC m=+0.222747971 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., version=7, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_CLEAN=True, vcs-type=git, maintainer=Guillaume Abrioux , release=553, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, architecture=x86_64) Nov 28 02:49:37 localhost sshd[34911]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:49:57 localhost podman[35011]: 2025-11-28 07:49:57.203469945 +0000 UTC m=+0.096831665 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, 
Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, ceph=True, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git) Nov 28 02:49:57 localhost podman[35011]: 2025-11-28 07:49:57.310061951 +0000 UTC m=+0.203423691 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , GIT_CLEAN=True, vcs-type=git, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, release=553, version=7, CEPH_POINT_RELEASE=, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc.) Nov 28 02:50:01 localhost systemd[1]: session-13.scope: Deactivated successfully. Nov 28 02:50:01 localhost systemd[1]: session-13.scope: Consumed 20.957s CPU time. Nov 28 02:50:01 localhost systemd-logind[764]: Session 13 logged out. Waiting for processes to exit. Nov 28 02:50:01 localhost systemd-logind[764]: Removed session 13. Nov 28 02:52:05 localhost systemd[26286]: Created slice User Background Tasks Slice. Nov 28 02:52:05 localhost systemd[26286]: Starting Cleanup of User's Temporary Files and Directories... Nov 28 02:52:05 localhost systemd[26286]: Finished Cleanup of User's Temporary Files and Directories. Nov 28 02:53:40 localhost sshd[35385]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:53:40 localhost systemd-logind[764]: New session 27 of user zuul. Nov 28 02:53:40 localhost systemd[1]: Started Session 27 of User zuul. Nov 28 02:53:41 localhost python3[35433]: ansible-ansible.legacy.ping Invoked with data=pong Nov 28 02:53:42 localhost python3[35478]: ansible-setup Invoked with gather_subset=['!facter', '!ohai'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 28 02:53:42 localhost python3[35498]: ansible-user Invoked with name=tripleo-admin generate_ssh_key=False state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005538513.localdomain update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None Nov 28 02:53:43 localhost python3[35554]: ansible-ansible.legacy.stat Invoked with 
path=/etc/sudoers.d/tripleo-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 02:53:43 localhost python3[35597]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/tripleo-admin mode=288 owner=root group=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764316423.0929687-66619-262688719115229/source _original_basename=tmp9pewikg7 follow=False checksum=b3e7ecdcc699d217c6b083a91b07208207813d93 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:53:44 localhost python3[35627]: ansible-file Invoked with path=/home/tripleo-admin state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:53:44 localhost python3[35643]: ansible-file Invoked with path=/home/tripleo-admin/.ssh state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:53:44 localhost python3[35659]: ansible-file Invoked with path=/home/tripleo-admin/.ssh/authorized_keys state=touch owner=tripleo-admin group=tripleo-admin mode=384 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None 
setype=None attributes=None Nov 28 02:53:45 localhost python3[35675]: ansible-lineinfile Invoked with path=/home/tripleo-admin/.ssh/authorized_keys line=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCsnBivukZgTjr1SoC29hE3ofwUMxTaKeXh9gXvDwMJASbvK4q9943cbJ2j47GUf8sEgY38kkU/dxSMQWULl4d2oquIgZpJbJuXMU1WNxwGNSrS74OecQ3Or4VxTiDmu/HV83nIWHqfpDCra4DlrIBPPNwhBK4u0QYy87AJaML6NGEDaubbHgVCg1UpW1ho/sDoXptAehoCEaaeRz5tPHiXRnHpIXu44Sp8fRcyU9rBqdv+/lgachTcMYadsD2WBHIL+pptEDHB5TvQTDpnU58YdGFarn8uuGPP4t8H6xcqXbaJS9/oZa5Fb5Mh3vORBbR65jvlGg4PYGzCuI/xllY5+lGK7eyOleFyRqWKa2uAIaGoRBT4ZLKAssOFwCIaGfOAFFOBMkuylg4+MtbYiMJYRARPSRAufAROqhUDOo73y5lBrXh07aiWuSn8fU4mclWu+Xw382ryxW+XeHPc12d7S46TvGJaRvzsLtlyerRxGI77xOHRexq1Z/SFjOWLOwc= zuul-build-sshkey#012 regexp=Generated by TripleO state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:53:46 localhost python3[35689]: ansible-ping Invoked with data=pong Nov 28 02:53:57 localhost sshd[35690]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:53:57 localhost systemd-logind[764]: New session 28 of user tripleo-admin. Nov 28 02:53:57 localhost systemd[1]: Created slice User Slice of UID 1003. Nov 28 02:53:57 localhost systemd[1]: Starting User Runtime Directory /run/user/1003... Nov 28 02:53:57 localhost systemd[1]: Finished User Runtime Directory /run/user/1003. Nov 28 02:53:57 localhost systemd[1]: Starting User Manager for UID 1003... Nov 28 02:53:57 localhost systemd[35694]: Queued start job for default target Main User Target. Nov 28 02:53:57 localhost systemd[35694]: Created slice User Application Slice. Nov 28 02:53:57 localhost systemd[35694]: Started Mark boot as successful after the user session has run 2 minutes. Nov 28 02:53:57 localhost systemd[35694]: Started Daily Cleanup of User's Temporary Directories. 
Nov 28 02:53:57 localhost systemd[35694]: Reached target Paths. Nov 28 02:53:57 localhost systemd[35694]: Reached target Timers. Nov 28 02:53:57 localhost systemd[35694]: Starting D-Bus User Message Bus Socket... Nov 28 02:53:57 localhost systemd[35694]: Starting Create User's Volatile Files and Directories... Nov 28 02:53:57 localhost systemd[35694]: Finished Create User's Volatile Files and Directories. Nov 28 02:53:57 localhost systemd[35694]: Listening on D-Bus User Message Bus Socket. Nov 28 02:53:57 localhost systemd[35694]: Reached target Sockets. Nov 28 02:53:57 localhost systemd[35694]: Reached target Basic System. Nov 28 02:53:57 localhost systemd[35694]: Reached target Main User Target. Nov 28 02:53:57 localhost systemd[35694]: Startup finished in 124ms. Nov 28 02:53:57 localhost systemd[1]: Started User Manager for UID 1003. Nov 28 02:53:57 localhost systemd[1]: Started Session 28 of User tripleo-admin. Nov 28 02:53:58 localhost python3[35755]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d Nov 28 02:54:03 localhost python3[35805]: ansible-selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config Nov 28 02:54:04 localhost python3[35852]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. 
path=None Nov 28 02:54:05 localhost python3[35915]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.tykmt0pztmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:54:05 localhost python3[35945]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.tykmt0pztmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:54:06 localhost python3[35961]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.tykmt0pztmphosts insertbefore=BOF block=172.17.0.106 np0005538513.localdomain np0005538513#012172.18.0.106 np0005538513.storage.localdomain np0005538513.storage#012172.20.0.106 np0005538513.storagemgmt.localdomain np0005538513.storagemgmt#012172.17.0.106 np0005538513.internalapi.localdomain np0005538513.internalapi#012172.19.0.106 np0005538513.tenant.localdomain np0005538513.tenant#012192.168.122.106 np0005538513.ctlplane.localdomain np0005538513.ctlplane#012172.17.0.107 np0005538514.localdomain np0005538514#012172.18.0.107 np0005538514.storage.localdomain np0005538514.storage#012172.20.0.107 np0005538514.storagemgmt.localdomain np0005538514.storagemgmt#012172.17.0.107 np0005538514.internalapi.localdomain np0005538514.internalapi#012172.19.0.107 np0005538514.tenant.localdomain np0005538514.tenant#012192.168.122.107 np0005538514.ctlplane.localdomain np0005538514.ctlplane#012172.17.0.108 np0005538515.localdomain np0005538515#012172.18.0.108 np0005538515.storage.localdomain 
np0005538515.storage#012172.20.0.108 np0005538515.storagemgmt.localdomain np0005538515.storagemgmt#012172.17.0.108 np0005538515.internalapi.localdomain np0005538515.internalapi#012172.19.0.108 np0005538515.tenant.localdomain np0005538515.tenant#012192.168.122.108 np0005538515.ctlplane.localdomain np0005538515.ctlplane#012172.17.0.103 np0005538510.localdomain np0005538510#012172.18.0.103 np0005538510.storage.localdomain np0005538510.storage#012172.20.0.103 np0005538510.storagemgmt.localdomain np0005538510.storagemgmt#012172.17.0.103 np0005538510.internalapi.localdomain np0005538510.internalapi#012172.19.0.103 np0005538510.tenant.localdomain np0005538510.tenant#012192.168.122.103 np0005538510.ctlplane.localdomain np0005538510.ctlplane#012172.17.0.104 np0005538511.localdomain np0005538511#012172.18.0.104 np0005538511.storage.localdomain np0005538511.storage#012172.20.0.104 np0005538511.storagemgmt.localdomain np0005538511.storagemgmt#012172.17.0.104 np0005538511.internalapi.localdomain np0005538511.internalapi#012172.19.0.104 np0005538511.tenant.localdomain np0005538511.tenant#012192.168.122.104 np0005538511.ctlplane.localdomain np0005538511.ctlplane#012172.17.0.105 np0005538512.localdomain np0005538512#012172.18.0.105 np0005538512.storage.localdomain np0005538512.storage#012172.20.0.105 np0005538512.storagemgmt.localdomain np0005538512.storagemgmt#012172.17.0.105 np0005538512.internalapi.localdomain np0005538512.internalapi#012172.19.0.105 np0005538512.tenant.localdomain np0005538512.tenant#012192.168.122.105 np0005538512.ctlplane.localdomain np0005538512.ctlplane#012#012192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane#012192.168.122.99 overcloud.ctlplane.localdomain#012172.18.0.197 overcloud.storage.localdomain#012172.20.0.177 overcloud.storagemgmt.localdomain#012172.17.0.128 overcloud.internalapi.localdomain#012172.21.0.169 overcloud.localdomain#012 marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud 
marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:54:07 localhost python3[35977]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.tykmt0pztmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 02:54:07 localhost python3[35994]: ansible-file Invoked with path=/tmp/ansible.tykmt0pztmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:54:08 localhost python3[36010]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides rhosp-release _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 02:54:09 localhost python3[36027]: ansible-ansible.legacy.dnf Invoked with name=['rhosp-release'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Nov 28 02:54:13 localhost python3[36046]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides driverctl lvm2 jq nftables openvswitch openstack-heat-agents 
openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 02:54:14 localhost python3[36063]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'jq', 'nftables', 'openvswitch', 'openstack-heat-agents', 'openstack-selinux', 'os-net-config', 'python3-libselinux', 'python3-pyyaml', 'puppet-tripleo', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Nov 28 02:55:23 localhost kernel: SELinux: Converting 2700 SID table entries... Nov 28 02:55:23 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 28 02:55:23 localhost kernel: SELinux: policy capability open_perms=1 Nov 28 02:55:23 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 28 02:55:23 localhost kernel: SELinux: policy capability always_check_network=0 Nov 28 02:55:23 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 28 02:55:23 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 28 02:55:23 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 28 02:55:24 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=6 res=1 Nov 28 02:55:24 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Nov 28 02:55:24 localhost systemd[1]: Starting man-db-cache-update.service... 
Nov 28 02:55:24 localhost systemd[1]: Reloading. Nov 28 02:55:24 localhost systemd-rc-local-generator[36968]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 02:55:24 localhost systemd-sysv-generator[36973]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 02:55:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 02:55:24 localhost systemd[1]: Queuing reload/restart jobs for marked units… Nov 28 02:55:24 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Nov 28 02:55:24 localhost systemd[1]: Finished man-db-cache-update.service. Nov 28 02:55:24 localhost systemd[1]: run-r045bd28e4bc94a408f5663ad62b4909b.service: Deactivated successfully. Nov 28 02:55:27 localhost python3[37410]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 02:55:29 localhost python3[37549]: ansible-ansible.legacy.systemd Invoked with name=openvswitch enabled=True state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 02:55:29 localhost systemd[1]: Reloading. Nov 28 02:55:29 localhost systemd-sysv-generator[37578]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 28 02:55:29 localhost systemd-rc-local-generator[37575]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 02:55:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 02:55:31 localhost python3[37603]: ansible-file Invoked with path=/var/lib/heat-config/tripleo-config-download state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:55:31 localhost python3[37619]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides openstack-network-scripts _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 02:55:32 localhost python3[37636]: ansible-systemd Invoked with name=NetworkManager enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None Nov 28 02:55:33 localhost python3[37654]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=dns value=none backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:55:33 localhost python3[37672]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=rc-manager value=unmanaged backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None 
attributes=None Nov 28 02:55:34 localhost python3[37690]: ansible-ansible.legacy.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 28 02:55:34 localhost systemd[1]: Reloading Network Manager... Nov 28 02:55:34 localhost NetworkManager[5967]: [1764316534.3109] audit: op="reload" arg="0" pid=37693 uid=0 result="success" Nov 28 02:55:34 localhost NetworkManager[5967]: [1764316534.3117] config: signal: SIGHUP,config-files,values,values-user,no-auto-default,dns-mode,rc-manager (/etc/NetworkManager/NetworkManager.conf (lib: 00-server.conf) (run: 15-carrier-timeout.conf)) Nov 28 02:55:34 localhost NetworkManager[5967]: [1764316534.3117] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged Nov 28 02:55:34 localhost systemd[1]: Reloaded Network Manager. Nov 28 02:55:35 localhost python3[37709]: ansible-ansible.legacy.command Invoked with _raw_params=ln -f -s /usr/share/openstack-puppet/modules/* /etc/puppet/modules/ _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 02:55:36 localhost python3[37726]: ansible-stat Invoked with path=/usr/bin/ansible-playbook follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 02:55:36 localhost python3[37744]: ansible-stat Invoked with path=/usr/bin/ansible-playbook-3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 02:55:37 localhost python3[37760]: ansible-file Invoked with state=link src=/usr/bin/ansible-playbook path=/usr/bin/ansible-playbook-3 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None 
selevel=None setype=None attributes=None Nov 28 02:55:37 localhost python3[37776]: ansible-tempfile Invoked with state=file prefix=ansible. suffix= path=None Nov 28 02:55:38 localhost python3[37792]: ansible-stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 02:55:39 localhost python3[37808]: ansible-blockinfile Invoked with path=/tmp/ansible.ah5juyos block=[192.168.122.106]*,[np0005538513.ctlplane.localdomain]*,[172.17.0.106]*,[np0005538513.internalapi.localdomain]*,[172.18.0.106]*,[np0005538513.storage.localdomain]*,[172.20.0.106]*,[np0005538513.storagemgmt.localdomain]*,[172.19.0.106]*,[np0005538513.tenant.localdomain]*,[np0005538513.localdomain]*,[np0005538513]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCToHi/c1OL/UxMWy2v/t0tcvSlMeoKa6EPBYbcu51p2Gn2UxEPgCRLM9+84Smh2pxAR4Y/5LVm2lbZ9Gf4okHGg5GLIyqzxxqbQHyR+YRljujVEOvksUPuKCptzx9fQj2Ij2t9GPGHc5klgGPIKjx0pza8T37vdz+G9y7zuK5wWI66AeN8y/6dD2hvi1Lp94VRSvTTEo+nUOFSIgsOwqQO+ZSwTgjG1pmtESBe8nkhW0I0BQPX46v9f1PN1LXDg8cN2FSVjQ91RI0uCvTaBYJ3soFBFspgiJ113zapbQCaNwg7lK7ofS0QT5WONP3QIsDAq1gSpWuOdS2DRY4NU3WMd4m5tLbj+ubiWr39rNU/zQiEl8r38aiM0OwOfuQ9S8wxO7phpVCQrbOkYCLLijdy/xTODvP+jYohTMWX8Gh6IVeVtm6SB2Tw3lDBCjpqlclCSs905Xe+mTJ6WYTaz+Q1xgflKEeemzJ0+rt+QZbrmL7u5MUdf/l/yOLAgACNsws=#012[192.168.122.107]*,[np0005538514.ctlplane.localdomain]*,[172.17.0.107]*,[np0005538514.internalapi.localdomain]*,[172.18.0.107]*,[np0005538514.storage.localdomain]*,[172.20.0.107]*,[np0005538514.storagemgmt.localdomain]*,[172.19.0.107]*,[np0005538514.tenant.localdomain]*,[np0005538514.localdomain]*,[np0005538514]* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQDLIqwhlOevSQuHXF0nrkLOzRoSQqnWWb/cXzK4um93clqGujVOE9PUyL6ONBo/qlr4Pp+QMzSsIFwjW1T/6G+Ce2CS/TGphIUxvvB9NhBt+OJl/zDUEmjAU6bwVIx6ApqtimsXWWIap9GEtVWA5P9pcqPMyGzq1mCzwCS252Ylioij0zZxfMrxTt3RSsWrDED61vRes0ZKd8HERTLN+Lzis5t8f74zfwTesOea6CRkIHth4cUP7ua3q+KhhbhIPj+fXWN5w+qVbcTMJSYyUPsZ2ymPhR4x4db1oPk1Jg14dw1BnmAZZl3v8o4l7bUQ2Fj/PE1JbSiApxbK+V0KdZGMrG4iVbnMmzwBXPXHa6lNQGneflVd3MNEepnTnXx4hAVpaJHc8EtIREq8aPe07DW0wL9clpTKaSGU2Ma+BLXmSDPkuPh6JWLxn3iM1yybL574NnGt2MgBj6z2tiSb4NkNmaBkoG8PMHw8YUSabKBBZNiMEO2GKBpHZldSrYvOZHU=#012[192.168.122.108]*,[np0005538515.ctlplane.localdomain]*,[172.17.0.108]*,[np0005538515.internalapi.localdomain]*,[172.18.0.108]*,[np0005538515.storage.localdomain]*,[172.20.0.108]*,[np0005538515.storagemgmt.localdomain]*,[172.19.0.108]*,[np0005538515.tenant.localdomain]*,[np0005538515.localdomain]*,[np0005538515]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDbwc/gBZF5hmsFU6BSK1/DT5hduj2+3ukzoCGLU6mgpBv7BiInN7vVOqXilL+QUAWOvfKTekaQe1Vv/2jpygQnlu6MEMopmac/36IfVjgt39zxCULfSWv3Gp8tLP0ATF2LfhHBWFrGX7G3Bg3AiNfIUnQIQadBaKIByl+FfA7nJ7phwBAwJaQxvByGDeMwC2CWIUPgVqKclcw1WmldPnNmwquLlCbAeMV2hHlBfnVk8BI6fsOUcBB6a05zRpJpbrl584F+qkiQX0RpZYJQdZCoLiJStJv39lYhgiAWChUOVJsCbeNQnC9/Xgs5JhmRESgXh7Tm+8UNW9DxSHN7BS5qKYPUULdjobSp2v9pFOx30MLMsNd5r3JE07pgm5PpjuviSGEvJ8DIAPTF3kUXM43wax1q9rGV4ZfoJiLAwS9CmWWDWZDg17cnC5z+3qi+K8HUKz8LxQCHI+yEtTFzUEYyXTQfQbNvkauEHI/PwFA1iC+4/2g/0UhtjkM+FO2Czwk=#012[192.168.122.103]*,[np0005538510.ctlplane.localdomain]*,[172.17.0.103]*,[np0005538510.internalapi.localdomain]*,[172.18.0.103]*,[np0005538510.storage.localdomain]*,[172.20.0.103]*,[np0005538510.storagemgmt.localdomain]*,[172.19.0.103]*,[np0005538510.tenant.localdomain]*,[np0005538510.localdomain]*,[np0005538510]* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQDAxqgPHnyGChl6yd1/HRo8ox+w8llSVhIj8iYUdDG7IquyLr4/CZguZzRkngbXi/Dq544iKS4kFL/zPKi+yuxeFs4b6fgo4vGoV8wwKNSJXx0d0hOQa9651VqB6k/trENRTgLa2fHkXgF+/g0f7HvloQfhr7qjhTBRV4l4UfJiOEpMvMxN6map/D0JuHlAZGZ5mGUoBTEMuPGEPvMWqe0kc/I8WIgsMsvijOGM2xDxsOqAYlV9a8faoyMdacWUNkeQTfPF6h+z8xdvP8qWPtrPKWHMpcGicTI6pFZ2JxOjWnMaBXs2j/CN7HFLbyOCwuAvAu9efAbxJvgtZlO++6kSlq7SHMzwv7PLP69GaQJHR+jANJ/O2BchbxL09mIkpFSzLSS0k7xXJlwqnAMciIlTaud2n5Hqnnb06WgtvD6O0nnuCLH5am7F1YDGJJgUmNbbgF1PuwzOZqQy+tA2igji/n2z87KkGZdIbrHdPU1PPIlzVGPO6aO02RhvtD+/iQM=#012[192.168.122.104]*,[np0005538511.ctlplane.localdomain]*,[172.17.0.104]*,[np0005538511.internalapi.localdomain]*,[172.18.0.104]*,[np0005538511.storage.localdomain]*,[172.20.0.104]*,[np0005538511.storagemgmt.localdomain]*,[172.19.0.104]*,[np0005538511.tenant.localdomain]*,[np0005538511.localdomain]*,[np0005538511]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDqtmgm0KAOOIJ7a8whlZPfasnwJpfcm6zVmQjiKHZZrcojE/a6oALfufKXbfWWiLjJ2VzyK9v7QPNXhIWxgAKT9J40A1lSpSAmmxMaWvy+hzzvePs0Z4Fc4bFX7V4zBGI+dAJ+eAu73z6OKNuMhxBrL46ejpRFbqjwBP3veWRiLOMbyPn+Wc+amop0p1eEzV2QHMAIC5Dwm6/tYNLixNSa/Ea0ciaY3jWii+IGhYy+wqQP+9qkoVf9bZ4Ewa+7UfXI/q4zvvic/Znb8ZpCpezLnH4ilBORLyV9r/wkkkVGY7UVgUdSoLVjzTGQAtHl2ZgA3zJ2F2ES9QcBEvrHygT4vGgtEaxQn8XFhBwhzCpPaLyXti/6d+8M36cJx+7gv1eEfDgLz3MNR+tcnFSew9N6dIN4afV0DvA/9FsWk8PTqddN4iHcZzRo0GiDJWNtB+gYVZOytTYMZm2Cyv59IthEzxaB+wTZoSdCeuEeTM0ohYspOKirIPqMPuCbGbtrJFE=#012[192.168.122.105]*,[np0005538512.ctlplane.localdomain]*,[172.17.0.105]*,[np0005538512.internalapi.localdomain]*,[172.18.0.105]*,[np0005538512.storage.localdomain]*,[172.20.0.105]*,[np0005538512.storagemgmt.localdomain]*,[172.19.0.105]*,[np0005538512.tenant.localdomain]*,[np0005538512.localdomain]*,[np0005538512]* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQCy9/gxqH+eMqafXwUuPf+1Clpw4qsugdFefisnCDhJ5U7Pc+eWMUQVMS0ErxabBJhneDOyPwXwIbv72cEAtmgfvHDlSuS3mt8LRzKqsv1dXTy4Zqb3JGVzrvxo0iczGRsn2MIDJUv/Zjq9YqVeCnDj2HOwV+qx+EFecEFXS797FxsnMmTw0A5z8yUtBuJEGAKQX96LpZc4k5ltq+Uy0rK85Kk7cGR4A+wrIChLC8wggxvA99NdPEBtne6Chb+3PcbYUcTGhGtV6FGzpgbWmuWT/gcANb+fJE5/4n87loLmBMsmvGhvQuN9kuJ20g6nwPJbPTpIbV6XALx4tbma68bL3RL+lcGlh3jf0pEXPfolrB/MRmJn5ggMLjRv50FrowQalnCEgWE0gtd9IGjmqFz3jP008bGotn9rcacbjC2AvE+5NEjp7TzXGnFcD6jW8+9AWiusCww4ULs/oWbi0GLkmhwU5EifitDYF2+r1CigAdlEjb6sa0wAQSmclWk6guM=#012 create=True state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:55:39 localhost python3[37824]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.ah5juyos' > /etc/ssh/ssh_known_hosts _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 02:55:39 localhost python3[37842]: ansible-file Invoked with path=/tmp/ansible.ah5juyos state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:55:40 localhost python3[37858]: ansible-file Invoked with path=/var/log/journal state=directory mode=0750 owner=root group=root setype=var_log_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 02:55:41 localhost 
python3[37874]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active cloud-init.service || systemctl is-enabled cloud-init.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 02:55:41 localhost python3[37892]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline | grep -q cloud-init=disabled _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 02:55:41 localhost python3[37911]: ansible-community.general.cloud_init_data_facts Invoked with filter=status Nov 28 02:55:44 localhost python3[38048]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 02:55:44 localhost python3[38065]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Nov 28 02:55:48 localhost dbus-broker-launch[750]: Noticed file-system modification, trigger reload. Nov 28 02:55:48 localhost dbus-broker-launch[750]: Noticed file-system modification, trigger reload. Nov 28 02:55:48 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Nov 28 02:55:48 localhost systemd[1]: Starting man-db-cache-update.service... 
Nov 28 02:55:48 localhost systemd[1]: Reloading. Nov 28 02:55:48 localhost systemd-rc-local-generator[38134]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 02:55:48 localhost systemd-sysv-generator[38139]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 02:55:48 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 02:55:48 localhost systemd[1]: Queuing reload/restart jobs for marked units… Nov 28 02:55:48 localhost systemd[1]: Stopping Dynamic System Tuning Daemon... Nov 28 02:55:48 localhost systemd[1]: tuned.service: Deactivated successfully. Nov 28 02:55:48 localhost systemd[1]: Stopped Dynamic System Tuning Daemon. Nov 28 02:55:48 localhost systemd[1]: tuned.service: Consumed 1.774s CPU time. Nov 28 02:55:48 localhost systemd[1]: Starting Dynamic System Tuning Daemon... Nov 28 02:55:49 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Nov 28 02:55:49 localhost systemd[1]: Finished man-db-cache-update.service. Nov 28 02:55:49 localhost systemd[1]: run-r4394cd071c81425da221e64f64dcefd1.service: Deactivated successfully. Nov 28 02:55:50 localhost systemd[1]: Started Dynamic System Tuning Daemon. Nov 28 02:55:50 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Nov 28 02:55:50 localhost systemd[1]: Starting man-db-cache-update.service... Nov 28 02:55:50 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Nov 28 02:55:50 localhost systemd[1]: Finished man-db-cache-update.service. Nov 28 02:55:50 localhost systemd[1]: run-rcb4f556933214083b12f71af96247f7e.service: Deactivated successfully. 
Nov 28 02:55:51 localhost python3[38501]: ansible-systemd Invoked with name=tuned state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 02:55:51 localhost systemd[1]: Stopping Dynamic System Tuning Daemon... Nov 28 02:55:51 localhost systemd[1]: tuned.service: Deactivated successfully. Nov 28 02:55:51 localhost systemd[1]: Stopped Dynamic System Tuning Daemon. Nov 28 02:55:51 localhost systemd[1]: Starting Dynamic System Tuning Daemon... Nov 28 02:55:53 localhost systemd[1]: Started Dynamic System Tuning Daemon. Nov 28 02:55:53 localhost python3[38696]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 02:55:54 localhost python3[38713]: ansible-slurp Invoked with src=/etc/tuned/active_profile Nov 28 02:55:54 localhost python3[38729]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 02:55:55 localhost python3[38745]: ansible-ansible.legacy.command Invoked with _raw_params=tuned-adm profile throughput-performance _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 02:55:56 localhost python3[38765]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 02:55:57 localhost python3[38782]: ansible-stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 02:55:59 localhost python3[38798]: ansible-replace 
Invoked with regexp=TRIPLEO_HEAT_TEMPLATE_KERNEL_ARGS dest=/etc/default/grub replace= path=/etc/default/grub backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:56:05 localhost python3[38814]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:56:05 localhost python3[38862]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 02:56:05 localhost systemd[35694]: Starting Mark boot as successful... Nov 28 02:56:05 localhost systemd[35694]: Finished Mark boot as successful. 
Nov 28 02:56:06 localhost python3[38908]: ansible-ansible.legacy.copy Invoked with mode=384 dest=/etc/puppet/hiera.yaml src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316565.340449-71264-134970873186692/source _original_basename=tmpl19i9un2 follow=False checksum=aaf3699defba931d532f4955ae152f505046749a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:56:06 localhost python3[38951]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:56:07 localhost python3[39047]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 02:56:07 localhost python3[39090]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316566.9805455-71361-33364981101521/source dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json follow=False checksum=f62dcfb681d1b393d0933e3027f5bdff5685b671 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:56:08 localhost python3[39167]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True 
get_attributes=True Nov 28 02:56:08 localhost python3[39210]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316567.9143033-71496-85961111386279/source dest=/etc/puppet/hieradata/bootstrap_node.json mode=None follow=False _original_basename=bootstrap_node.j2 checksum=526fa277b7a2f2320a39d589994ce8c8af83f91d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:56:09 localhost python3[39272]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 02:56:09 localhost python3[39315]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316568.853994-71496-238988601288442/source dest=/etc/puppet/hieradata/vip_data.json mode=None follow=False _original_basename=vip_data.j2 checksum=a223df0bad6272fbaedbfa3b3952717db2fe2201 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:56:10 localhost python3[39377]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 02:56:10 localhost python3[39420]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316569.7963903-71496-114579238824483/source dest=/etc/puppet/hieradata/net_ip_map.json mode=None follow=False _original_basename=net_ip_map.j2 checksum=68b5a56a66cb10764ef3288009ad5e9b7e8faf12 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER 
validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:56:11 localhost python3[39482]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 02:56:11 localhost python3[39525]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316570.8300068-71496-25857694174394/source dest=/etc/puppet/hieradata/cloud_domain.json mode=None follow=False _original_basename=cloud_domain.j2 checksum=5dd835a63e6a03d74797c2e2eadf4bea1cecd9d9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:56:12 localhost python3[39587]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 02:56:12 localhost python3[39630]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316571.6708229-71496-225660578965828/source dest=/etc/puppet/hieradata/fqdn.json mode=None follow=False _original_basename=fqdn.j2 checksum=8507472d542de0e0675ce4c861ee207d860b9ae3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:56:12 localhost python3[39692]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 02:56:13 localhost python3[39735]: 
ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316572.5381417-71496-181357957177740/source dest=/etc/puppet/hieradata/service_names.json mode=None follow=False _original_basename=service_names.j2 checksum=ff586b96402d8ae133745cf06f17e772b2f22d52 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:56:13 localhost python3[39797]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 02:56:14 localhost python3[39840]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316573.3885307-71496-39635942012134/source dest=/etc/puppet/hieradata/service_configs.json mode=None follow=False _original_basename=service_configs.j2 checksum=8f5fcf4d1773fc71cd0863786080c50634c31bf2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:56:14 localhost python3[39902]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 02:56:15 localhost python3[39945]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316574.3161898-71496-196645832956672/source dest=/etc/puppet/hieradata/extraconfig.json mode=None follow=False _original_basename=extraconfig.j2 checksum=5f36b2ea290645ee34d943220a14b54ee5ea5be5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None 
remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:56:15 localhost python3[40007]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 02:56:15 localhost python3[40050]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316575.2061021-71496-248193349042453/source dest=/etc/puppet/hieradata/role_extraconfig.json mode=None follow=False _original_basename=role_extraconfig.j2 checksum=34875968bf996542162e620523f9dcfb3deac331 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:56:16 localhost python3[40112]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 02:56:16 localhost python3[40155]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316576.077748-71496-217358525951479/source dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json mode=None follow=False _original_basename=ovn_chassis_mac_map.j2 checksum=cb8f167f8f40b87df4e2f7549c43619389cc84d7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:56:17 localhost python3[40185]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 02:56:18 localhost 
python3[40233]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 02:56:18 localhost python3[40276]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/ansible_managed.json owner=root group=root mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316577.8640547-72162-160011934086397/source _original_basename=tmpm632thdr follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:56:23 localhost python3[40306]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_default_ipv4'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 28 02:56:23 localhost python3[40367]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 38.102.83.1 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 02:56:28 localhost python3[40384]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 192.168.122.10 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 02:56:33 localhost python3[40401]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 192.168.122.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 02:56:34 localhost python3[40424]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro 
get 172.18.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 02:56:34 localhost python3[40447]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.20.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 02:56:35 localhost python3[40470]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.17.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 02:56:35 localhost python3[40493]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.19.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 02:57:17 localhost python3[40592]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:57:17 localhost python3[40640]: ansible-ansible.legacy.stat Invoked with 
path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:57:17 localhost python3[40658]: ansible-ansible.legacy.file Invoked with mode=384 dest=/etc/puppet/hiera.yaml _original_basename=tmp18fz95lr recurse=False state=file path=/etc/puppet/hiera.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:57:18 localhost python3[40688]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:57:19 localhost python3[40736]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:57:19 localhost python3[40754]: ansible-ansible.legacy.file Invoked with dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json recurse=False state=file path=/etc/puppet/hieradata/all_nodes.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:57:20 localhost python3[40816]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:57:20 localhost python3[40834]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/bootstrap_node.json _original_basename=bootstrap_node.j2 recurse=False state=file path=/etc/puppet/hieradata/bootstrap_node.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:57:20 localhost python3[40896]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:57:21 localhost python3[40914]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/vip_data.json _original_basename=vip_data.j2 recurse=False state=file path=/etc/puppet/hieradata/vip_data.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:57:21 localhost python3[40976]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:57:22 localhost python3[40994]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/net_ip_map.json _original_basename=net_ip_map.j2 recurse=False state=file path=/etc/puppet/hieradata/net_ip_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:57:22 localhost python3[41056]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:57:22 localhost python3[41074]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/cloud_domain.json _original_basename=cloud_domain.j2 recurse=False state=file path=/etc/puppet/hieradata/cloud_domain.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:57:23 localhost python3[41136]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:57:23 localhost python3[41154]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/fqdn.json _original_basename=fqdn.j2 recurse=False state=file path=/etc/puppet/hieradata/fqdn.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:57:24 localhost python3[41216]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:57:24 localhost python3[41234]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_names.json _original_basename=service_names.j2 recurse=False state=file path=/etc/puppet/hieradata/service_names.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:57:24 localhost python3[41296]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:57:24 localhost python3[41314]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_configs.json _original_basename=service_configs.j2 recurse=False state=file path=/etc/puppet/hieradata/service_configs.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:57:25 localhost python3[41376]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:57:25 localhost python3[41394]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/extraconfig.json _original_basename=extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:57:26 localhost python3[41456]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:57:26 localhost python3[41474]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/role_extraconfig.json _original_basename=role_extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/role_extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:57:26 localhost python3[41536]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:57:27 localhost python3[41554]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json _original_basename=ovn_chassis_mac_map.j2 recurse=False state=file path=/etc/puppet/hieradata/ovn_chassis_mac_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:57:27 localhost python3[41584]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 02:57:28 localhost python3[41632]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:57:28 localhost python3[41650]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=0644 dest=/etc/puppet/hieradata/ansible_managed.json _original_basename=tmpwgostixu recurse=False state=file path=/etc/puppet/hieradata/ansible_managed.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:57:31 localhost python3[41680]: ansible-dnf Invoked with name=['firewalld'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 02:57:35 localhost python3[41697]: ansible-ansible.builtin.systemd Invoked with name=iptables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 02:57:36 localhost python3[41715]: ansible-ansible.builtin.systemd Invoked with name=ip6tables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 02:57:36 localhost python3[41733]: ansible-ansible.builtin.systemd Invoked with name=nftables state=started enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 02:57:36 localhost systemd[1]: Reloading.
Nov 28 02:57:36 localhost systemd-rc-local-generator[41760]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 02:57:36 localhost systemd-sysv-generator[41765]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 02:57:36 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 02:57:37 localhost systemd[1]: Starting Netfilter Tables...
Nov 28 02:57:37 localhost systemd[1]: Finished Netfilter Tables.
Nov 28 02:57:37 localhost python3[41823]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:57:38 localhost python3[41866]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316657.5355775-74983-248966599474069/source _original_basename=iptables.nft follow=False checksum=ede9860c99075946a7bc827210247aac639bc84a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:57:38 localhost python3[41896]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 02:57:39 localhost python3[41914]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 02:57:39 localhost python3[41963]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:57:40 localhost python3[42006]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316659.4205143-75125-198920321598253/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:57:40 localhost python3[42068]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-update-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:57:41 localhost python3[42111]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-update-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316660.3991268-75240-144059158843353/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:57:41 localhost python3[42173]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-flushes.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:57:42 localhost python3[42216]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-flushes.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316661.4136913-75310-166567593647304/source mode=None follow=False _original_basename=flush-chain.j2 checksum=e8e7b8db0d61a7fe393441cc91613f470eb34a6e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:57:42 localhost python3[42278]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-chains.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:57:43 localhost python3[42321]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-chains.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316662.3431256-75370-81703153813183/source mode=None follow=False _original_basename=chains.j2 checksum=e60ee651f5014e83924f4e901ecc8e25b1906610 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:57:44 localhost python3[42383]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-rules.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:57:44 localhost python3[42426]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-rules.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316663.2980568-75431-50936521324994/source mode=None follow=False _original_basename=ruleset.j2 checksum=0444e4206083f91e2fb2aabfa2928244c2db35ed backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:57:44 localhost python3[42456]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-chains.nft /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft /etc/nftables/tripleo-jumps.nft | nft -c -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 02:57:45 localhost python3[42521]: ansible-ansible.builtin.blockinfile Invoked with path=/etc/sysconfig/nftables.conf backup=False validate=nft -c -f %s block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/tripleo-chains.nft"#012include "/etc/nftables/tripleo-rules.nft"#012include "/etc/nftables/tripleo-jumps.nft"#012 state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:57:45 localhost python3[42538]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/tripleo-chains.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 02:57:46 localhost python3[42555]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft | nft -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 02:57:46 localhost python3[42574]: ansible-file Invoked with mode=0750 path=/var/log/containers/collectd setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 02:57:46 localhost python3[42590]: ansible-file Invoked with mode=0755 path=/var/lib/container-user-scripts/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 02:57:47 localhost python3[42606]: ansible-file Invoked with mode=0750 path=/var/log/containers/ceilometer setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 02:57:47 localhost python3[42622]: ansible-seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 28 02:57:49 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=7 res=1
Nov 28 02:57:49 localhost python3[42642]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Nov 28 02:57:50 localhost kernel: SELinux: Converting 2704 SID table entries...
Nov 28 02:57:50 localhost kernel: SELinux: policy capability network_peer_controls=1
Nov 28 02:57:50 localhost kernel: SELinux: policy capability open_perms=1
Nov 28 02:57:50 localhost kernel: SELinux: policy capability extended_socket_class=1
Nov 28 02:57:50 localhost kernel: SELinux: policy capability always_check_network=0
Nov 28 02:57:50 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Nov 28 02:57:50 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Nov 28 02:57:50 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Nov 28 02:57:50 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=8 res=1
Nov 28 02:57:50 localhost python3[42663]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/target(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Nov 28 02:57:51 localhost kernel: SELinux: Converting 2704 SID table entries...
Nov 28 02:57:51 localhost kernel: SELinux: policy capability network_peer_controls=1
Nov 28 02:57:51 localhost kernel: SELinux: policy capability open_perms=1
Nov 28 02:57:51 localhost kernel: SELinux: policy capability extended_socket_class=1
Nov 28 02:57:51 localhost kernel: SELinux: policy capability always_check_network=0
Nov 28 02:57:51 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Nov 28 02:57:51 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Nov 28 02:57:51 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Nov 28 02:57:51 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=9 res=1
Nov 28 02:57:52 localhost python3[42684]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/var/lib/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Nov 28 02:57:52 localhost kernel: SELinux: Converting 2704 SID table entries...
Nov 28 02:57:52 localhost kernel: SELinux: policy capability network_peer_controls=1
Nov 28 02:57:52 localhost kernel: SELinux: policy capability open_perms=1
Nov 28 02:57:52 localhost kernel: SELinux: policy capability extended_socket_class=1
Nov 28 02:57:52 localhost kernel: SELinux: policy capability always_check_network=0
Nov 28 02:57:52 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Nov 28 02:57:52 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Nov 28 02:57:52 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Nov 28 02:57:53 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=10 res=1
Nov 28 02:57:53 localhost python3[42707]: ansible-file Invoked with path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 02:57:53 localhost python3[42723]: ansible-file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 02:57:54 localhost python3[42739]: ansible-file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 02:57:54 localhost python3[42755]: ansible-stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 02:57:55 localhost python3[42771]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-enabled --quiet iscsi.service _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 02:57:55 localhost python3[42788]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 02:57:59 localhost python3[42805]: ansible-file Invoked with path=/etc/modules-load.d state=directory mode=493 owner=root group=root setype=etc_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 02:57:59 localhost python3[42853]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:58:00 localhost python3[42896]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316679.6330104-76381-160894852188564/source dest=/etc/modules-load.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 02:58:00 localhost python3[42926]: ansible-systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 02:58:00 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 28 02:58:00 localhost systemd[1]: Stopped Load Kernel Modules.
Nov 28 02:58:00 localhost systemd[1]: Stopping Load Kernel Modules...
Nov 28 02:58:00 localhost systemd[1]: Starting Load Kernel Modules...
Nov 28 02:58:00 localhost kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Nov 28 02:58:00 localhost kernel: Bridge firewalling registered
Nov 28 02:58:00 localhost systemd-modules-load[42929]: Inserted module 'br_netfilter'
Nov 28 02:58:00 localhost systemd-modules-load[42929]: Module 'msr' is built in
Nov 28 02:58:00 localhost systemd[1]: Finished Load Kernel Modules.
Nov 28 02:58:01 localhost python3[42980]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:58:01 localhost python3[43023]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316681.100863-76432-139094064857990/source dest=/etc/sysctl.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-sysctl.conf.j2 checksum=cddb9401fdafaaf28a4a94b98448f98ae93c94c9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 02:58:02 localhost python3[43053]: ansible-sysctl Invoked with name=fs.aio-max-nr value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 02:58:02 localhost python3[43070]: ansible-sysctl Invoked with name=fs.inotify.max_user_instances value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 02:58:02 localhost python3[43088]: ansible-sysctl Invoked with name=kernel.pid_max value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 02:58:03 localhost python3[43106]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-arptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 02:58:03 localhost python3[43123]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-ip6tables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 02:58:04 localhost python3[43140]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-iptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 02:58:04 localhost python3[43157]: ansible-sysctl Invoked with name=net.ipv4.conf.all.rp_filter value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 02:58:04 localhost python3[43175]: ansible-sysctl Invoked with name=net.ipv4.ip_forward value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 02:58:06 localhost python3[43193]: ansible-sysctl Invoked with name=net.ipv4.ip_local_reserved_ports value=35357,49000-49001 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 02:58:06 localhost python3[43211]: ansible-sysctl Invoked with name=net.ipv4.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 02:58:06 localhost python3[43229]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh1 value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 02:58:06 localhost python3[43247]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh2 value=2048 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 02:58:07 localhost python3[43265]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh3 value=4096 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 02:58:07 localhost python3[43283]: ansible-sysctl Invoked with name=net.ipv6.conf.all.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 02:58:07 localhost python3[43300]: ansible-sysctl Invoked with name=net.ipv6.conf.all.forwarding value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 02:58:08 localhost python3[43317]: ansible-sysctl Invoked with name=net.ipv6.conf.default.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 02:58:08 localhost python3[43334]: ansible-sysctl Invoked with name=net.ipv6.conf.lo.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 02:58:08 localhost python3[43351]: ansible-sysctl Invoked with name=net.ipv6.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Nov 28 02:58:09 localhost python3[43369]: ansible-systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 02:58:09 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 28 02:58:09 localhost systemd[1]: Stopped Apply Kernel Variables.
Nov 28 02:58:09 localhost systemd[1]: Stopping Apply Kernel Variables...
Nov 28 02:58:09 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 28 02:58:09 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 28 02:58:09 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 28 02:58:09 localhost python3[43389]: ansible-file Invoked with mode=0750 path=/var/log/containers/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 02:58:10 localhost python3[43405]: ansible-file Invoked with path=/var/lib/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 02:58:10 localhost python3[43421]: ansible-file Invoked with mode=0750 path=/var/log/containers/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 02:58:11 localhost python3[43437]: ansible-stat Invoked with path=/var/lib/nova/instances follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 02:58:11 localhost python3[43456]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 02:58:11 localhost python3[43499]: ansible-file Invoked with path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 02:58:12 localhost python3[43535]: ansible-file Invoked with path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 02:58:12 localhost python3[43581]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 02:58:12 localhost python3[43612]: ansible-file Invoked with path=/etc/tmpfiles.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:58:13 localhost python3[43677]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-nova.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:58:13 localhost python3[43735]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-nova.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316692.7713733-76839-31204975545471/source _original_basename=tmpvv4ci4f6 follow=False checksum=f834349098718ec09c7562bcb470b717a83ff411 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:58:13 localhost python3[43765]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-tmpfiles --create _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 02:58:15 localhost python3[43782]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:58:16 localhost python3[43830]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/delay-nova-compute follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:58:16 localhost python3[43873]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/nova/delay-nova-compute mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316695.9173484-77028-89722746026274/source _original_basename=tmp0jcbpm90 follow=False checksum=f07ad3e8cf3766b3b3b07ae8278826a0ef3bb5e3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:58:17 localhost
python3[43903]: ansible-file Invoked with mode=0750 path=/var/log/containers/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 02:58:17 localhost python3[43919]: ansible-file Invoked with path=/etc/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 02:58:17 localhost python3[43935]: ansible-file Invoked with path=/etc/libvirt/secrets setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 02:58:18 localhost python3[43951]: ansible-file Invoked with path=/etc/libvirt/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 02:58:18 localhost python3[43967]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 02:58:18 localhost python3[43983]: ansible-file Invoked with path=/var/cache/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:58:18 localhost python3[43999]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 02:58:19 localhost python3[44015]: ansible-file Invoked with path=/run/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:58:19 localhost python3[44031]: ansible-file Invoked with mode=0770 path=/var/log/containers/libvirt/swtpm setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 02:58:20 localhost python3[44047]: ansible-group Invoked with gid=107 name=qemu state=present 
system=False local=False non_unique=False Nov 28 02:58:20 localhost python3[44069]: ansible-user Invoked with comment=qemu user group=qemu name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005538513.localdomain update_password=always groups=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None Nov 28 02:58:21 localhost python3[44093]: ansible-file Invoked with group=qemu owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None serole=None selevel=None attributes=None Nov 28 02:58:21 localhost python3[44109]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/rpm -q libvirt-daemon _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 02:58:21 localhost python3[44158]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-libvirt.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 02:58:22 localhost python3[44201]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-libvirt.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316701.6468189-77355-258856901106800/source _original_basename=tmpw8ggb0wa follow=False checksum=57f3ff94c666c6aae69ae22e23feb750cf9e8b13 backup=False 
force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:58:22 localhost python3[44231]: ansible-seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False Nov 28 02:58:23 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=11 res=1 Nov 28 02:58:23 localhost python3[44253]: ansible-file Invoked with path=/etc/crypto-policies/local.d/gnutls-qemu.config state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:58:24 localhost python3[44269]: ansible-file Invoked with path=/run/libvirt setype=virt_var_run_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 02:58:24 localhost python3[44285]: ansible-seboolean Invoked with name=logrotate_read_inside_containers persistent=True state=True ignore_selinux_state=False Nov 28 02:58:26 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=12 res=1 Nov 28 02:58:26 localhost python3[44305]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False 
skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Nov 28 02:58:29 localhost python3[44322]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_interfaces'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 28 02:58:30 localhost python3[44383]: ansible-file Invoked with path=/etc/containers/networks state=directory recurse=True mode=493 owner=root group=root force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:58:30 localhost python3[44399]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 02:58:31 localhost python3[44459]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 02:58:32 localhost python3[44502]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316710.785518-77853-248879723636934/source dest=/etc/containers/networks/podman.json mode=0644 owner=root group=root follow=False _original_basename=podman_network_config.j2 checksum=5f4ee99fc7d9e996ae5b1d2f917f41c82ac4db9e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:58:32 localhost python3[44564]: ansible-ansible.legacy.stat Invoked with 
path=/etc/containers/registries.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 02:58:33 localhost python3[44609]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316712.450254-77953-234226567019734/source dest=/etc/containers/registries.conf owner=root group=root setype=etc_t mode=0644 follow=False _original_basename=registries.conf.j2 checksum=710a00cfb11a4c3eba9c028ef1984a9fea9ba83a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 28 02:58:33 localhost python3[44639]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=containers option=pids_limit value=4096 backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None Nov 28 02:58:33 localhost python3[44655]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=events_logger value="journald" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None Nov 28 02:58:34 localhost python3[44671]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=runtime value="crun" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None Nov 28 02:58:34 localhost python3[44687]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t 
mode=0644 create=True section=network option=network_backend value="netavark" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None Nov 28 02:58:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 28 02:58:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.1 total, 600.0 interval#012Cumulative writes: 3254 writes, 16K keys, 3254 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.02 MB/s#012Cumulative WAL: 3254 writes, 143 syncs, 22.76 writes per sync, written: 0.01 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 3254 writes, 16K keys, 3254 commit groups, 1.0 writes per commit group, ingest: 14.65 MB, 0.02 MB/s#012Interval WAL: 3254 writes, 143 syncs, 22.76 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.005 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.005 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.005 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.6e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.6e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) 
CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memt Nov 28 02:58:35 localhost python3[44735]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 02:58:35 localhost python3[44778]: ansible-ansible.legacy.copy Invoked with 
dest=/etc/sysconfig/podman_drop_in src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316714.8907115-78057-78672103747418/source _original_basename=tmp8he6cfek follow=False checksum=0bfbc70e9a4740c9004b9947da681f723d529c83 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:58:35 localhost python3[44808]: ansible-file Invoked with mode=0750 path=/var/log/containers/rsyslog setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 02:58:36 localhost python3[44824]: ansible-file Invoked with path=/var/lib/rsyslog.container setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 02:58:37 localhost python3[44840]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Nov 28 02:58:39 localhost ceph-osd[32506]: 
rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 28 02:58:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.3 total, 600.0 interval#012Cumulative writes: 3383 writes, 16K keys, 3383 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.03 MB/s#012Cumulative WAL: 3383 writes, 195 syncs, 17.35 writes per sync, written: 0.01 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 3383 writes, 16K keys, 3383 commit groups, 1.0 writes per commit group, ingest: 15.26 MB, 0.03 MB/s#012Interval WAL: 3383 writes, 195 syncs, 17.35 writes per sync, written: 0.01 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.008 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.008 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.1 0.01 0.00 1 0.008 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, 
garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.3 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) 
KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.3 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 
Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.3 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memt Nov 28 02:58:40 localhost python3[44889]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 02:58:41 localhost python3[44934]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316720.2912393-78342-246714240581004/source validate=/usr/sbin/sshd -T -f %s mode=None follow=False _original_basename=sshd_config_block.j2 checksum=913c99ed7d5c33615bfb07a6792a4ef143dcfd2b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None remote_src=None 
local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 02:58:41 localhost python3[44965]: ansible-systemd Invoked with name=sshd state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 02:58:41 localhost systemd[1]: Stopping OpenSSH server daemon... Nov 28 02:58:41 localhost systemd[1]: sshd.service: Deactivated successfully. Nov 28 02:58:41 localhost systemd[1]: Stopped OpenSSH server daemon. Nov 28 02:58:41 localhost systemd[1]: sshd.service: Consumed 4.460s CPU time, read 2.1M from disk, written 40.0K to disk. Nov 28 02:58:41 localhost systemd[1]: Stopped target sshd-keygen.target. Nov 28 02:58:41 localhost systemd[1]: Stopping sshd-keygen.target... Nov 28 02:58:41 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Nov 28 02:58:41 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Nov 28 02:58:41 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Nov 28 02:58:41 localhost systemd[1]: Reached target sshd-keygen.target. Nov 28 02:58:41 localhost systemd[1]: Starting OpenSSH server daemon... Nov 28 02:58:41 localhost sshd[44969]: main: sshd: ssh-rsa algorithm is disabled Nov 28 02:58:41 localhost systemd[1]: Started OpenSSH server daemon. 
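The sshd_config deployment above uses the copy module's `validate=/usr/sbin/sshd -T -f %s` option, so a broken candidate file never replaces the live config before the restart. A minimal sketch of that validate-then-replace pattern, with the validator made a parameter (the function name and generic validator are illustrative, not part of the log):

```python
import shutil
import subprocess

def replace_if_valid(candidate, dest, validator):
    """Sketch of Ansible's copy-with-validate behaviour seen above:
    run the validator against the candidate file and install it only
    on success. `validator` is a list of argv words; the candidate
    path is appended, like the %s placeholder in the log entry."""
    result = subprocess.run(list(validator) + [candidate],
                            capture_output=True)
    if result.returncode != 0:
        return False          # leave the live config untouched
    shutil.copy(candidate, dest)
    return True
```

In the log the validator is `/usr/sbin/sshd -T -f`, and the successful copy is followed by the `ansible-systemd` restart of `sshd` recorded immediately afterwards.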
Nov 28 02:58:42 localhost python3[44985]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 02:58:43 localhost python3[45003]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 02:58:44 localhost python3[45021]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 02:58:47 localhost python3[45070]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:58:47 localhost python3[45088]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=420 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:58:48 localhost python3[45118]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 02:58:49 localhost python3[45168]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:58:49 localhost python3[45186]: ansible-ansible.legacy.file Invoked with dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service recurse=False state=file path=/etc/systemd/system/chrony-online.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:58:50 localhost python3[45216]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 02:58:50 localhost systemd[1]: Reloading.
Nov 28 02:58:50 localhost systemd-sysv-generator[45246]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 02:58:50 localhost systemd-rc-local-generator[45243]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 02:58:50 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 02:58:50 localhost systemd[1]: Starting chronyd online sources service...
Nov 28 02:58:50 localhost chronyc[45255]: 200 OK
Nov 28 02:58:50 localhost systemd[1]: chrony-online.service: Deactivated successfully.
Nov 28 02:58:50 localhost systemd[1]: Finished chronyd online sources service.
Nov 28 02:58:50 localhost python3[45271]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 02:58:50 localhost chronyd[26085]: System clock was stepped by -0.000194 seconds
Nov 28 02:58:51 localhost python3[45288]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 02:58:51 localhost python3[45305]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 02:58:51 localhost chronyd[26085]: System clock was stepped by 0.000000 seconds
Nov 28 02:58:51 localhost python3[45322]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 02:58:52 localhost python3[45339]: ansible-timezone Invoked with name=UTC hwclock=None
Nov 28 02:58:52 localhost systemd[1]: Starting Time & Date Service...
Nov 28 02:58:52 localhost systemd[1]: Started Time & Date Service.
Nov 28 02:58:53 localhost python3[45359]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 02:58:54 localhost python3[45376]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 02:58:54 localhost python3[45393]: ansible-slurp Invoked with src=/etc/tuned/active_profile
Nov 28 02:58:55 localhost python3[45409]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 02:58:55 localhost python3[45425]: ansible-file Invoked with mode=0750 path=/var/log/containers/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 02:58:56 localhost python3[45441]: ansible-file Invoked with path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 02:58:56 localhost python3[45489]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/neutron-cleanup follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:58:56 localhost python3[45532]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/neutron-cleanup force=True mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316736.2150152-79292-101910019104249/source _original_basename=tmpdsa2me3l follow=False checksum=f9cc7d1e91fbae49caa7e35eb2253bba146a73b4 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:58:57 localhost python3[45594]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/neutron-cleanup.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:58:57 localhost python3[45637]: ansible-ansible.legacy.copy Invoked with dest=/usr/lib/systemd/system/neutron-cleanup.service force=True src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316737.080491-79394-149285758843359/source _original_basename=tmp34357v48 follow=False checksum=6b6cd9f074903a28d054eb530a10c7235d0c39fc backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:58:58 localhost python3[45667]: ansible-ansible.legacy.systemd Invoked with enabled=True name=neutron-cleanup daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 28 02:58:58 localhost systemd[1]: Reloading.
Nov 28 02:58:58 localhost systemd-sysv-generator[45695]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 02:58:58 localhost systemd-rc-local-generator[45692]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 02:58:58 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 02:58:58 localhost python3[45721]: ansible-file Invoked with mode=0750 path=/var/log/containers/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 02:58:59 localhost python3[45737]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns add ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 02:58:59 localhost systemd[35694]: Created slice User Background Tasks Slice.
Nov 28 02:58:59 localhost systemd[35694]: Starting Cleanup of User's Temporary Files and Directories...
Nov 28 02:58:59 localhost systemd[35694]: Finished Cleanup of User's Temporary Files and Directories.
Nov 28 02:58:59 localhost python3[45755]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns delete ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 02:58:59 localhost systemd[1]: run-netns-ns_temp.mount: Deactivated successfully.
Nov 28 02:58:59 localhost python3[45772]: ansible-file Invoked with path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 02:59:00 localhost python3[45788]: ansible-file Invoked with path=/var/lib/neutron/kill_scripts state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:59:00 localhost python3[45836]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:59:01 localhost python3[45879]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316740.4072773-79651-244303052039741/source _original_basename=tmpwzas0rzl follow=False checksum=2f369fbe8f83639cdfd4efc53e7feb4ee77d1ed7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:59:22 localhost systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 28 02:59:22 localhost python3[45986]: ansible-file Invoked with path=/var/log/containers state=directory setype=container_file_t selevel=s0 mode=488 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 02:59:23 localhost python3[46004]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None setype=None attributes=None
Nov 28 02:59:23 localhost python3[46020]: ansible-file Invoked with path=/var/lib/tripleo-config state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 02:59:23 localhost python3[46036]: ansible-file Invoked with path=/var/lib/container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:59:24 localhost python3[46052]: ansible-file Invoked with path=/var/lib/docker-container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:59:24 localhost python3[46068]: ansible-community.general.sefcontext Invoked with target=/var/lib/container-config-scripts(/.*)? setype=container_file_t state=present ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Nov 28 02:59:25 localhost kernel: SELinux: Converting 2707 SID table entries...
Nov 28 02:59:25 localhost kernel: SELinux: policy capability network_peer_controls=1
Nov 28 02:59:25 localhost kernel: SELinux: policy capability open_perms=1
Nov 28 02:59:25 localhost kernel: SELinux: policy capability extended_socket_class=1
Nov 28 02:59:25 localhost kernel: SELinux: policy capability always_check_network=0
Nov 28 02:59:25 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Nov 28 02:59:25 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Nov 28 02:59:25 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Nov 28 02:59:25 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=13 res=1
Nov 28 02:59:26 localhost python3[46089]: ansible-file Invoked with path=/var/lib/container-config-scripts state=directory setype=container_file_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 02:59:27 localhost python3[46226]: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-config/container-startup-config config_data={'step_1': {'metrics_qdr': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image':
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 'metrics_qdr_init_logs': {'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}}, 'step_2': {'create_haproxy_wrapper': {'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, 'create_virtlogd_wrapper': {'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, 'nova_compute_init_log': {'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, 'nova_virtqemud_init_logs': {'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 
'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}}, 'step_3': {'ceilometer_init_log': {'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'collectd': {'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 'iscsid': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 'nova_statedir_owner': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, 'nova_virtlogd_wrapper': {'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': [
Nov 28 02:59:27 localhost rsyslogd[759]: message too long (31243) with configured size 8096, begin of message is: ansible-container_startup_config
Invoked with config_base_dir=/var/lib/tripleo-c [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2445 ]
Nov 28 02:59:28 localhost python3[46242]: ansible-file Invoked with path=/var/lib/kolla/config_files state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 02:59:29 localhost python3[46258]: ansible-file Invoked with path=/var/lib/config-data mode=493 state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 02:59:29 localhost python3[46274]: ansible-tripleo_container_configs Invoked with config_data={'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces ipmi --logfile /var/log/ceilometer/ipmi.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/ceilometer_agent_compute.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /var/log/ceilometer/compute.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/collectd.json': {'command': '/usr/sbin/collectd -f', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/', 'merge': False, 'preserve_properties': True, 'source':
'/var/lib/kolla/config_files/src/etc/collectd.d'}], 'permissions': [{'owner': 'collectd:collectd', 'path': '/var/log/collectd', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/scripts', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/config-scripts', 'recurse': True}]}, '/var/lib/kolla/config_files/iscsid.json': {'command': '/usr/sbin/iscsid -f', 'config_files': [{'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/'}]}, '/var/lib/kolla/config_files/logrotate-crond.json': {'command': '/usr/sbin/crond -s -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/metrics_qdr.json': {'command': '/usr/sbin/qdrouterd -c /etc/qpid-dispatch/qdrouterd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/', 'merge': True, 'optional': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-tls/*'}], 'permissions': [{'owner': 'qdrouterd:qdrouterd', 'path': '/var/lib/qdrouterd', 'recurse': True}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/certs/metrics_qdr.crt'}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/private/metrics_qdr.key'}]}, '/var/lib/kolla/config_files/nova-migration-target.json': {'command': 'dumb-init --single-child -- /usr/sbin/sshd -D -p 2022', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ssh/', 'owner': 'root', 'perm': '0600', 'source': '/host-ssh/ssh_host_*_key'}]}, '/var/lib/kolla/config_files/nova_compute.json': {'command': '/var/lib/nova/delay-nova-compute --delay 180 --nova-binary /usr/bin/nova-compute ', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': 
'/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}, {'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_wait_for_compute_service.py', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}]}, '/var/lib/kolla/config_files/nova_virtlogd.json': {'command': '/usr/local/bin/virtlogd_wrapper', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtnodedevd.json': {'command': '/usr/sbin/virtnodedevd --config /etc/libvirt/virtnodedevd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtproxyd.json': {'command': '/usr/sbin/virtproxyd --config /etc/libvirt/virtproxyd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtqemud.json': {'command': '/usr/sbin/virtqemud --config /etc/libvirt/virtqemud.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtsecretd.json': {'command': '/usr/sbin/virtsecretd --config /etc/libvirt/virtsecretd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtstoraged.json': {'command': '/usr/sbin/virtstoraged --config /etc/libvirt/virtstoraged.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/ovn_controller.json': {'command': '/usr/bin/ovn-controller --pidfile --log-file unix:/run/openvswitch/db.sock ', 'permissions': [{'owner': 'root:root', 'path': '/var/log/openvswitch', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/ovn', 'recurse': True}]}, '/var/lib/kolla/config_files/ovn_metadata_agent.json': {'command': '/usr/bin/networking-ovn-metadata-agent --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/networking-ovn/networking-ovn-metadata-agent.ini --log-file=/var/log/neutron/ovn-metadata-agent.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'neutron:neutron', 'path': '/var/log/neutron', 'recurse': True}, {'owner': 'neutron:neutron', 'path': '/var/lib/neutron', 'recurse': True}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/certs/ovn_metadata.crt', 'perm': '0644'}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/private/ovn_metadata.key', 'perm': '0644'}]}, '/var/lib/kolla/config_files/rsyslog.json': {'command': '/usr/sbin/rsyslogd -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'root:root', 'path': '/var/lib/rsyslog', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/rsyslog', 'recurse': True}]}}
Nov 28 02:59:35 localhost python3[46322]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 02:59:35 localhost python3[46365]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316774.8521078-81135-25730311528161/source _original_basename=tmp789d6unz follow=False checksum=dfdcc7695edd230e7a2c06fc7b739bfa56506d8f backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 02:59:35 localhost python3[46395]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 02:59:38 localhost python3[46518]: ansible-file Invoked with path=/var/lib/container-puppet state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 02:59:39 localhost python3[46639]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Nov 28 02:59:41 localhost python3[46655]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q lvm2 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 02:59:42 localhost python3[46672]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 02:59:46 localhost dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Nov 28 02:59:46 localhost dbus-broker-launch[18433]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 28 02:59:46 localhost dbus-broker-launch[18433]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 28 02:59:46 localhost dbus-broker-launch[18433]: Noticed file-system modification, trigger reload.
Nov 28 02:59:46 localhost dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Nov 28 02:59:46 localhost systemd[1]: Reexecuting.
Nov 28 02:59:46 localhost systemd[1]: systemd 252-14.el9_2.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 28 02:59:46 localhost systemd[1]: Detected virtualization kvm.
Nov 28 02:59:46 localhost systemd[1]: Detected architecture x86-64.
Nov 28 02:59:46 localhost systemd-sysv-generator[46730]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 02:59:46 localhost systemd-rc-local-generator[46727]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 02:59:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 02:59:55 localhost kernel: SELinux: Converting 2707 SID table entries...
Nov 28 02:59:55 localhost kernel: SELinux: policy capability network_peer_controls=1
Nov 28 02:59:55 localhost kernel: SELinux: policy capability open_perms=1
Nov 28 02:59:55 localhost kernel: SELinux: policy capability extended_socket_class=1
Nov 28 02:59:55 localhost kernel: SELinux: policy capability always_check_network=0
Nov 28 02:59:55 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Nov 28 02:59:55 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Nov 28 02:59:55 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Nov 28 02:59:55 localhost dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Nov 28 02:59:55 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=14 res=1
Nov 28 02:59:55 localhost dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Nov 28 02:59:56 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 02:59:56 localhost systemd[1]: Starting man-db-cache-update.service...
Nov 28 02:59:56 localhost systemd[1]: Reloading.
Nov 28 02:59:56 localhost systemd-rc-local-generator[46817]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 02:59:56 localhost systemd-sysv-generator[46823]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 02:59:56 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 02:59:56 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 02:59:56 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 02:59:56 localhost systemd-journald[618]: Journal stopped
Nov 28 02:59:56 localhost systemd-journald[618]: Received SIGTERM from PID 1 (systemd).
Nov 28 02:59:56 localhost systemd[1]: Stopping Journal Service...
Nov 28 02:59:56 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 28 02:59:56 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 28 02:59:56 localhost systemd[1]: Stopped Journal Service.
Nov 28 02:59:56 localhost systemd[1]: systemd-journald.service: Consumed 1.883s CPU time.
Nov 28 02:59:56 localhost systemd[1]: Starting Journal Service...
Nov 28 02:59:56 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 28 02:59:56 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 28 02:59:56 localhost systemd[1]: systemd-udevd.service: Consumed 3.212s CPU time.
Nov 28 02:59:56 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 28 02:59:56 localhost systemd-journald[47227]: Journal started
Nov 28 02:59:56 localhost systemd-journald[47227]: Runtime Journal (/run/log/journal/5cd59ba25ae47acac865224fa46a5f9e) is 12.2M, max 314.7M, 302.5M free.
Nov 28 02:59:56 localhost systemd[1]: Started Journal Service.
Nov 28 02:59:56 localhost systemd-journald[47227]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation.
Nov 28 02:59:56 localhost systemd-journald[47227]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 28 02:59:56 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 02:59:57 localhost systemd-udevd[47236]: Using default interface naming scheme 'rhel-9.0'.
Nov 28 02:59:57 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 28 02:59:57 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 02:59:57 localhost systemd[1]: Reloading.
Nov 28 02:59:57 localhost systemd-rc-local-generator[47847]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 02:59:57 localhost systemd-sysv-generator[47850]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 02:59:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 02:59:57 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 02:59:57 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 02:59:57 localhost systemd[1]: Finished man-db-cache-update.service.
Nov 28 02:59:57 localhost systemd[1]: man-db-cache-update.service: Consumed 1.373s CPU time.
Nov 28 02:59:57 localhost systemd[1]: run-rcc4a7a8e9d8c412bb6980a53ecd801fb.service: Deactivated successfully.
Nov 28 02:59:57 localhost systemd[1]: run-r5cab8d9fc4ab41ccbc6ba563f3390d5f.service: Deactivated successfully.
Nov 28 02:59:59 localhost python3[48166]: ansible-sysctl Invoked with name=vm.unprivileged_userfaultfd reload=True state=present sysctl_file=/etc/sysctl.d/99-tripleo-postcopy.conf sysctl_set=True value=1 ignoreerrors=False
Nov 28 02:59:59 localhost python3[48185]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ksm.service || systemctl is-enabled ksm.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 03:00:00 localhost python3[48203]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 03:00:00 localhost python3[48203]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 --format json
Nov 28 03:00:00 localhost python3[48203]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 -q --tls-verify=false
Nov 28 03:00:09 localhost podman[48217]: 2025-11-28 08:00:00.849810715 +0000 UTC m=+0.043416845 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Nov 28 03:00:09 localhost python3[48203]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect bac901955dcf7a32a493c6ef724c092009bbc18467858aa8c55e916b8c2b2b8f --format json
Nov 28 03:00:10 localhost python3[48318]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 03:00:10 localhost python3[48318]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 --format json
Nov 28 03:00:10 localhost python3[48318]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 -q --tls-verify=false
Nov 28 03:00:19 localhost podman[48330]: 2025-11-28 08:00:10.354211851 +0000 UTC m=+0.046690909 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Nov 28 03:00:19 localhost python3[48318]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 44feaf8d87c1d40487578230316b622680576d805efdb45dfeea6aad464b41f1 --format json
Nov 28 03:00:19 localhost systemd[1]: tmp-crun.f9wZgD.mount: Deactivated successfully.
Nov 28 03:00:19 localhost podman[48535]: 2025-11-28 08:00:19.98773249 +0000 UTC m=+0.108311899 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, vcs-type=git, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, RELEASE=main, io.buildah.version=1.33.12, name=rhceph, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, ceph=True, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public)
Nov 28 03:00:20 localhost python3[48534]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 03:00:20 localhost python3[48534]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 --format json
Nov 28 03:00:20 localhost python3[48534]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 -q --tls-verify=false
Nov 28 03:00:20 localhost podman[48535]: 2025-11-28 08:00:20.125381285 +0000 UTC m=+0.245960704 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, name=rhceph, com.redhat.component=rhceph-container, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vcs-type=git, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , release=553)
Nov 28 03:00:37 localhost podman[48565]: 2025-11-28 08:00:20.157460029 +0000 UTC m=+0.051076296 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 03:00:37 localhost python3[48534]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 3a088c12511c977065fdc5f1594cba7b1a79f163578a6ffd0ac4a475b8e67938 --format json
Nov 28 03:00:38 localhost python3[49290]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 03:00:38 localhost python3[49290]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 --format json
Nov 28 03:00:38 localhost python3[49290]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 -q --tls-verify=false
Nov 28 03:00:51 localhost podman[49303]: 2025-11-28 08:00:38.477373753 +0000 UTC m=+0.042759614 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Nov 28 03:00:51 localhost python3[49290]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 514d439186251360cf734cbc6d4a44c834664891872edf3798a653dfaacf10c0 --format json
Nov 28 03:00:51 localhost python3[49430]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 03:00:51 localhost python3[49430]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 --format json
Nov 28 03:00:51 localhost python3[49430]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 -q --tls-verify=false
Nov 28 03:01:01 localhost podman[49444]: 2025-11-28 08:00:51.757801314 +0000 UTC m=+0.042974310 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Nov 28 03:01:01 localhost python3[49430]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect a9dd7a2ac6f35cb086249f87f74e2f8e74e7e2ad5141ce2228263be6faedce26 --format json
Nov 28 03:01:02 localhost python3[49718]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 03:01:02 localhost python3[49718]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 --format json
Nov 28 03:01:02 localhost python3[49718]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 -q --tls-verify=false
Nov 28 03:01:06 localhost podman[49732]: 2025-11-28 08:01:02.199534862 +0000 UTC m=+0.035767993 image pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Nov 28 03:01:06 localhost python3[49718]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 24976907b2c2553304119aba5731a800204d664feed24ca9eb7f2b4c7d81016b --format json
Nov 28 03:01:07 localhost python3[49810]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 03:01:07 localhost python3[49810]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 --format json
Nov 28 03:01:07 localhost python3[49810]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 -q --tls-verify=false
Nov 28 03:01:09 localhost podman[49822]: 2025-11-28 08:01:07.259391475 +0000 UTC m=+0.045091124 image pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Nov 28 03:01:09 localhost python3[49810]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 57163a7b21fdbb804a27897cb6e6052a5e5c7a339c45d663e80b52375a760dcf --format json
Nov 28 03:01:10 localhost python3[49897]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 03:01:10 localhost python3[49897]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 --format json
Nov 28 03:01:10 localhost python3[49897]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 -q --tls-verify=false
Nov 28 03:01:12 localhost podman[49910]: 2025-11-28 08:01:10.179442684 +0000 UTC m=+0.044354152 image pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Nov 28 03:01:12 localhost python3[49897]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 076d82a27d63c8328729ed27ceb4291585ae18d017befe6fe353df7aa11715ae --format json
Nov 28 03:01:12 localhost python3[49989]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 03:01:12 localhost python3[49989]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 --format json
Nov 28 03:01:12 localhost python3[49989]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 -q --tls-verify=false
Nov 28 03:01:15 localhost podman[50002]: 2025-11-28 08:01:12.696980587 +0000 UTC m=+0.030486017 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Nov 28 03:01:15 localhost python3[49989]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect d0dbcb95546840a8d088df044347a7877ad5ea45a2ddba0578e9bb5de4ab0da5 --format json
Nov 28 03:01:15 localhost python3[50079]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 03:01:15 localhost python3[50079]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 --format json
Nov 28 03:01:15 localhost python3[50079]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 -q --tls-verify=false
Nov 28 03:01:19 localhost podman[50092]: 2025-11-28 08:01:15.840400036 +0000 UTC m=+0.045283170 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Nov 28 03:01:19 localhost python3[50079]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect e6e981540e553415b2d6eda490d7683db07164af2e7a0af8245623900338a4d6 --format json
Nov 28 03:01:20 localhost python3[50181]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Nov 28 03:01:20 localhost python3[50181]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 --format json
Nov 28 03:01:20 localhost python3[50181]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 -q --tls-verify=false
Nov 28 03:01:23 localhost podman[50193]: 2025-11-28 08:01:20.237374965 +0000 UTC m=+0.044319541 image pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Nov 28 03:01:23 localhost python3[50181]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 87ee88cbf01fb42e0b22747072843bcca6130a90eda4de6e74b3ccd847bb4040 --format json
Nov 28 03:01:23 localhost python3[50349]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 03:01:25 localhost ansible-async_wrapper.py[50521]: Invoked with 642356236050 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316884.9235888-83883-161618226582021/AnsiballZ_command.py _
Nov 28 03:01:25 localhost ansible-async_wrapper.py[50524]: Starting module and watcher
Nov 28 03:01:25 localhost ansible-async_wrapper.py[50524]: Start watching 50525 (3600)
Nov 28 03:01:25 localhost ansible-async_wrapper.py[50525]: Start module (50525)
Nov 28 03:01:25 localhost ansible-async_wrapper.py[50521]: Return async_wrapper task started.
Nov 28 03:01:26 localhost python3[50545]: ansible-ansible.legacy.async_status Invoked with jid=642356236050.50521 mode=status _async_dir=/tmp/.ansible_async
Nov 28 03:01:29 localhost puppet-user[50529]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Nov 28 03:01:29 localhost puppet-user[50529]: (file: /etc/puppet/hiera.yaml)
Nov 28 03:01:29 localhost puppet-user[50529]: Warning: Undefined variable '::deploy_config_name';
Nov 28 03:01:29 localhost puppet-user[50529]: (file & line not available)
Nov 28 03:01:29 localhost puppet-user[50529]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Nov 28 03:01:29 localhost puppet-user[50529]: (file & line not available)
Nov 28 03:01:29 localhost puppet-user[50529]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Nov 28 03:01:30 localhost puppet-user[50529]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Nov 28 03:01:30 localhost puppet-user[50529]: Notice: Compiled catalog for np0005538513.localdomain in environment production in 0.12 seconds
Nov 28 03:01:30 localhost puppet-user[50529]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Exec[directory-create-etc-my.cnf.d]/returns: executed successfully
Nov 28 03:01:30 localhost puppet-user[50529]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/File[/etc/my.cnf.d/tripleo.cnf]/ensure: created
Nov 28 03:01:30 localhost puppet-user[50529]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Augeas[tripleo-mysql-client-conf]/returns: executed successfully
Nov 28 03:01:30 localhost puppet-user[50529]: Notice: Applied catalog in 0.05 seconds
Nov 28 03:01:30 localhost puppet-user[50529]: Application:
Nov 28 03:01:30 localhost puppet-user[50529]: Initial environment: production
Nov 28 03:01:30 localhost puppet-user[50529]: Converged environment: production
Nov 28 03:01:30 localhost puppet-user[50529]: Run mode: user
Nov 28 03:01:30 localhost puppet-user[50529]: Changes:
Nov 28 03:01:30 localhost puppet-user[50529]: Total: 3
Nov 28 03:01:30 localhost puppet-user[50529]: Events:
Nov 28 03:01:30 localhost puppet-user[50529]: Success: 3
Nov 28 03:01:30 localhost puppet-user[50529]: Total: 3
Nov 28 03:01:30 localhost puppet-user[50529]: Resources:
Nov 28 03:01:30 localhost puppet-user[50529]: Changed: 3
Nov 28 03:01:30 localhost puppet-user[50529]: Out of sync: 3
Nov 28 03:01:30 localhost puppet-user[50529]: Total: 10
Nov 28 03:01:30 localhost puppet-user[50529]: Time:
Nov 28 03:01:30 localhost puppet-user[50529]: Schedule: 0.00
Nov 28 03:01:30 localhost puppet-user[50529]: File: 0.00
Nov 28 03:01:30 localhost puppet-user[50529]: Exec: 0.02
Nov 28 03:01:30 localhost puppet-user[50529]: Augeas: 0.02
Nov 28 03:01:30 localhost puppet-user[50529]: Transaction evaluation: 0.05
Nov 28 03:01:30 localhost puppet-user[50529]: Catalog application: 0.05
Nov 28 03:01:30 localhost puppet-user[50529]: Config retrieval: 0.15
Nov 28 03:01:30 localhost puppet-user[50529]: Last run: 1764316890
Nov 28 03:01:30 localhost puppet-user[50529]: Filebucket: 0.00
Nov 28 03:01:30 localhost puppet-user[50529]: Total: 0.05
Nov 28 03:01:30 localhost puppet-user[50529]: Version:
Nov 28 03:01:30 localhost puppet-user[50529]: Config: 1764316889
Nov 28 03:01:30 localhost puppet-user[50529]: Puppet: 7.10.0
Nov 28 03:01:30 localhost ansible-async_wrapper.py[50525]: Module complete (50525)
Nov 28 03:01:30 localhost ansible-async_wrapper.py[50524]: Done in kid B.
Nov 28 03:01:36 localhost python3[50672]: ansible-ansible.legacy.async_status Invoked with jid=642356236050.50521 mode=status _async_dir=/tmp/.ansible_async
Nov 28 03:01:37 localhost python3[50688]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 03:01:37 localhost python3[50704]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 03:01:38 localhost python3[50752]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 03:01:38 localhost python3[50795]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/container-puppet/puppetlabs/facter.conf setype=svirt_sandbox_file_t selevel=s0 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316898.162096-84332-52933245925340/source _original_basename=tmp3tjighwz follow=False checksum=53908622cb869db5e2e2a68e737aa2ab1a872111 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 03:01:39 localhost python3[50825]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 03:01:40 localhost python3[50929]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Nov 28 03:01:40 localhost python3[50948]: ansible-file Invoked with path=/var/lib/tripleo-config/container-puppet-config mode=448 recurse=True setype=container_file_t force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 03:01:41 localhost python3[50964]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=False puppet_config=/var/lib/container-puppet/container-puppet.json short_hostname=np0005538513 step=1 update_config_hash_only=False
Nov 28 03:01:42 localhost python3[50980]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 03:01:43 localhost python3[50996]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True
Nov 28 03:01:43 localhost python3[51012]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 28 03:01:44 localhost python3[51054]: ansible-tripleo_container_manage Invoked with config_id=tripleo_puppet_step1 config_dir=/var/lib/tripleo-config/container-puppet-config/step_1 config_patterns=container-puppet-*.json config_overrides={} concurrency=6 log_base_path=/var/log/containers/stdouts debug=False
Nov 28 03:01:45 localhost podman[51209]: 2025-11-28 08:01:45.219163059 +0000 UTC m=+0.090019322 container create a788096a7b720f3740c34aa5bb7ae69f2852b290197c68c46688dd4c33a52360 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, release=1761123044, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-type=git, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, container_name=container-puppet-collectd)
Nov 28 03:01:45 localhost systemd[1]: Started libpod-conmon-a788096a7b720f3740c34aa5bb7ae69f2852b290197c68c46688dd4c33a52360.scope.
Nov 28 03:01:45 localhost podman[51227]: 2025-11-28 08:01:45.251684099 +0000 UTC m=+0.101702399 container create 700dcb6d86cc0871b7bbf5977f2c9ead9a93d5086f4486ed0144de2a9b1c2b0e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, version=17.1.12, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, container_name=container-puppet-nova_libvirt, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, build-date=2025-11-19T00:35:22Z, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com)
Nov 28 03:01:45 localhost systemd[1]: Started libcrun container.
Nov 28 03:01:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c554e4db742c3febdfb49af667b5d853d210243bf4484c52c41efc08d328831f/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Nov 28 03:01:45 localhost podman[51209]: 2025-11-28 08:01:45.169411579 +0000 UTC m=+0.040267852 image pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Nov 28 03:01:45 localhost podman[51209]: 2025-11-28 08:01:45.27596436 +0000 UTC m=+0.146820613 container init a788096a7b720f3740c34aa5bb7ae69f2852b290197c68c46688dd4c33a52360 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, container_name=container-puppet-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, url=https://www.redhat.com, config_id=tripleo_puppet_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vcs-type=git)
Nov 28 03:01:45 localhost podman[51209]: 2025-11-28 08:01:45.288352988 +0000 UTC m=+0.159209261 container start a788096a7b720f3740c34aa5bb7ae69f2852b290197c68c46688dd4c33a52360 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, config_id=tripleo_puppet_step1, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, container_name=container-puppet-collectd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git)
Nov 28 03:01:45 localhost podman[51209]: 2025-11-28 08:01:45.290355151 +0000 UTC m=+0.161211414 container attach a788096a7b720f3740c34aa5bb7ae69f2852b290197c68c46688dd4c33a52360 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=container-puppet-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.expose-services=)
Nov 28 03:01:45 localhost systemd[1]: Started libpod-conmon-700dcb6d86cc0871b7bbf5977f2c9ead9a93d5086f4486ed0144de2a9b1c2b0e.scope.
Nov 28 03:01:45 localhost podman[51227]: 2025-11-28 08:01:45.204983884 +0000 UTC m=+0.055002194 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Nov 28 03:01:45 localhost systemd[1]: Started libcrun container.
Nov 28 03:01:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8fe5cc4372bb33f59044e24a5715f4f503612636c3ff0d7cebe06eec67fcb4f/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Nov 28 03:01:45 localhost podman[51290]: 2025-11-28 08:01:45.288143731 +0000 UTC m=+0.047169189 image pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Nov 28 03:01:45 localhost podman[51253]: 2025-11-28 08:01:45.319006049 +0000 UTC m=+0.138972318 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Nov 28 03:01:45 localhost podman[51251]: 2025-11-28 08:01:45.321232289 +0000 UTC m=+0.143359685 image pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Nov 28 03:01:46 localhost podman[51227]: 2025-11-28 08:01:46.377650888 +0000 UTC m=+1.227669208 container init 700dcb6d86cc0871b7bbf5977f2c9ead9a93d5086f4486ed0144de2a9b1c2b0e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, version=17.1.12, container_name=container-puppet-nova_libvirt, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:35:22Z, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git)
Nov 28 03:01:46 localhost systemd[1]: tmp-crun.B2eBHk.mount: Deactivated successfully.
Nov 28 03:01:46 localhost podman[51290]: 2025-11-28 08:01:46.388097066 +0000 UTC m=+1.147122534 container create 6209037692ad6a27fd5ca43e0f8d40701d26de0335f5c337009fe96a996c7eb5 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, distribution-scope=public, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_puppet_step1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=container-puppet-iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, version=17.1.12, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Nov 28 03:01:46 localhost podman[51227]: 2025-11-28 08:01:46.46612441 +0000 UTC m=+1.316142700 container start 700dcb6d86cc0871b7bbf5977f2c9ead9a93d5086f4486ed0144de2a9b1c2b0e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.4, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-libvirt, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, build-date=2025-11-19T00:35:22Z, container_name=container-puppet-nova_libvirt, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt)
Nov 28 03:01:46 localhost podman[51227]: 2025-11-28 08:01:46.468863776 +0000 UTC m=+1.318882126 container attach 700dcb6d86cc0871b7bbf5977f2c9ead9a93d5086f4486ed0144de2a9b1c2b0e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_puppet_step1, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, container_name=container-puppet-nova_libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, architecture=x86_64, version=17.1.12)
Nov 28 03:01:46 localhost podman[51253]: 2025-11-28 08:01:46.481982718 +0000 UTC m=+1.301948947 container create d33e206cd9e309107e146b933b6ae92c41f3aaab17f4f1c3587ef51309b905d4 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=container-puppet-metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_puppet_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS':
'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., io.buildah.version=1.41.4) Nov 28 03:01:46 localhost podman[51251]: 2025-11-28 08:01:46.5005429 +0000 UTC m=+1.322670256 container create 49d1198114e3cf2834836d3b9377664b111beda2bfbba5e8342177144c6fdf6b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, container_name=container-puppet-crond, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, batch=17.1_20251118.1, release=1761123044, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., name=rhosp17/openstack-cron) Nov 28 03:01:46 localhost systemd[1]: Started 
libpod-conmon-d33e206cd9e309107e146b933b6ae92c41f3aaab17f4f1c3587ef51309b905d4.scope. Nov 28 03:01:46 localhost systemd[1]: Started libpod-conmon-49d1198114e3cf2834836d3b9377664b111beda2bfbba5e8342177144c6fdf6b.scope. Nov 28 03:01:46 localhost systemd[1]: Started libpod-conmon-6209037692ad6a27fd5ca43e0f8d40701d26de0335f5c337009fe96a996c7eb5.scope. Nov 28 03:01:46 localhost systemd[1]: Started libcrun container. Nov 28 03:01:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29b4c40b440520fe09ba18ff760c1e8bcaccec35e4300eedc14527806169da1c/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Nov 28 03:01:46 localhost systemd[1]: Started libcrun container. Nov 28 03:01:46 localhost systemd[1]: Started libcrun container. Nov 28 03:01:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abe1c4a3f68b93d18940fc38699bd560bf3f77db8c7795c1004a1bfa9999fcdb/merged/tmp/iscsi.host supports timestamps until 2038 (0x7fffffff) Nov 28 03:01:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abe1c4a3f68b93d18940fc38699bd560bf3f77db8c7795c1004a1bfa9999fcdb/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Nov 28 03:01:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa2c8f8913bff67439dab4aeb6db6e2056909e99c44d0de88f7837d07d6c8847/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Nov 28 03:01:46 localhost podman[51253]: 2025-11-28 08:01:46.558438254 +0000 UTC m=+1.378404513 container init d33e206cd9e309107e146b933b6ae92c41f3aaab17f4f1c3587ef51309b905d4 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_puppet_step1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=container-puppet-metrics_qdr, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.buildah.version=1.41.4) Nov 28 03:01:46 localhost podman[51251]: 2025-11-28 08:01:46.567349954 +0000 UTC m=+1.389477340 container init 49d1198114e3cf2834836d3b9377664b111beda2bfbba5e8342177144c6fdf6b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=container-puppet-crond, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 
'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_puppet_step1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64) Nov 28 03:01:46 localhost podman[51251]: 2025-11-28 08:01:46.579390731 +0000 UTC m=+1.401518117 container start 49d1198114e3cf2834836d3b9377664b111beda2bfbba5e8342177144c6fdf6b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.buildah.version=1.41.4, container_name=container-puppet-crond, config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, version=17.1.12, 
com.redhat.component=openstack-cron-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, 
distribution-scope=public) Nov 28 03:01:46 localhost podman[51251]: 2025-11-28 08:01:46.579837855 +0000 UTC m=+1.401965221 container attach 49d1198114e3cf2834836d3b9377664b111beda2bfbba5e8342177144c6fdf6b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, url=https://www.redhat.com, config_id=tripleo_puppet_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, distribution-scope=public, version=17.1.12, vcs-type=git, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=container-puppet-crond, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=) Nov 28 03:01:46 localhost podman[51290]: 2025-11-28 08:01:46.613286963 +0000 UTC m=+1.372312411 container init 6209037692ad6a27fd5ca43e0f8d40701d26de0335f5c337009fe96a996c7eb5 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, container_name=container-puppet-iscsid, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, tcib_managed=true, distribution-scope=public, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_puppet_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-type=git, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid) Nov 28 03:01:46 localhost podman[51253]: 2025-11-28 08:01:46.620879342 +0000 UTC m=+1.440845611 container start d33e206cd9e309107e146b933b6ae92c41f3aaab17f4f1c3587ef51309b905d4 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, 
batch=17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, name=rhosp17/openstack-qdrouterd, container_name=container-puppet-metrics_qdr, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git) Nov 28 03:01:46 localhost podman[51253]: 2025-11-28 08:01:46.621240383 +0000 UTC m=+1.441206652 container attach d33e206cd9e309107e146b933b6ae92c41f3aaab17f4f1c3587ef51309b905d4 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=container-puppet-metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 
'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_puppet_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com) Nov 28 03:01:46 localhost podman[51290]: 2025-11-28 08:01:46.672761467 +0000 UTC m=+1.431786945 container start 6209037692ad6a27fd5ca43e0f8d40701d26de0335f5c337009fe96a996c7eb5 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, distribution-scope=public, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, container_name=container-puppet-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 iscsid, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:01:46 localhost podman[51290]: 2025-11-28 08:01:46.673230972 +0000 UTC m=+1.432256430 container attach 6209037692ad6a27fd5ca43e0f8d40701d26de0335f5c337009fe96a996c7eb5 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=container-puppet-iscsid, config_id=tripleo_puppet_step1, batch=17.1_20251118.1, release=1761123044, vcs-type=git, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', 
'/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 28 03:01:47 localhost podman[51143]: 2025-11-28 08:01:45.078655165 +0000 UTC m=+0.038459676 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1 Nov 28 03:01:47 localhost podman[51465]: 2025-11-28 08:01:47.753898742 +0000 UTC m=+0.095297658 container create e9e81dc4d463c3e33e320bcac7f504711c3df69a454f9c676b6265b24fcfa267 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, build-date=2025-11-19T00:11:59Z, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-central-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': 
'/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 ceilometer-central, config_id=tripleo_puppet_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, maintainer=OpenStack TripleO Team, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, container_name=container-puppet-ceilometer, name=rhosp17/openstack-ceilometer-central, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central) Nov 28 03:01:47 localhost podman[51465]: 2025-11-28 08:01:47.705913858 +0000 UTC m=+0.047312814 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1 Nov 28 03:01:47 localhost systemd[1]: Started libpod-conmon-e9e81dc4d463c3e33e320bcac7f504711c3df69a454f9c676b6265b24fcfa267.scope. Nov 28 03:01:47 localhost systemd[1]: Started libcrun container. Nov 28 03:01:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0de61d8517bf2f9ac2b1f4b59bb4938a9bcb5e653aa4685b7d899b570eb2d566/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Nov 28 03:01:47 localhost podman[51465]: 2025-11-28 08:01:47.823131732 +0000 UTC m=+0.164530618 container init e9e81dc4d463c3e33e320bcac7f504711c3df69a454f9c676b6265b24fcfa267 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, config_id=tripleo_puppet_step1, tcib_managed=true, architecture=x86_64, container_name=container-puppet-ceilometer, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-central, release=1761123044, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 
'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-type=git, com.redhat.component=openstack-ceilometer-central-container, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-central, build-date=2025-11-19T00:11:59Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, batch=17.1_20251118.1) Nov 28 
03:01:47 localhost podman[51465]: 2025-11-28 08:01:47.830848434 +0000 UTC m=+0.172247330 container start e9e81dc4d463c3e33e320bcac7f504711c3df69a454f9c676b6265b24fcfa267 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, io.openshift.expose-services=, container_name=container-puppet-ceilometer, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-ceilometer-central-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, batch=17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 ceilometer-central, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-central, build-date=2025-11-19T00:11:59Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central) Nov 28 03:01:47 localhost podman[51465]: 2025-11-28 08:01:47.831090521 +0000 UTC m=+0.172489427 container attach e9e81dc4d463c3e33e320bcac7f504711c3df69a454f9c676b6265b24fcfa267 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-ceilometer-central, distribution-scope=public, release=1761123044, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude 
tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, container_name=container-puppet-ceilometer, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-central-container, version=17.1.12, config_id=tripleo_puppet_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-central, build-date=2025-11-19T00:11:59Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:01:48 localhost puppet-user[51359]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Nov 28 03:01:48 localhost puppet-user[51359]: (file: /etc/puppet/hiera.yaml) Nov 28 03:01:48 localhost puppet-user[51359]: Warning: Undefined variable '::deploy_config_name'; Nov 28 03:01:48 localhost puppet-user[51359]: (file & line not available) Nov 28 03:01:48 localhost puppet-user[51359]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 28 03:01:48 localhost puppet-user[51359]: (file & line not available) Nov 28 03:01:48 localhost ovs-vsctl[51595]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory) Nov 28 03:01:48 localhost puppet-user[51410]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Nov 28 03:01:48 localhost puppet-user[51410]: (file: /etc/puppet/hiera.yaml) Nov 28 03:01:48 localhost puppet-user[51410]: Warning: Undefined variable '::deploy_config_name'; Nov 28 03:01:48 localhost puppet-user[51410]: (file & line not available) Nov 28 03:01:48 localhost puppet-user[51402]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Nov 28 03:01:48 localhost puppet-user[51402]: (file: /etc/puppet/hiera.yaml) Nov 28 03:01:48 localhost puppet-user[51402]: Warning: Undefined variable '::deploy_config_name'; Nov 28 03:01:48 localhost puppet-user[51402]: (file & line not available) Nov 28 03:01:48 localhost puppet-user[51410]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. 
See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 28 03:01:48 localhost puppet-user[51410]: (file & line not available) Nov 28 03:01:48 localhost puppet-user[51422]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Nov 28 03:01:48 localhost puppet-user[51422]: (file: /etc/puppet/hiera.yaml) Nov 28 03:01:48 localhost puppet-user[51422]: Warning: Undefined variable '::deploy_config_name'; Nov 28 03:01:48 localhost puppet-user[51422]: (file & line not available) Nov 28 03:01:48 localhost puppet-user[51402]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 28 03:01:48 localhost puppet-user[51402]: (file & line not available) Nov 28 03:01:48 localhost puppet-user[51410]: Notice: Compiled catalog for np0005538513.localdomain in environment production in 0.07 seconds Nov 28 03:01:48 localhost puppet-user[51364]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Nov 28 03:01:48 localhost puppet-user[51364]: (file: /etc/puppet/hiera.yaml) Nov 28 03:01:48 localhost puppet-user[51364]: Warning: Undefined variable '::deploy_config_name'; Nov 28 03:01:48 localhost puppet-user[51364]: (file & line not available) Nov 28 03:01:48 localhost puppet-user[51422]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 28 03:01:48 localhost puppet-user[51422]: (file & line not available) Nov 28 03:01:48 localhost puppet-user[51402]: Notice: Accepting previously invalid value for target type 'Integer' Nov 28 03:01:48 localhost puppet-user[51364]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. 
See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 28 03:01:48 localhost puppet-user[51364]: (file & line not available) Nov 28 03:01:48 localhost puppet-user[51410]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/File[/etc/logrotate-crond.conf]/ensure: defined content as '{sha256}1c3202f58bd2ae16cb31badcbb7f0d4e6697157b987d1887736ad96bb73d70b0' Nov 28 03:01:48 localhost puppet-user[51410]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/Cron[logrotate-crond]/ensure: created Nov 28 03:01:48 localhost puppet-user[51410]: Notice: Applied catalog in 0.04 seconds Nov 28 03:01:48 localhost puppet-user[51410]: Application: Nov 28 03:01:48 localhost puppet-user[51410]: Initial environment: production Nov 28 03:01:48 localhost puppet-user[51410]: Converged environment: production Nov 28 03:01:48 localhost puppet-user[51410]: Run mode: user Nov 28 03:01:48 localhost puppet-user[51410]: Changes: Nov 28 03:01:48 localhost puppet-user[51410]: Total: 2 Nov 28 03:01:48 localhost puppet-user[51410]: Events: Nov 28 03:01:48 localhost puppet-user[51410]: Success: 2 Nov 28 03:01:48 localhost puppet-user[51410]: Total: 2 Nov 28 03:01:48 localhost puppet-user[51410]: Resources: Nov 28 03:01:48 localhost puppet-user[51410]: Changed: 2 Nov 28 03:01:48 localhost puppet-user[51410]: Out of sync: 2 Nov 28 03:01:48 localhost puppet-user[51410]: Skipped: 7 Nov 28 03:01:48 localhost puppet-user[51410]: Total: 9 Nov 28 03:01:48 localhost puppet-user[51410]: Time: Nov 28 03:01:48 localhost puppet-user[51410]: File: 0.01 Nov 28 03:01:48 localhost puppet-user[51410]: Cron: 0.01 Nov 28 03:01:48 localhost puppet-user[51410]: Transaction evaluation: 0.04 Nov 28 03:01:48 localhost puppet-user[51410]: Catalog application: 0.04 Nov 28 03:01:48 localhost puppet-user[51410]: Config retrieval: 0.10 Nov 28 03:01:48 localhost puppet-user[51410]: Last run: 1764316908 Nov 28 03:01:48 localhost puppet-user[51410]: Total: 0.04 Nov 28 03:01:48 localhost 
puppet-user[51410]: Version: Nov 28 03:01:48 localhost puppet-user[51410]: Config: 1764316908 Nov 28 03:01:48 localhost puppet-user[51410]: Puppet: 7.10.0 Nov 28 03:01:48 localhost puppet-user[51402]: Notice: Compiled catalog for np0005538513.localdomain in environment production in 0.13 seconds Nov 28 03:01:48 localhost puppet-user[51422]: Notice: Compiled catalog for np0005538513.localdomain in environment production in 0.11 seconds Nov 28 03:01:48 localhost puppet-user[51402]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/owner: owner changed 'qdrouterd' to 'root' Nov 28 03:01:48 localhost puppet-user[51402]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/group: group changed 'qdrouterd' to 'root' Nov 28 03:01:48 localhost puppet-user[51402]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/mode: mode changed '0700' to '0755' Nov 28 03:01:48 localhost puppet-user[51402]: Notice: /Stage[main]/Qdr::Config/File[/etc/qpid-dispatch/ssl]/ensure: created Nov 28 03:01:48 localhost puppet-user[51402]: Notice: /Stage[main]/Qdr::Config/File[qdrouterd.conf]/content: content changed '{sha256}89e10d8896247f992c5f0baf027c25a8ca5d0441be46d8859d9db2067ea74cd3' to '{sha256}780ab1440a8faf33b15233a48577851d7bee558b8306ab6e193d265286e7d4ed' Nov 28 03:01:48 localhost puppet-user[51402]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd]/ensure: created Nov 28 03:01:48 localhost puppet-user[51402]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd/metrics_qdr.log]/ensure: created Nov 28 03:01:48 localhost puppet-user[51402]: Notice: Applied catalog in 0.03 seconds Nov 28 03:01:48 localhost puppet-user[51402]: Application: Nov 28 03:01:48 localhost puppet-user[51402]: Initial environment: production Nov 28 03:01:48 localhost puppet-user[51402]: Converged environment: production Nov 28 03:01:48 localhost puppet-user[51402]: Run mode: user Nov 28 03:01:48 localhost puppet-user[51402]: Changes: Nov 28 03:01:48 localhost puppet-user[51402]: 
Total: 7 Nov 28 03:01:48 localhost puppet-user[51402]: Events: Nov 28 03:01:48 localhost puppet-user[51402]: Success: 7 Nov 28 03:01:48 localhost puppet-user[51402]: Total: 7 Nov 28 03:01:48 localhost puppet-user[51402]: Resources: Nov 28 03:01:48 localhost puppet-user[51402]: Skipped: 13 Nov 28 03:01:48 localhost puppet-user[51402]: Changed: 5 Nov 28 03:01:48 localhost puppet-user[51402]: Out of sync: 5 Nov 28 03:01:48 localhost puppet-user[51402]: Total: 20 Nov 28 03:01:48 localhost puppet-user[51402]: Time: Nov 28 03:01:48 localhost puppet-user[51402]: File: 0.01 Nov 28 03:01:48 localhost puppet-user[51402]: Transaction evaluation: 0.02 Nov 28 03:01:48 localhost puppet-user[51402]: Catalog application: 0.03 Nov 28 03:01:48 localhost puppet-user[51402]: Config retrieval: 0.16 Nov 28 03:01:48 localhost puppet-user[51402]: Last run: 1764316908 Nov 28 03:01:48 localhost puppet-user[51402]: Total: 0.03 Nov 28 03:01:48 localhost puppet-user[51402]: Version: Nov 28 03:01:48 localhost puppet-user[51402]: Config: 1764316908 Nov 28 03:01:48 localhost puppet-user[51402]: Puppet: 7.10.0 Nov 28 03:01:48 localhost puppet-user[51422]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[reset-iscsi-initiator-name]/returns: executed successfully Nov 28 03:01:48 localhost puppet-user[51422]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/File[/etc/iscsi/.initiator_reset]/ensure: created Nov 28 03:01:48 localhost puppet-user[51359]: Notice: Compiled catalog for np0005538513.localdomain in environment production in 0.38 seconds Nov 28 03:01:48 localhost puppet-user[51422]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[sync-iqn-to-host]/returns: executed successfully Nov 28 03:01:48 localhost puppet-user[51364]: Warning: Scope(Class[Nova]): The os_region_name parameter is deprecated and will be removed \ Nov 28 03:01:48 localhost puppet-user[51364]: in a future release. 
Use nova::cinder::os_region_name instead Nov 28 03:01:48 localhost puppet-user[51364]: Warning: Scope(Class[Nova]): The catalog_info parameter is deprecated and will be removed \ Nov 28 03:01:48 localhost puppet-user[51364]: in a future release. Use nova::cinder::catalog_info instead Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/content: content changed '{sha256}aea388a73ebafc7e07a81ddb930a91099211f660eee55fbf92c13007a77501e5' to '{sha256}2523d01ee9c3022c0e9f61d896b1474a168e18472aee141cc278e69fe13f41c1' Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/owner: owner changed 'collectd' to 'root' Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/group: group changed 'collectd' to 'root' Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/mode: mode changed '0644' to '0640' Nov 28 03:01:48 localhost systemd[1]: libpod-49d1198114e3cf2834836d3b9377664b111beda2bfbba5e8342177144c6fdf6b.scope: Deactivated successfully. Nov 28 03:01:48 localhost systemd[1]: libpod-49d1198114e3cf2834836d3b9377664b111beda2bfbba5e8342177144c6fdf6b.scope: Consumed 2.084s CPU time. 
Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/owner: owner changed 'collectd' to 'root' Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/group: group changed 'collectd' to 'root' Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/mode: mode changed '0755' to '0750' Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-cpu.conf]/ensure: removed Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-interface.conf]/ensure: removed Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-load.conf]/ensure: removed Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-memory.conf]/ensure: removed Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-syslog.conf]/ensure: removed Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/apache.conf]/ensure: removed Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/dns.conf]/ensure: removed Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ipmi.conf]/ensure: removed Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mcelog.conf]/ensure: removed Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mysql.conf]/ensure: removed Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-events.conf]/ensure: removed Nov 28 
03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-stats.conf]/ensure: removed Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ping.conf]/ensure: removed Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/pmu.conf]/ensure: removed Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/rdt.conf]/ensure: removed Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/sensors.conf]/ensure: removed Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/snmp.conf]/ensure: removed Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/write_prometheus.conf]/ensure: removed Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Python/File[/usr/lib/python3.9/site-packages]/mode: mode changed '0755' to '0750' Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Python/Collectd::Plugin[python]/File[python.load]/ensure: defined content as '{sha256}0163924a0099dd43fe39cb85e836df147fd2cfee8197dc6866d3c384539eb6ee' Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Python/Concat[/etc/collectd.d/python-config.conf]/File[/etc/collectd.d/python-config.conf]/ensure: defined content as '{sha256}2e5fb20e60b30f84687fc456a37fc62451000d2d85f5bbc1b3fca3a5eac9deeb' Nov 28 03:01:48 localhost puppet-user[51364]: Warning: Unknown variable: '::nova::compute::verify_glance_signatures'. 
(file: /etc/puppet/modules/nova/manifests/glance.pp, line: 62, column: 41) Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Logfile/Collectd::Plugin[logfile]/File[logfile.load]/ensure: defined content as '{sha256}07bbda08ef9b824089500bdc6ac5a86e7d1ef2ae3ed4ed423c0559fe6361e5af' Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Amqp1/Collectd::Plugin[amqp1]/File[amqp1.load]/ensure: defined content as '{sha256}dee3f10cb1ff461ac3f1e743a5ef3f06993398c6c829895de1dae7f242a64b39' Nov 28 03:01:48 localhost podman[51872]: 2025-11-28 08:01:48.863676254 +0000 UTC m=+0.047150009 container died 49d1198114e3cf2834836d3b9377664b111beda2bfbba5e8342177144c6fdf6b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 
'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, version=17.1.12, build-date=2025-11-18T22:49:32Z, container_name=container-puppet-crond, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Ceph/Collectd::Plugin[ceph]/File[ceph.load]/ensure: defined content as '{sha256}c796abffda2e860875295b4fc11cc95c6032b4e13fa8fb128e839a305aa1676c' Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Cpu/Collectd::Plugin[cpu]/File[cpu.load]/ensure: defined content as '{sha256}67d4c8bf6bf5785f4cb6b596712204d9eacbcebbf16fe289907195d4d3cb0e34' Nov 28 03:01:48 localhost systemd[1]: tmp-crun.GS3uEZ.mount: Deactivated successfully. 
Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Df/Collectd::Plugin[df]/File[df.load]/ensure: defined content as '{sha256}edeb4716d96fc9dca2c6adfe07bae70ba08c6af3944a3900581cba0f08f3c4ba' Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Disk/Collectd::Plugin[disk]/File[disk.load]/ensure: defined content as '{sha256}1d0cb838278f3226fcd381f0fc2e0e1abaf0d590f4ba7bcb2fc6ec113d3ebde7' Nov 28 03:01:48 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-49d1198114e3cf2834836d3b9377664b111beda2bfbba5e8342177144c6fdf6b-userdata-shm.mount: Deactivated successfully. Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[hugepages.load]/ensure: defined content as '{sha256}9b9f35b65a73da8d4037e4355a23b678f2cf61997ccf7a5e1adf2a7ce6415827' Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[older_hugepages.load]/ensure: removed Nov 28 03:01:48 localhost systemd[1]: libpod-d33e206cd9e309107e146b933b6ae92c41f3aaab17f4f1c3587ef51309b905d4.scope: Deactivated successfully. Nov 28 03:01:48 localhost systemd[1]: libpod-d33e206cd9e309107e146b933b6ae92c41f3aaab17f4f1c3587ef51309b905d4.scope: Consumed 2.197s CPU time. Nov 28 03:01:48 localhost puppet-user[51364]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_base_images'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 44, column: 5) Nov 28 03:01:48 localhost puppet-user[51364]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_original_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 48, column: 5) Nov 28 03:01:48 localhost puppet-user[51364]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_resized_minimum_age_seconds'. 
(file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 52, column: 5) Nov 28 03:01:48 localhost puppet-user[51364]: Warning: Scope(Class[Tripleo::Profile::Base::Nova::Compute]): The keymgr_backend parameter has been deprecated Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Interface/Collectd::Plugin[interface]/File[interface.load]/ensure: defined content as '{sha256}b76b315dc312e398940fe029c6dbc5c18d2b974ff7527469fc7d3617b5222046' Nov 28 03:01:48 localhost podman[51872]: 2025-11-28 08:01:48.975347724 +0000 UTC m=+0.158821479 container cleanup 49d1198114e3cf2834836d3b9377664b111beda2bfbba5e8342177144c6fdf6b (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, container_name=container-puppet-crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, vcs-type=git, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_puppet_step1, version=17.1.12) Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Load/Collectd::Plugin[load]/File[load.load]/ensure: defined content as '{sha256}af2403f76aebd2f10202d66d2d55e1a8d987eed09ced5a3e3873a4093585dc31' Nov 28 03:01:48 localhost puppet-user[51364]: Warning: Scope(Class[Nova::Compute]): vcpu_pin_set is deprecated, instead use cpu_dedicated_set or cpu_shared_set. Nov 28 03:01:48 localhost puppet-user[51364]: Warning: Scope(Class[Nova::Compute]): verify_glance_signatures is deprecated. 
Use the same parameter in nova::glance Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Memory/Collectd::Plugin[memory]/File[memory.load]/ensure: defined content as '{sha256}0f270425ee6b05fc9440ee32b9afd1010dcbddd9b04ca78ff693858f7ecb9d0e' Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Unixsock/Collectd::Plugin[unixsock]/File[unixsock.load]/ensure: defined content as '{sha256}9d1ec1c51ba386baa6f62d2e019dbd6998ad924bf868b3edc2d24d3dc3c63885' Nov 28 03:01:48 localhost systemd[1]: libpod-conmon-49d1198114e3cf2834836d3b9377664b111beda2bfbba5e8342177144c6fdf6b.scope: Deactivated successfully. Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Uptime/Collectd::Plugin[uptime]/File[uptime.load]/ensure: defined content as '{sha256}f7a26c6369f904d0ca1af59627ebea15f5e72160bcacdf08d217af282b42e5c0' Nov 28 03:01:48 localhost python3[51054]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-crond --conmon-pidfile /run/container-puppet-crond.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005538513 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=crond --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::logging::logrotate --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-crond --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 
'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-crond.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Nov 28 03:01:48 localhost puppet-user[51359]: Notice: 
/Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[virt.load]/ensure: defined content as '{sha256}9a2bcf913f6bf8a962a0ff351a9faea51ae863cc80af97b77f63f8ab68941c62' Nov 28 03:01:48 localhost puppet-user[51359]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[older_virt.load]/ensure: removed Nov 28 03:01:49 localhost puppet-user[51359]: Notice: Applied catalog in 0.33 seconds Nov 28 03:01:49 localhost puppet-user[51359]: Application: Nov 28 03:01:49 localhost puppet-user[51359]: Initial environment: production Nov 28 03:01:49 localhost puppet-user[51359]: Converged environment: production Nov 28 03:01:49 localhost puppet-user[51359]: Run mode: user Nov 28 03:01:49 localhost puppet-user[51359]: Changes: Nov 28 03:01:49 localhost puppet-user[51359]: Total: 43 Nov 28 03:01:49 localhost puppet-user[51359]: Events: Nov 28 03:01:49 localhost puppet-user[51359]: Success: 43 Nov 28 03:01:49 localhost puppet-user[51359]: Total: 43 Nov 28 03:01:49 localhost puppet-user[51359]: Resources: Nov 28 03:01:49 localhost puppet-user[51359]: Skipped: 14 Nov 28 03:01:49 localhost puppet-user[51359]: Changed: 38 Nov 28 03:01:49 localhost puppet-user[51359]: Out of sync: 38 Nov 28 03:01:49 localhost puppet-user[51359]: Total: 82 Nov 28 03:01:49 localhost puppet-user[51359]: Time: Nov 28 03:01:49 localhost puppet-user[51359]: File: 0.17 Nov 28 03:01:49 localhost puppet-user[51359]: Transaction evaluation: 0.32 Nov 28 03:01:49 localhost puppet-user[51359]: Catalog application: 0.33 Nov 28 03:01:49 localhost puppet-user[51359]: Config retrieval: 0.46 Nov 28 03:01:49 localhost puppet-user[51359]: Last run: 1764316909 Nov 28 03:01:49 localhost puppet-user[51359]: Concat file: 0.00 Nov 28 03:01:49 localhost puppet-user[51359]: Concat fragment: 0.00 Nov 28 03:01:49 localhost puppet-user[51359]: Total: 0.33 Nov 28 03:01:49 localhost puppet-user[51359]: Version: Nov 28 03:01:49 localhost puppet-user[51359]: Config: 1764316908 Nov 28 03:01:49 localhost 
puppet-user[51359]: Puppet: 7.10.0 Nov 28 03:01:49 localhost puppet-user[51422]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Augeas[chap_algs in /etc/iscsi/iscsid.conf]/returns: executed successfully Nov 28 03:01:49 localhost puppet-user[51422]: Notice: Applied catalog in 0.50 seconds Nov 28 03:01:49 localhost puppet-user[51422]: Application: Nov 28 03:01:49 localhost puppet-user[51422]: Initial environment: production Nov 28 03:01:49 localhost puppet-user[51422]: Converged environment: production Nov 28 03:01:49 localhost puppet-user[51422]: Run mode: user Nov 28 03:01:49 localhost puppet-user[51422]: Changes: Nov 28 03:01:49 localhost puppet-user[51422]: Total: 4 Nov 28 03:01:49 localhost puppet-user[51422]: Events: Nov 28 03:01:49 localhost puppet-user[51422]: Success: 4 Nov 28 03:01:49 localhost puppet-user[51422]: Total: 4 Nov 28 03:01:49 localhost puppet-user[51422]: Resources: Nov 28 03:01:49 localhost puppet-user[51422]: Changed: 4 Nov 28 03:01:49 localhost puppet-user[51422]: Out of sync: 4 Nov 28 03:01:49 localhost puppet-user[51422]: Skipped: 8 Nov 28 03:01:49 localhost puppet-user[51422]: Total: 13 Nov 28 03:01:49 localhost puppet-user[51422]: Time: Nov 28 03:01:49 localhost puppet-user[51422]: File: 0.00 Nov 28 03:01:49 localhost puppet-user[51422]: Exec: 0.06 Nov 28 03:01:49 localhost puppet-user[51422]: Config retrieval: 0.14 Nov 28 03:01:49 localhost puppet-user[51422]: Augeas: 0.42 Nov 28 03:01:49 localhost puppet-user[51422]: Transaction evaluation: 0.50 Nov 28 03:01:49 localhost puppet-user[51422]: Catalog application: 0.50 Nov 28 03:01:49 localhost puppet-user[51422]: Last run: 1764316909 Nov 28 03:01:49 localhost puppet-user[51422]: Total: 0.50 Nov 28 03:01:49 localhost puppet-user[51422]: Version: Nov 28 03:01:49 localhost puppet-user[51422]: Config: 1764316908 Nov 28 03:01:49 localhost puppet-user[51422]: Puppet: 7.10.0 Nov 28 03:01:49 localhost podman[51904]: 2025-11-28 08:01:49.062870406 +0000 UTC m=+0.077133047 container died 
d33e206cd9e309107e146b933b6ae92c41f3aaab17f4f1c3587ef51309b905d4 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=container-puppet-metrics_qdr, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_puppet_step1, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:01:49 localhost podman[51904]: 2025-11-28 08:01:49.092835796 +0000 UTC m=+0.107098397 container cleanup d33e206cd9e309107e146b933b6ae92c41f3aaab17f4f1c3587ef51309b905d4 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=container-puppet-metrics_qdr, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_puppet_step1) Nov 28 03:01:49 localhost systemd[1]: libpod-conmon-d33e206cd9e309107e146b933b6ae92c41f3aaab17f4f1c3587ef51309b905d4.scope: Deactivated successfully. 
Nov 28 03:01:49 localhost python3[51054]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-metrics_qdr --conmon-pidfile /run/container-puppet-metrics_qdr.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005538513 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=metrics_qdr --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::metrics::qdr#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-metrics_qdr --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file 
--log-opt path=/var/log/containers/stdouts/container-puppet-metrics_qdr.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Nov 28 03:01:49 localhost systemd[1]: var-lib-containers-storage-overlay-aa2c8f8913bff67439dab4aeb6db6e2056909e99c44d0de88f7837d07d6c8847-merged.mount: Deactivated successfully. Nov 28 03:01:49 localhost systemd[1]: var-lib-containers-storage-overlay-29b4c40b440520fe09ba18ff760c1e8bcaccec35e4300eedc14527806169da1c-merged.mount: Deactivated successfully. Nov 28 03:01:49 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d33e206cd9e309107e146b933b6ae92c41f3aaab17f4f1c3587ef51309b905d4-userdata-shm.mount: Deactivated successfully. Nov 28 03:01:49 localhost puppet-user[51364]: Warning: Scope(Class[Nova::Compute::Libvirt]): nova::compute::libvirt::images_type will be required if rbd ephemeral storage is used. Nov 28 03:01:49 localhost systemd[1]: libpod-6209037692ad6a27fd5ca43e0f8d40701d26de0335f5c337009fe96a996c7eb5.scope: Deactivated successfully. 
Nov 28 03:01:49 localhost systemd[1]: libpod-6209037692ad6a27fd5ca43e0f8d40701d26de0335f5c337009fe96a996c7eb5.scope: Consumed 2.542s CPU time. Nov 28 03:01:49 localhost podman[51290]: 2025-11-28 08:01:49.364272533 +0000 UTC m=+4.123297971 container died 6209037692ad6a27fd5ca43e0f8d40701d26de0335f5c337009fe96a996c7eb5 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, config_id=tripleo_puppet_step1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=container-puppet-iscsid, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:01:49 localhost systemd[1]: libpod-a788096a7b720f3740c34aa5bb7ae69f2852b290197c68c46688dd4c33a52360.scope: Deactivated successfully. Nov 28 03:01:49 localhost systemd[1]: libpod-a788096a7b720f3740c34aa5bb7ae69f2852b290197c68c46688dd4c33a52360.scope: Consumed 2.719s CPU time. 
Nov 28 03:01:49 localhost podman[51209]: 2025-11-28 08:01:49.375316199 +0000 UTC m=+4.246172462 container died a788096a7b720f3740c34aa5bb7ae69f2852b290197c68c46688dd4c33a52360 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, version=17.1.12, tcib_managed=true, io.openshift.expose-services=, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, container_name=container-puppet-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Nov 28 03:01:49 localhost systemd[1]: tmp-crun.V8VMN7.mount: Deactivated successfully. Nov 28 03:01:49 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6209037692ad6a27fd5ca43e0f8d40701d26de0335f5c337009fe96a996c7eb5-userdata-shm.mount: Deactivated successfully. 
Nov 28 03:01:49 localhost podman[52090]: 2025-11-28 08:01:49.482185869 +0000 UTC m=+0.108684928 container cleanup 6209037692ad6a27fd5ca43e0f8d40701d26de0335f5c337009fe96a996c7eb5 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.12, release=1761123044, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=container-puppet-iscsid, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 28 03:01:49 localhost systemd[1]: libpod-conmon-6209037692ad6a27fd5ca43e0f8d40701d26de0335f5c337009fe96a996c7eb5.scope: Deactivated successfully. Nov 28 03:01:49 localhost python3[51054]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-iscsid --conmon-pidfile /run/container-puppet-iscsid.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005538513 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,iscsid_config --env NAME=iscsid --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::iscsid#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-iscsid --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 
'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-iscsid.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/iscsi:/tmp/iscsi.host:z --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume 
/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Nov 28 03:01:49 localhost podman[52110]: 2025-11-28 08:01:49.507228944 +0000 UTC m=+0.107266293 container create d50eb1873bbbfb25c378d8d09e1b58b68df7de1660980d5d6de8b3858f88a068 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, container_name=container-puppet-ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, distribution-scope=public, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 
'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}) Nov 28 03:01:49 localhost systemd[1]: Started libpod-conmon-d50eb1873bbbfb25c378d8d09e1b58b68df7de1660980d5d6de8b3858f88a068.scope. 
Nov 28 03:01:49 localhost podman[52110]: 2025-11-28 08:01:49.44298505 +0000 UTC m=+0.043022399 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Nov 28 03:01:49 localhost podman[52118]: 2025-11-28 08:01:49.548152467 +0000 UTC m=+0.136602123 container create 01fcb2b9a34ed7002e2c1ab9793a6c6e2f4640510df7d4a54d984a2cebb71adf (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, name=rhosp17/openstack-rsyslog, url=https://www.redhat.com, architecture=x86_64, build-date=2025-11-18T22:49:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, container_name=container-puppet-rsyslog, config_id=tripleo_puppet_step1, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, com.redhat.component=openstack-rsyslog-container, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, release=1761123044, vendor=Red Hat, Inc.) Nov 28 03:01:49 localhost systemd[1]: Started libcrun container. 
Nov 28 03:01:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4619bcf03a81c0f8085ac49d4104b51527a7e62eb19e6e3702042c0e09f1d20b/merged/etc/sysconfig/modules supports timestamps until 2038 (0x7fffffff) Nov 28 03:01:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4619bcf03a81c0f8085ac49d4104b51527a7e62eb19e6e3702042c0e09f1d20b/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Nov 28 03:01:49 localhost podman[52110]: 2025-11-28 08:01:49.571493328 +0000 UTC m=+0.171530717 container init d50eb1873bbbfb25c378d8d09e1b58b68df7de1660980d5d6de8b3858f88a068 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, container_name=container-puppet-ovn_controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_puppet_step1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc.) 
Nov 28 03:01:49 localhost podman[52110]: 2025-11-28 08:01:49.579987744 +0000 UTC m=+0.180025133 container start d50eb1873bbbfb25c378d8d09e1b58b68df7de1660980d5d6de8b3858f88a068 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=container-puppet-ovn_controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_id=tripleo_puppet_step1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:01:49 localhost podman[52110]: 2025-11-28 08:01:49.580457849 +0000 UTC m=+0.180495288 container attach d50eb1873bbbfb25c378d8d09e1b58b68df7de1660980d5d6de8b3858f88a068 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=container-puppet-ovn_controller, config_id=tripleo_puppet_step1, version=17.1.12, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z) Nov 28 03:01:49 localhost podman[52118]: 2025-11-28 08:01:49.482167598 +0000 UTC m=+0.070617264 image pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Nov 28 03:01:49 localhost systemd[1]: Started libpod-conmon-01fcb2b9a34ed7002e2c1ab9793a6c6e2f4640510df7d4a54d984a2cebb71adf.scope. Nov 28 03:01:49 localhost systemd[1]: Started libcrun container. Nov 28 03:01:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc098275d7f03b1c1fd504750e41e859fc80e308322902c4e50d7df8300f206a/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Nov 28 03:01:49 localhost podman[52100]: 2025-11-28 08:01:49.630165707 +0000 UTC m=+0.241924144 container cleanup a788096a7b720f3740c34aa5bb7ae69f2852b290197c68c46688dd4c33a52360 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, container_name=container-puppet-collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': 
['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, name=rhosp17/openstack-collectd, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Nov 28 03:01:49 localhost systemd[1]: libpod-conmon-a788096a7b720f3740c34aa5bb7ae69f2852b290197c68c46688dd4c33a52360.scope: Deactivated successfully. 
Nov 28 03:01:49 localhost python3[51054]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-collectd --conmon-pidfile /run/container-puppet-collectd.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005538513 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,collectd_client_config,exec --env NAME=collectd --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::metrics::collectd --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-collectd --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-collectd.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Nov 28 03:01:49 localhost podman[52118]: 2025-11-28 08:01:49.683279972 +0000 UTC m=+0.271729628 container init 01fcb2b9a34ed7002e2c1ab9793a6c6e2f4640510df7d4a54d984a2cebb71adf (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, version=17.1.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.component=openstack-rsyslog-container, build-date=2025-11-18T22:49:49Z, name=rhosp17/openstack-rsyslog, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, url=https://www.redhat.com, release=1761123044, 
io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, container_name=container-puppet-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_puppet_step1) Nov 28 03:01:49 localhost podman[52118]: 2025-11-28 
08:01:49.691888801 +0000 UTC m=+0.280338457 container start 01fcb2b9a34ed7002e2c1ab9793a6c6e2f4640510df7d4a54d984a2cebb71adf (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, name=rhosp17/openstack-rsyslog, url=https://www.redhat.com, config_id=tripleo_puppet_step1, io.buildah.version=1.41.4, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, com.redhat.component=openstack-rsyslog-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, distribution-scope=public, batch=17.1_20251118.1, release=1761123044, container_name=container-puppet-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:01:49 localhost podman[52118]: 2025-11-28 08:01:49.692079397 +0000 UTC m=+0.280529053 container attach 01fcb2b9a34ed7002e2c1ab9793a6c6e2f4640510df7d4a54d984a2cebb71adf (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_puppet_step1, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:49Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, io.buildah.version=1.41.4, container_name=container-puppet-rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-rsyslog, version=17.1.12, com.redhat.component=openstack-rsyslog-container, url=https://www.redhat.com) Nov 28 03:01:49 localhost puppet-user[51364]: Notice: Compiled catalog for np0005538513.localdomain in environment production in 1.39 seconds Nov 28 03:01:49 localhost puppet-user[51508]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. 
It should be converted to version 5 Nov 28 03:01:49 localhost puppet-user[51508]: (file: /etc/puppet/hiera.yaml) Nov 28 03:01:49 localhost puppet-user[51508]: Warning: Undefined variable '::deploy_config_name'; Nov 28 03:01:49 localhost puppet-user[51508]: (file & line not available) Nov 28 03:01:49 localhost puppet-user[51508]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 28 03:01:49 localhost puppet-user[51508]: (file & line not available) Nov 28 03:01:50 localhost puppet-user[51508]: Warning: Unknown variable: '::ceilometer::cache_backend'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 145, column: 39) Nov 28 03:01:50 localhost puppet-user[51508]: Warning: Unknown variable: '::ceilometer::memcache_servers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 146, column: 39) Nov 28 03:01:50 localhost puppet-user[51508]: Warning: Unknown variable: '::ceilometer::cache_tls_enabled'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 147, column: 39) Nov 28 03:01:50 localhost puppet-user[51508]: Warning: Unknown variable: '::ceilometer::cache_tls_cafile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 148, column: 39) Nov 28 03:01:50 localhost puppet-user[51508]: Warning: Unknown variable: '::ceilometer::cache_tls_certfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 149, column: 39) Nov 28 03:01:50 localhost puppet-user[51508]: Warning: Unknown variable: '::ceilometer::cache_tls_keyfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 150, column: 39) Nov 28 03:01:50 localhost puppet-user[51508]: Warning: Unknown variable: '::ceilometer::cache_tls_allowed_ciphers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 151, column: 39) Nov 28 03:01:50 localhost puppet-user[51508]: Warning: Unknown variable: '::ceilometer::manage_backend_package'. 
(file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 152, column: 39) Nov 28 03:01:50 localhost puppet-user[51508]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_password'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 63, column: 25) Nov 28 03:01:50 localhost puppet-user[51508]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_url'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 68, column: 25) Nov 28 03:01:50 localhost puppet-user[51508]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_region'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 69, column: 28) Nov 28 03:01:50 localhost puppet-user[51508]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 70, column: 25) Nov 28 03:01:50 localhost puppet-user[51508]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_tenant_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 71, column: 29) Nov 28 03:01:50 localhost puppet-user[51508]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_cacert'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 72, column: 23) Nov 28 03:01:50 localhost puppet-user[51508]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_endpoint_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 73, column: 26) Nov 28 03:01:50 localhost puppet-user[51508]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 74, column: 33) Nov 28 03:01:50 localhost puppet-user[51508]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_project_domain_name'. 
(file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 75, column: 36) Nov 28 03:01:50 localhost puppet-user[51508]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 76, column: 26) Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File[/etc/nova/migration/identity]/content: content changed '{sha256}86610d84e745a3992358ae0b747297805d075492e5114c666fa08f8aecce7da0' to '{sha256}e8f4c9c311633f219a6b4c8a97d1389467ae0d86e6640d015eb10a4c73ac6b8b' Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File_line[nova_ssh_port]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/File[/etc/sasl2/libvirt.conf]/content: content changed '{sha256}78510a0d6f14b269ddeb9f9638dfdfba9f976d370ee2ec04ba25352a8af6df35' to '{sha256}6d7bcae773217a30c0772f75d0d1b6d21f5d64e72853f5e3d91bb47799dbb7fe' Nov 28 03:01:50 localhost puppet-user[51364]: Warning: Empty environment setting 'TLS_PASSWORD' Nov 28 03:01:50 localhost puppet-user[51364]: (file: /etc/puppet/modules/tripleo/manifests/profile/base/nova/libvirt.pp, line: 182) Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/Exec[set libvirt sasl credentials]/returns: executed successfully Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File[/etc/nova/migration/authorized_keys]/content: content changed '{sha256}0d05a8832f36c0517b84e9c3ad11069d531c7d2be5297661e5552fd29e3a5e47' to '{sha256}ae9c4ab6bedd07e63d6f2c3a5743334d26ea3ed4d1f695ab855f72927fdb71bc' Nov 28 03:01:50 localhost puppet-user[51364]: Notice: 
/Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File_line[nova_migration_logindefs]/ensure: created Nov 28 03:01:50 localhost systemd[1]: var-lib-containers-storage-overlay-abe1c4a3f68b93d18940fc38699bd560bf3f77db8c7795c1004a1bfa9999fcdb-merged.mount: Deactivated successfully. Nov 28 03:01:50 localhost systemd[1]: var-lib-containers-storage-overlay-c554e4db742c3febdfb49af667b5d853d210243bf4484c52c41efc08d328831f-merged.mount: Deactivated successfully. Nov 28 03:01:50 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a788096a7b720f3740c34aa5bb7ae69f2852b290197c68c46688dd4c33a52360-userdata-shm.mount: Deactivated successfully. Nov 28 03:01:50 localhost puppet-user[51508]: Notice: Compiled catalog for np0005538513.localdomain in environment production in 0.36 seconds Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/never_download_image_if_on_rbd]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/disable_compute_service_check_for_ffu]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ssl_only]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/my_ip]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/http_timeout]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/host]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[publisher/telemetry_secret]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/host]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: 
/Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_name]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_password]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_url]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/region_name]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/username]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/password]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/cpu_allocation_ratio]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_name]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/interface]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ram_allocation_ratio]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/user_domain_name]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_domain_name]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: 
/Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_type]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/disk_allocation_ratio]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[compute/instance_discovery_method]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[DEFAULT/polling_namespaces]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/dhcp_domain]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[polling/tenant_name_discovery]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[coordination/backend_url]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova/Nova_config[vif_plug_ovs/ovsdb_connection]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova/Nova_config[notifications/notification_format]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/backend]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/enabled]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/state_path]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/memcache_servers]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: 
/Stage[main]/Nova/Nova_config[DEFAULT/service_down_time]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/tls_enabled]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/rootwrap_config]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/report_interval]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Rabbit[ceilometer_config]/Ceilometer_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/rpc_address_prefix]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/notify_address_prefix]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova/Nova_config[notifications/notify_on_state_change]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova/Nova_config[cinder/cross_az_attach]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Glance/Nova_config[glance/valid_interfaces]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/driver]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/transport_url]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: 
/Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/topics]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Default[ceilometer_config]/Ceilometer_config[DEFAULT/transport_url]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/debug]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/log_dir]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_type]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_url]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Notice: Applied catalog in 0.41 seconds Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/password]/ensure: created Nov 28 03:01:50 localhost puppet-user[51508]: Application: Nov 28 03:01:50 localhost puppet-user[51508]: Initial environment: production Nov 28 03:01:50 localhost puppet-user[51508]: Converged environment: production Nov 28 03:01:50 localhost puppet-user[51508]: Run mode: user Nov 28 03:01:50 localhost puppet-user[51508]: Changes: Nov 28 03:01:50 localhost puppet-user[51508]: Total: 31 Nov 28 03:01:50 localhost puppet-user[51508]: Events: Nov 28 03:01:50 localhost puppet-user[51508]: Success: 31 Nov 28 03:01:50 localhost puppet-user[51508]: Total: 31 Nov 28 03:01:50 localhost puppet-user[51508]: Resources: Nov 28 03:01:50 localhost puppet-user[51508]: Skipped: 22 Nov 28 03:01:50 localhost puppet-user[51508]: Changed: 31 Nov 28 03:01:50 localhost puppet-user[51508]: Out of sync: 31 Nov 28 03:01:50 localhost puppet-user[51508]: Total: 151 Nov 28 
03:01:50 localhost puppet-user[51508]: Time: Nov 28 03:01:50 localhost puppet-user[51508]: Package: 0.03 Nov 28 03:01:50 localhost puppet-user[51508]: Ceilometer config: 0.32 Nov 28 03:01:50 localhost puppet-user[51508]: Transaction evaluation: 0.40 Nov 28 03:01:50 localhost puppet-user[51508]: Catalog application: 0.41 Nov 28 03:01:50 localhost puppet-user[51508]: Config retrieval: 0.43 Nov 28 03:01:50 localhost puppet-user[51508]: Last run: 1764316910 Nov 28 03:01:50 localhost puppet-user[51508]: Resources: 0.00 Nov 28 03:01:50 localhost puppet-user[51508]: Total: 0.41 Nov 28 03:01:50 localhost puppet-user[51508]: Version: Nov 28 03:01:50 localhost puppet-user[51508]: Config: 1764316909 Nov 28 03:01:50 localhost puppet-user[51508]: Puppet: 7.10.0 Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_domain_name]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_name]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/user_domain_name]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/username]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/region_name]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/valid_interfaces]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/password]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_type]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_url]/ensure: created Nov 28 03:01:50 localhost 
puppet-user[51364]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/region_name]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_name]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_domain_name]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/username]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/user_domain_name]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/os_region_name]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/catalog_info]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/manager_interval]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_base_images]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_original_minimum_age_seconds]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_resized_minimum_age_seconds]/ensure: created Nov 28 03:01:50 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/precache_concurrency]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/project_domain_name]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: 
/Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/user_domain_name]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Provider/Nova_config[compute/provider_config_location]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Provider/File[/etc/nova/provider_config]/ensure: created Nov 28 03:01:51 localhost systemd[1]: libpod-e9e81dc4d463c3e33e320bcac7f504711c3df69a454f9c676b6265b24fcfa267.scope: Deactivated successfully. Nov 28 03:01:51 localhost systemd[1]: libpod-e9e81dc4d463c3e33e320bcac7f504711c3df69a454f9c676b6265b24fcfa267.scope: Consumed 2.995s CPU time. Nov 28 03:01:51 localhost podman[51465]: 2025-11-28 08:01:51.138628684 +0000 UTC m=+3.480027610 container died e9e81dc4d463c3e33e320bcac7f504711c3df69a454f9c676b6265b24fcfa267 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, name=rhosp17/openstack-ceilometer-central, distribution-scope=public, container_name=container-puppet-ceilometer, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-central, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, build-date=2025-11-19T00:11:59Z, com.redhat.component=openstack-ceilometer-central-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central) Nov 28 03:01:51 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e9e81dc4d463c3e33e320bcac7f504711c3df69a454f9c676b6265b24fcfa267-userdata-shm.mount: Deactivated successfully. 
Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/use_cow_images]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/mkisofs_cmd]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/force_raw_images]/ensure: created Nov 28 03:01:51 localhost systemd[1]: var-lib-containers-storage-overlay-0de61d8517bf2f9ac2b1f4b59bb4938a9bcb5e653aa4685b7d899b570eb2d566-merged.mount: Deactivated successfully. Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_host_memory_mb]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_huge_pages]/ensure: created Nov 28 03:01:51 localhost podman[52382]: 2025-11-28 08:01:51.284165365 +0000 UTC m=+0.132587646 container cleanup e9e81dc4d463c3e33e320bcac7f504711c3df69a454f9c676b6265b24fcfa267 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, build-date=2025-11-19T00:11:59Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': 
['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 ceilometer-central, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-ceilometer-central-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, name=rhosp17/openstack-ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=container-puppet-ceilometer, io.openshift.expose-services=, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central) Nov 28 03:01:51 localhost systemd[1]: libpod-conmon-e9e81dc4d463c3e33e320bcac7f504711c3df69a454f9c676b6265b24fcfa267.scope: 
Deactivated successfully. Nov 28 03:01:51 localhost python3[51054]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ceilometer --conmon-pidfile /run/container-puppet-ceilometer.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005538513 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config --env NAME=ceilometer --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::ceilometer::agent::polling#012include tripleo::profile::base::ceilometer::agent::polling#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ceilometer --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ceilometer.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1 Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/resume_guests_state_on_host_boot]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute/Nova_config[key_manager/backend]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/sync_power_state_interval]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/consecutive_build_service_disable_threshold]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: 
/Stage[main]/Nova::Compute/Nova_config[compute/live_migration_wait_for_vif_plug]/ensure: created Nov 28 03:01:51 localhost puppet-user[52177]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Nov 28 03:01:51 localhost puppet-user[52177]: (file: /etc/puppet/hiera.yaml) Nov 28 03:01:51 localhost puppet-user[52177]: Warning: Undefined variable '::deploy_config_name'; Nov 28 03:01:51 localhost puppet-user[52177]: (file & line not available) Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/max_disk_devices_to_attach]/ensure: created Nov 28 03:01:51 localhost puppet-user[52177]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 28 03:01:51 localhost puppet-user[52177]: (file & line not available) Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Vncproxy::Common/Nova_config[vnc/novncproxy_base_url]/ensure: created Nov 28 03:01:51 localhost puppet-user[52233]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Nov 28 03:01:51 localhost puppet-user[52233]: (file: /etc/puppet/hiera.yaml) Nov 28 03:01:51 localhost puppet-user[52233]: Warning: Undefined variable '::deploy_config_name'; Nov 28 03:01:51 localhost puppet-user[52233]: (file & line not available) Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/server_proxyclient_address]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/enabled]/ensure: created Nov 28 03:01:51 localhost puppet-user[52233]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. 
See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 28 03:01:51 localhost puppet-user[52233]: (file & line not available) Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute/Nova_config[spice/enabled]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit_period]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_is_fatal]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_timeout]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/default_floating_pool]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/timeout]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_name]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_domain_name]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/region_name]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/username]/ensure: created Nov 28 03:01:51 localhost puppet-user[52177]: Notice: Compiled catalog for np0005538513.localdomain in environment production in 0.25 seconds Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/user_domain_name]/ensure: created Nov 28 03:01:51 
localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/password]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_url]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/valid_interfaces]/ensure: created Nov 28 03:01:51 localhost puppet-user[52233]: Notice: Compiled catalog for np0005538513.localdomain in environment production in 0.22 seconds Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/ovs_bridge]/ensure: created Nov 28 03:01:51 localhost ovs-vsctl[52536]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote=tcp:172.17.0.103:6642,tcp:172.17.0.104:6642,tcp:172.17.0.105:6642 Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/extension_sync_interval]/ensure: created Nov 28 03:01:51 localhost puppet-user[52177]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_type]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_uri]/ensure: created Nov 28 03:01:51 localhost ovs-vsctl[52538]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:ovn-encap-type=geneve Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_tunnelled]/ensure: created Nov 28 03:01:51 localhost puppet-user[52177]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-type]/ensure: created Nov 28 03:01:51 localhost ovs-vsctl[52540]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-ip=172.19.0.106 Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_inbound_addr]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_post_copy]/ensure: created Nov 28 03:01:51 localhost puppet-user[52177]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-ip]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_auto_converge]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tls]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tcp]/ensure: created Nov 28 03:01:51 localhost puppet-user[52233]: Notice: /Stage[main]/Rsyslog::Base/File[/etc/rsyslog.conf]/content: content changed '{sha256}d6f679f6a4eb6f33f9fc20c846cb30bef93811e1c86bc4da1946dc3100b826c3' to '{sha256}7963bd801fadd49a17561f4d3f80738c3f504b413b11c443432d8303138041f2' Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_user]/ensure: created Nov 28 03:01:51 localhost puppet-user[52233]: Notice: 
/Stage[main]/Rsyslog::Config::Global/Rsyslog::Component::Global_config[MaxMessageSize]/Rsyslog::Generate_concat[rsyslog::concat::global_config::MaxMessageSize]/Concat[/etc/rsyslog.d/00_rsyslog.conf]/File[/etc/rsyslog.d/00_rsyslog.conf]/ensure: defined content as '{sha256}a291d5cc6d5884a978161f4c7b5831d43edd07797cc590bae366e7f150b8643b' Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_secret_uuid]/ensure: created Nov 28 03:01:51 localhost ovs-vsctl[52543]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:hostname=np0005538513.localdomain Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Rbd/File[/etc/nova/secret.xml]/ensure: defined content as '{sha256}af00b55795dabd7a8ca15fb762e773701eb5c91ea4ae135b9bcdde564d7077dd' Nov 28 03:01:51 localhost puppet-user[52233]: Notice: /Stage[main]/Rsyslog::Config::Templates/Rsyslog::Component::Template[rsyslog-node-index]/Rsyslog::Generate_concat[rsyslog::concat::template::rsyslog-node-index]/Concat[/etc/rsyslog.d/50_openstack_logs.conf]/File[/etc/rsyslog.d/50_openstack_logs.conf]/ensure: defined content as '{sha256}bc8c213fdf58f8f987d47662b8c132319595d70e171ac4f45ccffbcd69fa92c7' Nov 28 03:01:51 localhost puppet-user[52177]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:hostname]/value: value changed 'np0005538513.novalocal' to 'np0005538513.localdomain' Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_type]/ensure: created Nov 28 03:01:51 localhost puppet-user[52233]: Notice: Applied catalog in 0.12 seconds Nov 28 03:01:51 localhost puppet-user[52233]: Application: Nov 28 03:01:51 localhost puppet-user[52233]: Initial environment: production Nov 28 03:01:51 localhost puppet-user[52233]: Converged environment: production Nov 28 03:01:51 localhost puppet-user[52233]: Run mode: user Nov 28 03:01:51 localhost puppet-user[52233]: 
Changes: Nov 28 03:01:51 localhost puppet-user[52233]: Total: 3 Nov 28 03:01:51 localhost puppet-user[52233]: Events: Nov 28 03:01:51 localhost puppet-user[52233]: Success: 3 Nov 28 03:01:51 localhost puppet-user[52233]: Total: 3 Nov 28 03:01:51 localhost puppet-user[52233]: Resources: Nov 28 03:01:51 localhost puppet-user[52233]: Skipped: 11 Nov 28 03:01:51 localhost puppet-user[52233]: Changed: 3 Nov 28 03:01:51 localhost puppet-user[52233]: Out of sync: 3 Nov 28 03:01:51 localhost puppet-user[52233]: Total: 25 Nov 28 03:01:51 localhost puppet-user[52233]: Time: Nov 28 03:01:51 localhost puppet-user[52233]: Concat file: 0.00 Nov 28 03:01:51 localhost puppet-user[52233]: Concat fragment: 0.00 Nov 28 03:01:51 localhost puppet-user[52233]: File: 0.02 Nov 28 03:01:51 localhost puppet-user[52233]: Transaction evaluation: 0.12 Nov 28 03:01:51 localhost puppet-user[52233]: Catalog application: 0.12 Nov 28 03:01:51 localhost puppet-user[52233]: Config retrieval: 0.27 Nov 28 03:01:51 localhost puppet-user[52233]: Last run: 1764316911 Nov 28 03:01:51 localhost puppet-user[52233]: Total: 0.12 Nov 28 03:01:51 localhost puppet-user[52233]: Version: Nov 28 03:01:51 localhost puppet-user[52233]: Config: 1764316911 Nov 28 03:01:51 localhost puppet-user[52233]: Puppet: 7.10.0 Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_pool]/ensure: created Nov 28 03:01:51 localhost ovs-vsctl[52545]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:ovn-bridge=br-int Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_ceph_conf]/ensure: created Nov 28 03:01:51 localhost puppet-user[52177]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_store_name]/ensure: created Nov 28 03:01:51 localhost ovs-vsctl[52547]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote-probe-interval=60000 Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_poll_interval]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_timeout]/ensure: created Nov 28 03:01:51 localhost puppet-user[52177]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote-probe-interval]/ensure: created Nov 28 03:01:51 localhost ovs-vsctl[52554]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-openflow-probe-interval=60 Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/compute_driver]/ensure: created Nov 28 03:01:51 localhost puppet-user[52177]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-openflow-probe-interval]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/preallocate_images]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[vnc/server_listen]/ensure: created Nov 28 03:01:51 localhost ovs-vsctl[52557]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:ovn-monitor-all=true Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/virt_type]/ensure: created Nov 28 03:01:51 localhost puppet-user[52177]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-monitor-all]/ensure: created Nov 28 03:01:51 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_mode]/ensure: created Nov 28 03:01:52 localhost ovs-vsctl[52562]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-ofctrl-wait-before-clear=8000 Nov 28 03:01:52 localhost puppet-user[52177]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-ofctrl-wait-before-clear]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_password]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_key]/ensure: created Nov 28 03:01:52 localhost ovs-vsctl[52564]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-tos=0 Nov 28 03:01:52 localhost puppet-user[52177]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-tos]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_partition]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_disk_discard]/ensure: created Nov 28 03:01:52 localhost ovs-vsctl[52572]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:ovn-chassis-mac-mappings=datacentre:fa:16:3e:ab:c7:63 Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_machine_type]/ensure: created Nov 28 03:01:52 localhost puppet-user[52177]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-chassis-mac-mappings]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/enabled_perf_events]/ensure: created Nov 28 03:01:52 localhost ovs-vsctl[52582]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge-mappings=datacentre:br-ex Nov 28 03:01:52 localhost puppet-user[52177]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge-mappings]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/rx_queue_size]/ensure: created Nov 28 03:01:52 localhost ovs-vsctl[52584]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-match-northd-version=false Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/tx_queue_size]/ensure: created Nov 28 03:01:52 localhost puppet-user[52177]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-match-northd-version]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/file_backed_memory]/ensure: created Nov 28 03:01:52 localhost ovs-vsctl[52591]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:garp-max-timeout-sec=0 Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/volume_use_multipath]/ensure: created Nov 28 03:01:52 localhost puppet-user[52177]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:garp-max-timeout-sec]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/num_pcie_ports]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/mem_stats_period_seconds]/ensure: created Nov 28 03:01:52 localhost systemd[1]: libpod-01fcb2b9a34ed7002e2c1ab9793a6c6e2f4640510df7d4a54d984a2cebb71adf.scope: Deactivated successfully. Nov 28 03:01:52 localhost systemd[1]: libpod-01fcb2b9a34ed7002e2c1ab9793a6c6e2f4640510df7d4a54d984a2cebb71adf.scope: Consumed 2.342s CPU time. Nov 28 03:01:52 localhost podman[52118]: 2025-11-28 08:01:52.182250913 +0000 UTC m=+2.770700569 container died 01fcb2b9a34ed7002e2c1ab9793a6c6e2f4640510df7d4a54d984a2cebb71adf (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, url=https://www.redhat.com, config_id=tripleo_puppet_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=container-puppet-rsyslog, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-rsyslog, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, 
batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1761123044, build-date=2025-11-18T22:49:49Z, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc.) 
Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/pmem_namespaces]/ensure: created Nov 28 03:01:52 localhost puppet-user[52177]: Notice: Applied catalog in 0.50 seconds Nov 28 03:01:52 localhost puppet-user[52177]: Application: Nov 28 03:01:52 localhost puppet-user[52177]: Initial environment: production Nov 28 03:01:52 localhost puppet-user[52177]: Converged environment: production Nov 28 03:01:52 localhost puppet-user[52177]: Run mode: user Nov 28 03:01:52 localhost puppet-user[52177]: Changes: Nov 28 03:01:52 localhost puppet-user[52177]: Total: 14 Nov 28 03:01:52 localhost puppet-user[52177]: Events: Nov 28 03:01:52 localhost puppet-user[52177]: Success: 14 Nov 28 03:01:52 localhost puppet-user[52177]: Total: 14 Nov 28 03:01:52 localhost puppet-user[52177]: Resources: Nov 28 03:01:52 localhost puppet-user[52177]: Skipped: 12 Nov 28 03:01:52 localhost puppet-user[52177]: Changed: 14 Nov 28 03:01:52 localhost puppet-user[52177]: Out of sync: 14 Nov 28 03:01:52 localhost puppet-user[52177]: Total: 29 Nov 28 03:01:52 localhost puppet-user[52177]: Time: Nov 28 03:01:52 localhost puppet-user[52177]: Exec: 0.02 Nov 28 03:01:52 localhost puppet-user[52177]: Config retrieval: 0.28 Nov 28 03:01:52 localhost puppet-user[52177]: Vs config: 0.44 Nov 28 03:01:52 localhost puppet-user[52177]: Transaction evaluation: 0.49 Nov 28 03:01:52 localhost puppet-user[52177]: Catalog application: 0.50 Nov 28 03:01:52 localhost puppet-user[52177]: Last run: 1764316912 Nov 28 03:01:52 localhost puppet-user[52177]: Total: 0.50 Nov 28 03:01:52 localhost puppet-user[52177]: Version: Nov 28 03:01:52 localhost puppet-user[52177]: Config: 1764316911 Nov 28 03:01:52 localhost puppet-user[52177]: Puppet: 7.10.0 Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/swtpm_enabled]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: 
/Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_model_extra_flags]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/disk_cachemodes]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_filters]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_outputs]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_filters]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_outputs]/ensure: created Nov 28 03:01:52 localhost systemd[1]: tmp-crun.SMWbOA.mount: Deactivated successfully. Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_filters]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_outputs]/ensure: created Nov 28 03:01:52 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-01fcb2b9a34ed7002e2c1ab9793a6c6e2f4640510df7d4a54d984a2cebb71adf-userdata-shm.mount: Deactivated successfully. Nov 28 03:01:52 localhost systemd[1]: var-lib-containers-storage-overlay-cc098275d7f03b1c1fd504750e41e859fc80e308322902c4e50d7df8300f206a-merged.mount: Deactivated successfully. 
Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_filters]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_outputs]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_filters]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_outputs]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_filters]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_outputs]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_group]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_ro]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_rw]/ensure: created Nov 28 03:01:52 localhost podman[52602]: 2025-11-28 08:01:52.355940847 +0000 UTC m=+0.160332856 container cleanup 01fcb2b9a34ed7002e2c1ab9793a6c6e2f4640510df7d4a54d984a2cebb71adf (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_puppet_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
architecture=x86_64, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, container_name=container-puppet-rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, name=rhosp17/openstack-rsyslog, 
konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, vendor=Red Hat, Inc.) Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_ro_perms]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_rw_perms]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_group]/ensure: created Nov 28 03:01:52 localhost systemd[1]: libpod-conmon-01fcb2b9a34ed7002e2c1ab9793a6c6e2f4640510df7d4a54d984a2cebb71adf.scope: Deactivated successfully. Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_ro]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_rw]/ensure: created Nov 28 03:01:52 localhost python3[51054]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-rsyslog --conmon-pidfile /run/container-puppet-rsyslog.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005538513 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment --env NAME=rsyslog --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::logging::rsyslog --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-rsyslog --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 
'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-rsyslog.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume 
/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_ro_perms]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_rw_perms]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_group]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_ro]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_rw]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_ro_perms]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_rw_perms]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_group]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_ro]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_rw]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_ro_perms]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: 
Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_rw_perms]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_group]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_ro]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_rw]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_ro_perms]/ensure: created Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_rw_perms]/ensure: created Nov 28 03:01:52 localhost systemd[1]: libpod-d50eb1873bbbfb25c378d8d09e1b58b68df7de1660980d5d6de8b3858f88a068.scope: Deactivated successfully. Nov 28 03:01:52 localhost systemd[1]: libpod-d50eb1873bbbfb25c378d8d09e1b58b68df7de1660980d5d6de8b3858f88a068.scope: Consumed 2.859s CPU time. 
Nov 28 03:01:52 localhost podman[52110]: 2025-11-28 08:01:52.60973222 +0000 UTC m=+3.209769579 container died d50eb1873bbbfb25c378d8d09e1b58b68df7de1660980d5d6de8b3858f88a068 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, container_name=container-puppet-ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, distribution-scope=public, build-date=2025-11-18T23:34:05Z, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=) Nov 28 03:01:52 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Compute::Libvirt::Qemu/Augeas[qemu-conf-limits]/returns: executed successfully Nov 28 03:01:53 localhost systemd[1]: var-lib-containers-storage-overlay-4619bcf03a81c0f8085ac49d4104b51527a7e62eb19e6e3702042c0e09f1d20b-merged.mount: Deactivated successfully. Nov 28 03:01:53 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d50eb1873bbbfb25c378d8d09e1b58b68df7de1660980d5d6de8b3858f88a068-userdata-shm.mount: Deactivated successfully. 
Nov 28 03:01:53 localhost podman[52683]: 2025-11-28 08:01:53.75864049 +0000 UTC m=+1.138717571 container cleanup d50eb1873bbbfb25c378d8d09e1b58b68df7de1660980d5d6de8b3858f88a068 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, release=1761123044, name=rhosp17/openstack-ovn-controller, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, container_name=container-puppet-ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude 
tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}) Nov 28 03:01:53 localhost systemd[1]: libpod-conmon-d50eb1873bbbfb25c378d8d09e1b58b68df7de1660980d5d6de8b3858f88a068.scope: Deactivated successfully. 
Nov 28 03:01:53 localhost python3[51054]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ovn_controller --conmon-pidfile /run/container-puppet-ovn_controller.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005538513 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,vs_config,exec --env NAME=ovn_controller --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::neutron::agents::ovn#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ovn_controller --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ovn_controller.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /etc/sysconfig/modules:/etc/sysconfig/modules --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Nov 28 03:01:53 localhost podman[52231]: 2025-11-28 08:01:49.833324774 +0000 UTC m=+0.084557881 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1 Nov 28 03:01:53 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Migration::Qemu/Augeas[qemu-conf-migration-ports]/returns: executed successfully Nov 28 03:01:53 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/debug]/ensure: created Nov 28 03:01:53 localhost puppet-user[51364]: Notice: 
/Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/log_dir]/ensure: created Nov 28 03:01:53 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/backend]/ensure: created Nov 28 03:01:54 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/enabled]/ensure: created Nov 28 03:01:54 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/memcache_servers]/ensure: created Nov 28 03:01:54 localhost podman[52745]: 2025-11-28 08:01:54.046622035 +0000 UTC m=+0.090010222 container create 414d294dbe8cc421ef37142db37c20e224431ab3dc4c4553c8883b904f3c4bbd (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, config_id=tripleo_puppet_step1, build-date=2025-11-19T00:23:27Z, io.openshift.expose-services=, container_name=container-puppet-neutron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-server, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-server, summary=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include 
::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, architecture=x86_64, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.component=openstack-neutron-server-container) Nov 28 03:01:54 localhost systemd[1]: Started libpod-conmon-414d294dbe8cc421ef37142db37c20e224431ab3dc4c4553c8883b904f3c4bbd.scope. 
Nov 28 03:01:54 localhost podman[52745]: 2025-11-28 08:01:53.993613043 +0000 UTC m=+0.037001230 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1 Nov 28 03:01:54 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/tls_enabled]/ensure: created Nov 28 03:01:54 localhost systemd[1]: Started libcrun container. Nov 28 03:01:54 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95fcfabaf1695644f149e65ad3965d41d9d311fa2ace2d9f1efd734cfe54eb68/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Nov 28 03:01:54 localhost podman[52745]: 2025-11-28 08:01:54.122186333 +0000 UTC m=+0.165574520 container init 414d294dbe8cc421ef37142db37c20e224431ab3dc4c4553c8883b904f3c4bbd (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, io.buildah.version=1.41.4, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-server, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-server-container, release=1761123044, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, architecture=x86_64, vcs-type=git, batch=17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=container-puppet-neutron, description=Red Hat OpenStack Platform 17.1 neutron-server, build-date=2025-11-19T00:23:27Z) Nov 28 03:01:54 localhost podman[52745]: 2025-11-28 08:01:54.1345159 +0000 UTC m=+0.177904087 container start 414d294dbe8cc421ef37142db37c20e224431ab3dc4c4553c8883b904f3c4bbd (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
name=rhosp17/openstack-neutron-server, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-server, version=17.1.12, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=container-puppet-neutron, 
com.redhat.component=openstack-neutron-server-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:23:27Z, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, vendor=Red Hat, Inc.) Nov 28 03:01:54 localhost podman[52745]: 2025-11-28 08:01:54.135099388 +0000 UTC m=+0.178487575 container attach 414d294dbe8cc421ef37142db37c20e224431ab3dc4c4553c8883b904f3c4bbd (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, tcib_managed=true, build-date=2025-11-19T00:23:27Z, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, name=rhosp17/openstack-neutron-server, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-server-container, url=https://www.redhat.com, release=1761123044, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_puppet_step1, container_name=container-puppet-neutron, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-server) Nov 28 03:01:54 localhost puppet-user[51364]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created Nov 28 03:01:54 localhost puppet-user[51364]: Notice: 
/Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created Nov 28 03:01:54 localhost puppet-user[51364]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/ssl]/ensure: created Nov 28 03:01:54 localhost puppet-user[51364]: Notice: /Stage[main]/Nova/Oslo::Messaging::Default[nova_config]/Nova_config[DEFAULT/transport_url]/ensure: created Nov 28 03:01:54 localhost puppet-user[51364]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/driver]/ensure: created Nov 28 03:01:54 localhost puppet-user[51364]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/transport_url]/ensure: created Nov 28 03:01:54 localhost puppet-user[51364]: Notice: /Stage[main]/Nova/Oslo::Concurrency[nova_config]/Nova_config[oslo_concurrency/lock_path]/ensure: created Nov 28 03:01:54 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_type]/ensure: created Nov 28 03:01:54 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/region_name]/ensure: created Nov 28 03:01:54 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_url]/ensure: created Nov 28 03:01:54 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/username]/ensure: created Nov 28 03:01:54 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/password]/ensure: created Nov 28 03:01:54 localhost 
puppet-user[51364]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/user_domain_name]/ensure: created Nov 28 03:01:54 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_name]/ensure: created Nov 28 03:01:54 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_domain_name]/ensure: created Nov 28 03:01:54 localhost puppet-user[51364]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/send_service_user_token]/ensure: created Nov 28 03:01:54 localhost puppet-user[51364]: Notice: /Stage[main]/Ssh::Server::Config/Concat[/etc/ssh/sshd_config]/File[/etc/ssh/sshd_config]/ensure: defined content as '{sha256}3ccd56cc76ec60fa08fd698d282c9c89b1e8c485a00f47d57569ed8f6f8a16e4' Nov 28 03:01:54 localhost puppet-user[51364]: Notice: Applied catalog in 4.88 seconds Nov 28 03:01:54 localhost puppet-user[51364]: Application: Nov 28 03:01:54 localhost puppet-user[51364]: Initial environment: production Nov 28 03:01:54 localhost puppet-user[51364]: Converged environment: production Nov 28 03:01:54 localhost puppet-user[51364]: Run mode: user Nov 28 03:01:54 localhost puppet-user[51364]: Changes: Nov 28 03:01:54 localhost puppet-user[51364]: Total: 183 Nov 28 03:01:54 localhost puppet-user[51364]: Events: Nov 28 03:01:54 localhost puppet-user[51364]: Success: 183 Nov 28 03:01:54 localhost puppet-user[51364]: Total: 183 Nov 28 03:01:54 localhost puppet-user[51364]: Resources: Nov 28 03:01:54 localhost puppet-user[51364]: Changed: 183 Nov 28 03:01:54 localhost puppet-user[51364]: Out of sync: 183 Nov 28 03:01:54 localhost puppet-user[51364]: Skipped: 57 Nov 28 03:01:54 localhost puppet-user[51364]: Total: 487 Nov 28 03:01:54 localhost 
puppet-user[51364]: Time: Nov 28 03:01:54 localhost puppet-user[51364]: Concat file: 0.00 Nov 28 03:01:54 localhost puppet-user[51364]: Concat fragment: 0.00 Nov 28 03:01:54 localhost puppet-user[51364]: Anchor: 0.00 Nov 28 03:01:54 localhost puppet-user[51364]: File line: 0.00 Nov 28 03:01:54 localhost puppet-user[51364]: Virtlogd config: 0.00 Nov 28 03:01:54 localhost puppet-user[51364]: Virtqemud config: 0.01 Nov 28 03:01:54 localhost puppet-user[51364]: Exec: 0.01 Nov 28 03:01:54 localhost puppet-user[51364]: Virtsecretd config: 0.02 Nov 28 03:01:54 localhost puppet-user[51364]: Virtstoraged config: 0.02 Nov 28 03:01:54 localhost puppet-user[51364]: Package: 0.02 Nov 28 03:01:54 localhost puppet-user[51364]: Virtproxyd config: 0.03 Nov 28 03:01:54 localhost puppet-user[51364]: File: 0.03 Nov 28 03:01:54 localhost puppet-user[51364]: Virtnodedevd config: 0.06 Nov 28 03:01:54 localhost puppet-user[51364]: Augeas: 1.39 Nov 28 03:01:54 localhost puppet-user[51364]: Config retrieval: 1.65 Nov 28 03:01:54 localhost puppet-user[51364]: Last run: 1764316914 Nov 28 03:01:54 localhost puppet-user[51364]: Nova config: 3.08 Nov 28 03:01:54 localhost puppet-user[51364]: Transaction evaluation: 4.87 Nov 28 03:01:54 localhost puppet-user[51364]: Catalog application: 4.88 Nov 28 03:01:54 localhost puppet-user[51364]: Resources: 0.00 Nov 28 03:01:54 localhost puppet-user[51364]: Total: 4.88 Nov 28 03:01:54 localhost puppet-user[51364]: Version: Nov 28 03:01:54 localhost puppet-user[51364]: Config: 1764316908 Nov 28 03:01:54 localhost puppet-user[51364]: Puppet: 7.10.0 Nov 28 03:01:55 localhost systemd[1]: libpod-700dcb6d86cc0871b7bbf5977f2c9ead9a93d5086f4486ed0144de2a9b1c2b0e.scope: Deactivated successfully. Nov 28 03:01:55 localhost systemd[1]: libpod-700dcb6d86cc0871b7bbf5977f2c9ead9a93d5086f4486ed0144de2a9b1c2b0e.scope: Consumed 8.714s CPU time. 
Nov 28 03:01:55 localhost podman[51227]: 2025-11-28 08:01:55.839845727 +0000 UTC m=+10.689864037 container died 700dcb6d86cc0871b7bbf5977f2c9ead9a93d5086f4486ed0144de2a9b1c2b0e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, config_id=tripleo_puppet_step1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, io.buildah.version=1.41.4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just 
include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=container-puppet-nova_libvirt, name=rhosp17/openstack-nova-libvirt, batch=17.1_20251118.1, tcib_managed=true, release=1761123044, build-date=2025-11-19T00:35:22Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt) Nov 28 03:01:55 localhost puppet-user[52776]: Error: Facter: error while resolving custom fact "haproxy_version": undefined method `strip' for nil:NilClass Nov 28 03:01:55 localhost systemd[1]: tmp-crun.5f2XKT.mount: Deactivated successfully. Nov 28 03:01:55 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-700dcb6d86cc0871b7bbf5977f2c9ead9a93d5086f4486ed0144de2a9b1c2b0e-userdata-shm.mount: Deactivated successfully. 
Nov 28 03:01:55 localhost systemd[1]: var-lib-containers-storage-overlay-b8fe5cc4372bb33f59044e24a5715f4f503612636c3ff0d7cebe06eec67fcb4f-merged.mount: Deactivated successfully. Nov 28 03:01:56 localhost podman[52822]: 2025-11-28 08:01:56.028505628 +0000 UTC m=+0.178562905 container cleanup 700dcb6d86cc0871b7bbf5977f2c9ead9a93d5086f4486ed0144de2a9b1c2b0e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, url=https://www.redhat.com, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, config_id=tripleo_puppet_step1, name=rhosp17/openstack-nova-libvirt, vcs-type=git, build-date=2025-11-19T00:35:22Z, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., container_name=container-puppet-nova_libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 
'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1) Nov 28 03:01:56 localhost systemd[1]: libpod-conmon-700dcb6d86cc0871b7bbf5977f2c9ead9a93d5086f4486ed0144de2a9b1c2b0e.scope: Deactivated successfully. 
Nov 28 03:01:56 localhost puppet-user[52776]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Nov 28 03:01:56 localhost puppet-user[52776]: (file: /etc/puppet/hiera.yaml) Nov 28 03:01:56 localhost puppet-user[52776]: Warning: Undefined variable '::deploy_config_name'; Nov 28 03:01:56 localhost puppet-user[52776]: (file & line not available) Nov 28 03:01:56 localhost python3[51054]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-nova_libvirt --conmon-pidfile /run/container-puppet-nova_libvirt.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005538513 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password --env NAME=nova_libvirt --env STEP_CONFIG=include ::tripleo::packages#012# TODO(emilien): figure how to deal with libvirt profile.#012# We'll probably treat it like we do with Neutron plugins.#012# Until then, just include it in the default nova-compute role.#012include tripleo::profile::base::nova::compute::libvirt#012#012include tripleo::profile::base::nova::libvirt#012#012include tripleo::profile::base::nova::compute::libvirt_guests#012#012include tripleo::profile::base::sshd#012include tripleo::profile::base::nova::migration::target --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-nova_libvirt --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 
'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-nova_libvirt.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume 
/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 28 03:01:56 localhost puppet-user[52776]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 28 03:01:56 localhost puppet-user[52776]: (file & line not available) Nov 28 03:01:56 localhost puppet-user[52776]: Warning: Unknown variable: 'dhcp_agents_per_net'. 
(file: /etc/puppet/modules/tripleo/manifests/profile/base/neutron.pp, line: 154, column: 37) Nov 28 03:01:56 localhost puppet-user[52776]: Notice: Compiled catalog for np0005538513.localdomain in environment production in 0.62 seconds Nov 28 03:01:56 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]/ensure: created Nov 28 03:01:56 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]/ensure: created Nov 28 03:01:56 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/host]/ensure: created Nov 28 03:01:56 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dns_domain]/ensure: created Nov 28 03:01:56 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dhcp_agent_notification]/ensure: created Nov 28 03:01:56 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]/ensure: created Nov 28 03:01:56 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/global_physnet_mtu]/ensure: created Nov 28 03:01:56 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/vlan_transparent]/ensure: created Nov 28 03:01:56 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron/Neutron_config[agent/root_helper]/ensure: created Nov 28 03:01:56 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron/Neutron_config[agent/report_interval]/ensure: created Nov 28 03:01:56 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]/ensure: created Nov 28 03:01:56 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/debug]/ensure: created Nov 28 03:01:56 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_host]/ensure: created Nov 28 
03:01:56 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_protocol]/ensure: created Nov 28 03:01:56 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_proxy_shared_secret]/ensure: created Nov 28 03:01:56 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_workers]/ensure: created Nov 28 03:01:56 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/state_path]/ensure: created Nov 28 03:01:56 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/hwol_qos_enabled]/ensure: created Nov 28 03:01:56 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[agent/root_helper]/ensure: created Nov 28 03:01:56 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection]/ensure: created Nov 28 03:01:56 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection_timeout]/ensure: created Nov 28 03:01:56 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovsdb_probe_interval]/ensure: created Nov 28 03:01:56 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_nb_connection]/ensure: created Nov 28 03:01:56 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_sb_connection]/ensure: created Nov 28 03:01:56 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/transport_url]/ensure: created Nov 28 03:01:56 
localhost puppet-user[52776]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/control_exchange]/ensure: created Nov 28 03:01:56 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron/Oslo::Concurrency[neutron_config]/Neutron_config[oslo_concurrency/lock_path]/ensure: created Nov 28 03:01:56 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/driver]/ensure: created Nov 28 03:01:56 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/transport_url]/ensure: created Nov 28 03:01:56 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created Nov 28 03:01:56 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created Nov 28 03:01:57 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/debug]/ensure: created Nov 28 03:01:57 localhost puppet-user[52776]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/log_dir]/ensure: created Nov 28 03:01:57 localhost puppet-user[52776]: Notice: Applied catalog in 0.49 seconds Nov 28 03:01:57 localhost puppet-user[52776]: Application: Nov 28 03:01:57 localhost puppet-user[52776]: Initial environment: production Nov 28 03:01:57 localhost puppet-user[52776]: Converged environment: production Nov 28 03:01:57 localhost puppet-user[52776]: Run mode: user Nov 28 03:01:57 localhost puppet-user[52776]: Changes: Nov 28 03:01:57 localhost puppet-user[52776]: Total: 33 Nov 28 03:01:57 localhost puppet-user[52776]: Events: Nov 28 03:01:57 localhost puppet-user[52776]: Success: 33 
Nov 28 03:01:57 localhost puppet-user[52776]: Total: 33 Nov 28 03:01:57 localhost puppet-user[52776]: Resources: Nov 28 03:01:57 localhost puppet-user[52776]: Skipped: 21 Nov 28 03:01:57 localhost puppet-user[52776]: Changed: 33 Nov 28 03:01:57 localhost puppet-user[52776]: Out of sync: 33 Nov 28 03:01:57 localhost puppet-user[52776]: Total: 155 Nov 28 03:01:57 localhost puppet-user[52776]: Time: Nov 28 03:01:57 localhost puppet-user[52776]: Resources: 0.00 Nov 28 03:01:57 localhost puppet-user[52776]: Ovn metadata agent config: 0.02 Nov 28 03:01:57 localhost puppet-user[52776]: Neutron config: 0.41 Nov 28 03:01:57 localhost puppet-user[52776]: Transaction evaluation: 0.48 Nov 28 03:01:57 localhost puppet-user[52776]: Catalog application: 0.49 Nov 28 03:01:57 localhost puppet-user[52776]: Config retrieval: 0.69 Nov 28 03:01:57 localhost puppet-user[52776]: Last run: 1764316917 Nov 28 03:01:57 localhost puppet-user[52776]: Total: 0.49 Nov 28 03:01:57 localhost puppet-user[52776]: Version: Nov 28 03:01:57 localhost puppet-user[52776]: Config: 1764316916 Nov 28 03:01:57 localhost puppet-user[52776]: Puppet: 7.10.0 Nov 28 03:01:57 localhost systemd[1]: libpod-414d294dbe8cc421ef37142db37c20e224431ab3dc4c4553c8883b904f3c4bbd.scope: Deactivated successfully. Nov 28 03:01:57 localhost systemd[1]: libpod-414d294dbe8cc421ef37142db37c20e224431ab3dc4c4553c8883b904f3c4bbd.scope: Consumed 3.508s CPU time. 
Nov 28 03:01:57 localhost podman[52961]: 2025-11-28 08:01:57.778382214 +0000 UTC m=+0.054940423 container died 414d294dbe8cc421ef37142db37c20e224431ab3dc4c4553c8883b904f3c4bbd (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2025-11-19T00:23:27Z, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=container-puppet-neutron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-server, com.redhat.component=openstack-neutron-server-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, description=Red Hat OpenStack Platform 17.1 neutron-server, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-server) Nov 28 03:01:57 localhost systemd[1]: tmp-crun.OZZPSB.mount: Deactivated successfully. Nov 28 03:01:57 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-414d294dbe8cc421ef37142db37c20e224431ab3dc4c4553c8883b904f3c4bbd-userdata-shm.mount: Deactivated successfully. Nov 28 03:01:57 localhost systemd[1]: var-lib-containers-storage-overlay-95fcfabaf1695644f149e65ad3965d41d9d311fa2ace2d9f1efd734cfe54eb68-merged.mount: Deactivated successfully. 
Nov 28 03:01:57 localhost podman[52961]: 2025-11-28 08:01:57.862562283 +0000 UTC m=+0.139120392 container cleanup 414d294dbe8cc421ef37142db37c20e224431ab3dc4c4553c8883b904f3c4bbd (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_puppet_step1, tcib_managed=true, version=17.1.12, vcs-type=git, build-date=2025-11-19T00:23:27Z, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, release=1761123044, distribution-scope=public, name=rhosp17/openstack-neutron-server, container_name=container-puppet-neutron, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-server-container, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-server, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude 
tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}) Nov 28 03:01:57 localhost systemd[1]: libpod-conmon-414d294dbe8cc421ef37142db37c20e224431ab3dc4c4553c8883b904f3c4bbd.scope: Deactivated successfully. 
Nov 28 03:01:57 localhost python3[51054]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-neutron --conmon-pidfile /run/container-puppet-neutron.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005538513 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config --env NAME=neutron --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::neutron::ovn_metadata#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-neutron --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005538513', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-neutron.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1 Nov 28 03:01:58 localhost python3[53013]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:01:59 localhost python3[53045]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 03:02:00 localhost python3[53095]: ansible-ansible.legacy.stat 
Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:02:00 localhost python3[53138]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316919.989835-84859-62296000541824/source dest=/usr/libexec/tripleo-container-shutdown mode=0700 owner=root group=root _original_basename=tripleo-container-shutdown follow=False checksum=7d67b1986212f5548057505748cd74cfcf9c0d35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:02:01 localhost python3[53200]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:02:01 localhost python3[53243]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316920.861091-84859-277553588498609/source dest=/usr/libexec/tripleo-start-podman-container mode=0700 owner=root group=root _original_basename=tripleo-start-podman-container follow=False checksum=536965633b8d3b1ce794269ffb07be0105a560a0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:02:02 localhost python3[53305]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:02:02 localhost python3[53348]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316921.7847958-85003-138959531371064/source 
dest=/usr/lib/systemd/system/tripleo-container-shutdown.service mode=0644 owner=root group=root _original_basename=tripleo-container-shutdown-service follow=False checksum=66c1d41406ba8714feb9ed0a35259a7a57ef9707 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:02:03 localhost python3[53410]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:02:03 localhost python3[53453]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316922.690198-85033-39475210643575/source dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset mode=0644 owner=root group=root _original_basename=91-tripleo-container-shutdown-preset follow=False checksum=bccb1207dcbcfaa5ca05f83c8f36ce4c2460f081 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:02:03 localhost python3[53483]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 03:02:03 localhost systemd[1]: Reloading. Nov 28 03:02:04 localhost systemd-rc-local-generator[53507]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:02:04 localhost systemd-sysv-generator[53511]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 28 03:02:04 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:02:04 localhost systemd[1]: Reloading. Nov 28 03:02:04 localhost systemd-rc-local-generator[53542]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:02:04 localhost systemd-sysv-generator[53546]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:02:04 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:02:04 localhost systemd[1]: Starting TripleO Container Shutdown... Nov 28 03:02:04 localhost systemd[1]: Finished TripleO Container Shutdown. Nov 28 03:02:05 localhost python3[53606]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:02:05 localhost python3[53649]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316924.7225726-85081-142511355041184/source dest=/usr/lib/systemd/system/netns-placeholder.service mode=0644 owner=root group=root _original_basename=netns-placeholder-service follow=False checksum=8e9c6d5ce3a6e7f71c18780ec899f32f23de4c71 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:02:06 localhost python3[53711]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False 
get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:02:06 localhost python3[53754]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316925.683529-85215-240822367027995/source dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset mode=0644 owner=root group=root _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:02:06 localhost python3[53784]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 03:02:06 localhost systemd[1]: Reloading. Nov 28 03:02:07 localhost systemd-rc-local-generator[53807]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:02:07 localhost systemd-sysv-generator[53814]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:02:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:02:07 localhost systemd[1]: Reloading. Nov 28 03:02:07 localhost systemd-sysv-generator[53851]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:02:07 localhost systemd-rc-local-generator[53846]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 28 03:02:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:02:07 localhost systemd[1]: Starting Create netns directory... Nov 28 03:02:07 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Nov 28 03:02:07 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Nov 28 03:02:07 localhost systemd[1]: Finished Create netns directory. Nov 28 03:02:08 localhost python3[53876]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Nov 28 03:02:08 localhost python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for metrics_qdr, new hash: d871e9c8e59a273b3131348d6d370386 Nov 28 03:02:08 localhost python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for collectd, new hash: da9a0dc7b40588672419e3ce10063e21 Nov 28 03:02:08 localhost python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for iscsid, new hash: c9c242145d21d40ef98889981c05ca84 Nov 28 03:02:08 localhost python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtlogd_wrapper, new hash: 0f0904943dda1bf1d123bdf96d71020f Nov 28 03:02:08 localhost python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtnodedevd, new hash: 0f0904943dda1bf1d123bdf96d71020f Nov 28 03:02:08 localhost python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtproxyd, new hash: 0f0904943dda1bf1d123bdf96d71020f Nov 28 03:02:08 localhost python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtqemud, new hash: 0f0904943dda1bf1d123bdf96d71020f Nov 28 03:02:08 localhost 
python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtsecretd, new hash: 0f0904943dda1bf1d123bdf96d71020f Nov 28 03:02:08 localhost python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtstoraged, new hash: 0f0904943dda1bf1d123bdf96d71020f Nov 28 03:02:08 localhost python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for rsyslog, new hash: 138ccb6252fd89d73a6c37a3f993f3eb Nov 28 03:02:08 localhost python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_compute, new hash: 684be86bd5476b8c779d4769a9adf982 Nov 28 03:02:08 localhost python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_ipmi, new hash: 684be86bd5476b8c779d4769a9adf982 Nov 28 03:02:08 localhost python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for logrotate_crond, new hash: 53ed83bb0cae779ff95edb2002262c6f Nov 28 03:02:08 localhost python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for nova_libvirt_init_secret, new hash: 0f0904943dda1bf1d123bdf96d71020f Nov 28 03:02:08 localhost python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for nova_migration_target, new hash: 0f0904943dda1bf1d123bdf96d71020f Nov 28 03:02:08 localhost python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for ovn_metadata_agent, new hash: dfc67f7a8d1f67548a53836c6db3b704 Nov 28 03:02:08 localhost python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for nova_compute, new hash: c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f Nov 28 03:02:08 localhost python3[53876]: ansible-container_puppet_config [WARNING] Config change detected for nova_wait_for_compute_service, new hash: 0f0904943dda1bf1d123bdf96d71020f Nov 28 03:02:09 localhost python3[53934]: 
ansible-tripleo_container_manage Invoked with config_id=tripleo_step1 config_dir=/var/lib/tripleo-config/container-startup-config/step_1 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Nov 28 03:02:09 localhost podman[53970]: 2025-11-28 08:02:09.780880589 +0000 UTC m=+0.089464568 container create eaab9f0d01f832d0ebbf25d29bac31f0bdd739e36a3ea01c9b6ae9413e729d4b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, maintainer=OpenStack TripleO Team, container_name=metrics_qdr_init_logs, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc.) Nov 28 03:02:09 localhost systemd[1]: Started libpod-conmon-eaab9f0d01f832d0ebbf25d29bac31f0bdd739e36a3ea01c9b6ae9413e729d4b.scope. Nov 28 03:02:09 localhost systemd[1]: Started libcrun container. Nov 28 03:02:09 localhost podman[53970]: 2025-11-28 08:02:09.730903187 +0000 UTC m=+0.039487196 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Nov 28 03:02:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb58edccbf984a5f7da43e28613aa15e67186fff8a0f55da5237efa2fd62c40a/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff) Nov 28 03:02:09 localhost podman[53970]: 2025-11-28 08:02:09.844057553 +0000 UTC m=+0.152641562 container init eaab9f0d01f832d0ebbf25d29bac31f0bdd739e36a3ea01c9b6ae9413e729d4b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.buildah.version=1.41.4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr_init_logs, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 
qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_id=tripleo_step1, distribution-scope=public) Nov 28 03:02:09 localhost podman[53970]: 2025-11-28 08:02:09.858081922 +0000 UTC m=+0.166665901 container start eaab9f0d01f832d0ebbf25d29bac31f0bdd739e36a3ea01c9b6ae9413e729d4b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr_init_logs, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack 
TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64) Nov 28 03:02:09 localhost podman[53970]: 2025-11-28 08:02:09.858348081 +0000 UTC m=+0.166932130 container attach eaab9f0d01f832d0ebbf25d29bac31f0bdd739e36a3ea01c9b6ae9413e729d4b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, vcs-type=git, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, container_name=metrics_qdr_init_logs, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public) Nov 28 03:02:09 localhost systemd[1]: libpod-eaab9f0d01f832d0ebbf25d29bac31f0bdd739e36a3ea01c9b6ae9413e729d4b.scope: Deactivated successfully. Nov 28 03:02:09 localhost podman[53970]: 2025-11-28 08:02:09.863551187 +0000 UTC m=+0.172135196 container died eaab9f0d01f832d0ebbf25d29bac31f0bdd739e36a3ea01c9b6ae9413e729d4b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, tcib_managed=true, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr_init_logs, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:02:09 localhost podman[53990]: 2025-11-28 08:02:09.955110731 +0000 UTC m=+0.078509686 container cleanup eaab9f0d01f832d0ebbf25d29bac31f0bdd739e36a3ea01c9b6ae9413e729d4b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr_init_logs, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:02:09 localhost systemd[1]: libpod-conmon-eaab9f0d01f832d0ebbf25d29bac31f0bdd739e36a3ea01c9b6ae9413e729d4b.scope: Deactivated successfully. Nov 28 03:02:09 localhost python3[53934]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr_init_logs --conmon-pidfile /run/metrics_qdr_init_logs.pid --detach=False --label config_id=tripleo_step1 --label container_name=metrics_qdr_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr_init_logs.log --network none --privileged=False --user root --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 /bin/bash -c chown -R qdrouterd:qdrouterd /var/log/qdrouterd Nov 28 03:02:10 localhost podman[54064]: 2025-11-28 08:02:10.441520817 +0000 UTC m=+0.086058389 container create 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, 
tcib_managed=true, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Nov 28 03:02:10 localhost systemd[1]: Started libpod-conmon-63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.scope. 
Nov 28 03:02:10 localhost podman[54064]: 2025-11-28 08:02:10.401209425 +0000 UTC m=+0.045747057 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Nov 28 03:02:10 localhost systemd[1]: Started libcrun container. Nov 28 03:02:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/701465e48d77119796b927e784ad3d56a8f99cc392002110647f3a4cfb83b9e9/merged/var/lib/qdrouterd supports timestamps until 2038 (0x7fffffff) Nov 28 03:02:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/701465e48d77119796b927e784ad3d56a8f99cc392002110647f3a4cfb83b9e9/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff) Nov 28 03:02:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:02:10 localhost podman[54064]: 2025-11-28 08:02:10.600251262 +0000 UTC m=+0.244788824 container init 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, build-date=2025-11-18T22:49:46Z, architecture=x86_64, io.buildah.version=1.41.4, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, version=17.1.12, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 28 03:02:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:02:10 localhost podman[54064]: 2025-11-28 08:02:10.643607852 +0000 UTC m=+0.288145414 container start 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:02:10 localhost python3[53934]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr --conmon-pidfile /run/metrics_qdr.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=d871e9c8e59a273b3131348d6d370386 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step1 --label container_name=metrics_qdr --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr.log --network host --privileged=False --user qdrouterd --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro --volume /var/lib/metrics_qdr:/var/lib/qdrouterd:z --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Nov 28 03:02:10 localhost podman[54087]: 2025-11-28 08:02:10.747078067 +0000 UTC m=+0.095941385 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=starting, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack 
Platform 17.1 qdrouterd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team) Nov 28 03:02:10 localhost systemd[1]: var-lib-containers-storage-overlay-bb58edccbf984a5f7da43e28613aa15e67186fff8a0f55da5237efa2fd62c40a-merged.mount: Deactivated successfully. 
Nov 28 03:02:10 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-eaab9f0d01f832d0ebbf25d29bac31f0bdd739e36a3ea01c9b6ae9413e729d4b-userdata-shm.mount: Deactivated successfully. Nov 28 03:02:10 localhost podman[54087]: 2025-11-28 08:02:10.964670689 +0000 UTC m=+0.313534007 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, vcs-type=git, release=1761123044, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 28 03:02:11 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:02:11 localhost python3[54159]: ansible-file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:02:11 localhost python3[54175]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_metrics_qdr_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 03:02:12 localhost python3[54236]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764316931.6473696-85346-43577087931402/source dest=/etc/systemd/system/tripleo_metrics_qdr.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None 
content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:02:12 localhost python3[54252]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 28 03:02:12 localhost systemd[1]: Reloading. Nov 28 03:02:12 localhost systemd-rc-local-generator[54275]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:02:12 localhost systemd-sysv-generator[54280]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:02:12 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:02:13 localhost python3[54304]: ansible-systemd Invoked with state=restarted name=tripleo_metrics_qdr.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 03:02:13 localhost systemd[1]: Reloading. Nov 28 03:02:13 localhost systemd-sysv-generator[54334]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:02:13 localhost systemd-rc-local-generator[54329]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:02:13 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:02:13 localhost systemd[1]: Starting metrics_qdr container... 
Nov 28 03:02:13 localhost systemd[1]: Started metrics_qdr container. Nov 28 03:02:14 localhost python3[54384]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks1.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:02:15 localhost python3[54505]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks1.json short_hostname=np0005538513 step=1 update_config_hash_only=False Nov 28 03:02:16 localhost python3[54521]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:02:16 localhost python3[54537]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True Nov 28 03:02:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:02:41 localhost podman[54615]: 2025-11-28 08:02:41.848226492 +0000 UTC m=+0.083427375 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_id=tripleo_step1, release=1761123044, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12) Nov 28 03:02:42 localhost podman[54615]: 2025-11-28 08:02:42.055465872 +0000 UTC m=+0.290666755 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12) Nov 28 03:02:42 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:03:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:03:12 localhost systemd[1]: tmp-crun.nVrcET.mount: Deactivated successfully. 
Nov 28 03:03:12 localhost podman[54646]: 2025-11-28 08:03:12.837826933 +0000 UTC m=+0.072841765 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, tcib_managed=true, architecture=x86_64, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 qdrouterd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4) Nov 28 03:03:13 localhost podman[54646]: 2025-11-28 08:03:13.055379954 +0000 UTC m=+0.290394776 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, vcs-type=git) Nov 28 03:03:13 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:03:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:03:43 localhost podman[54753]: 2025-11-28 08:03:43.841960346 +0000 UTC m=+0.082467243 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, container_name=metrics_qdr, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12) Nov 28 03:03:44 localhost podman[54753]: 2025-11-28 08:03:44.06210597 +0000 UTC m=+0.302612847 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Nov 28 03:03:44 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:04:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:04:14 localhost systemd[1]: tmp-crun.evT8Ov.mount: Deactivated successfully. 
Nov 28 03:04:14 localhost podman[54782]: 2025-11-28 08:04:14.848891904 +0000 UTC m=+0.082142223 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 
qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, container_name=metrics_qdr, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, release=1761123044, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 28 03:04:15 localhost podman[54782]: 2025-11-28 08:04:15.04538454 +0000 UTC m=+0.278634829 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, release=1761123044, architecture=x86_64, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:04:15 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:04:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:04:45 localhost podman[54889]: 2025-11-28 08:04:45.851305214 +0000 UTC m=+0.084887870 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, version=17.1.12, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, name=rhosp17/openstack-qdrouterd) Nov 28 03:04:46 localhost podman[54889]: 2025-11-28 08:04:46.042502871 +0000 UTC m=+0.276085547 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container) Nov 28 03:04:46 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:05:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:05:16 localhost systemd[1]: tmp-crun.dn5J1f.mount: Deactivated successfully. 
Nov 28 03:05:16 localhost podman[54919]: 2025-11-28 08:05:16.842830456 +0000 UTC m=+0.076723339 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z) Nov 28 03:05:17 localhost podman[54919]: 2025-11-28 08:05:17.048513315 +0000 UTC m=+0.282406218 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Nov 28 03:05:17 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:05:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:05:47 localhost systemd[1]: tmp-crun.Nmnydd.mount: Deactivated successfully. 
Nov 28 03:05:47 localhost podman[55025]: 2025-11-28 08:05:47.842004643 +0000 UTC m=+0.077467454 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, 
maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044) Nov 28 03:05:48 localhost podman[55025]: 2025-11-28 08:05:48.042471665 +0000 UTC m=+0.277934406 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.) Nov 28 03:05:48 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:06:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:06:18 localhost podman[55055]: 2025-11-28 08:06:18.845423396 +0000 UTC m=+0.084257839 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
com.redhat.component=openstack-qdrouterd-container, release=1761123044, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step1, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.4) Nov 28 03:06:19 localhost podman[55055]: 2025-11-28 08:06:19.036666675 +0000 UTC m=+0.275501128 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-18T22:49:46Z) Nov 28 03:06:19 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:06:25 localhost sshd[55084]: main: sshd: ssh-rsa algorithm is disabled Nov 28 03:06:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:06:49 localhost podman[55161]: 2025-11-28 08:06:49.840548034 +0000 UTC m=+0.079480522 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, distribution-scope=public, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-18T22:49:46Z) Nov 28 03:06:50 localhost podman[55161]: 2025-11-28 08:06:50.030393986 +0000 UTC m=+0.269326464 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, version=17.1.12, build-date=2025-11-18T22:49:46Z, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, tcib_managed=true) Nov 28 03:06:50 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. 
Nov 28 03:06:55 localhost sshd[55191]: main: sshd: ssh-rsa algorithm is disabled Nov 28 03:07:08 localhost ceph-osd[32506]: osd.5 pg_epoch: 20 pg[2.0( empty local-lis/les=0/0 n=0 ec=20/20 lis/c=0/0 les/c/f=0/0/0 sis=20) [4,5,3] r=1 lpr=20 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:07:08 localhost ceph-osd[32506]: osd.5 pg_epoch: 22 pg[3.0( empty local-lis/les=0/0 n=0 ec=22/22 lis/c=0/0 les/c/f=0/0/0 sis=22) [5,4,0] r=0 lpr=22 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:07:10 localhost ceph-osd[32506]: osd.5 pg_epoch: 23 pg[3.0( empty local-lis/les=22/23 n=0 ec=22/22 lis/c=0/0 les/c/f=0/0/0 sis=22) [5,4,0] r=0 lpr=22 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:07:12 localhost ceph-osd[32506]: osd.5 pg_epoch: 24 pg[4.0( empty local-lis/les=0/0 n=0 ec=24/24 lis/c=0/0 les/c/f=0/0/0 sis=24) [3,4,5] r=2 lpr=24 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:07:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 26 pg[5.0( empty local-lis/les=0/0 n=0 ec=26/26 lis/c=0/0 les/c/f=0/0/0 sis=26) [2,3,4] r=0 lpr=26 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:07:15 localhost ceph-osd[31557]: osd.2 pg_epoch: 27 pg[5.0( empty local-lis/les=26/27 n=0 ec=26/26 lis/c=0/0 les/c/f=0/0/0 sis=26) [2,3,4] r=0 lpr=26 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:07:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:07:20 localhost podman[55194]: 2025-11-28 08:07:20.828957372 +0000 UTC m=+0.063953263 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, release=1761123044, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public) Nov 28 03:07:21 localhost podman[55194]: 2025-11-28 08:07:21.021335295 +0000 UTC m=+0.256331156 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, batch=17.1_20251118.1, release=1761123044, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git) Nov 28 03:07:21 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:07:28 localhost ceph-osd[31557]: osd.2 pg_epoch: 32 pg[6.0( empty local-lis/les=0/0 n=0 ec=32/32 lis/c=0/0 les/c/f=0/0/0 sis=32) [0,4,2] r=2 lpr=32 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:07:30 localhost ceph-osd[32506]: osd.5 pg_epoch: 33 pg[7.0( empty local-lis/les=0/0 n=0 ec=33/33 lis/c=0/0 les/c/f=0/0/0 sis=33) [1,5,3] r=1 lpr=33 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:07:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:07:51 localhost podman[55270]: 2025-11-28 08:07:51.847428757 +0000 UTC m=+0.087010878 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd) Nov 28 03:07:52 localhost podman[55270]: 2025-11-28 08:07:52.029898488 +0000 UTC m=+0.269480659 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, architecture=x86_64, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 28 03:07:52 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. 
Nov 28 03:07:59 localhost ceph-osd[32506]: osd.5 pg_epoch: 38 pg[2.0( empty local-lis/les=20/21 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=38 pruub=13.060708046s) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active pruub 1172.546264648s@ mbc={}] start_peering_interval up [4,5,3] -> [4,5,3], acting [4,5,3] -> [4,5,3], acting_primary 4 -> 4, up_primary 4 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:07:59 localhost ceph-osd[32506]: osd.5 pg_epoch: 38 pg[2.0( empty local-lis/les=20/21 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=38 pruub=13.057423592s) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1172.546264648s@ mbc={}] state: transitioning to Stray Nov 28 03:08:00 localhost ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.16( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:00 localhost ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.14( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:00 localhost ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.15( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:00 localhost ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.18( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:00 localhost ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.12( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:00 
localhost ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.13( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:00 localhost ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.11( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:00 localhost ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.10( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:00 localhost ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.f( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:00 localhost ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.e( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:00 localhost ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.d( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:00 localhost ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.c( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:00 localhost ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.b( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:00 localhost 
ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.a( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:00 localhost ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.9( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:00 localhost ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.1( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:00 localhost ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.6( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:00 localhost ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.3( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:00 localhost ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.5( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:00 localhost ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.2( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:00 localhost ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.4( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:00 localhost 
ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.7( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:00 localhost ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.8( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:00 localhost ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.19( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:00 localhost ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.1b( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:00 localhost ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.1a( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:00 localhost ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.1d( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:00 localhost ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.1c( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:00 localhost ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.1e( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:00 localhost 
ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.1f( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:00 localhost ceph-osd[32506]: osd.5 pg_epoch: 39 pg[2.17( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [4,5,3] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:01 localhost ceph-osd[32506]: osd.5 pg_epoch: 40 pg[4.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=40 pruub=15.073258400s) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active pruub 1176.569458008s@ mbc={}] start_peering_interval up [3,4,5] -> [3,4,5], acting [3,4,5] -> [3,4,5], acting_primary 3 -> 3, up_primary 3 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:01 localhost ceph-osd[32506]: osd.5 pg_epoch: 40 pg[4.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=40 pruub=15.070759773s) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1176.569458008s@ mbc={}] state: transitioning to Stray Nov 28 03:08:01 localhost ceph-osd[32506]: osd.5 pg_epoch: 40 pg[3.0( empty local-lis/les=22/23 n=0 ec=22/22 lis/c=22/22 les/c/f=23/23/0 sis=40 pruub=13.069881439s) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active pruub 1174.570434570s@ mbc={}] start_peering_interval up [5,4,0] -> [5,4,0], acting [5,4,0] -> [5,4,0], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:01 localhost ceph-osd[32506]: osd.5 pg_epoch: 40 pg[3.0( empty local-lis/les=22/23 n=0 ec=22/22 lis/c=22/22 les/c/f=23/23/0 sis=40 pruub=13.069881439s) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown pruub 1174.570434570s@ mbc={}] state: transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 
41 pg[4.1a( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.1d( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.19( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.1b( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.18( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.1c( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.1f( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.1( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 
pg[4.e( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.2( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.4( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.3( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.7( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.6( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.5( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.f( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.c( empty 
local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.a( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.b( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.d( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.8( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.9( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.16( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.17( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.15( empty 
local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.14( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.13( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.12( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.10( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.1e( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[4.11( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [3,4,5] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.19( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.16( empty 
local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.14( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.17( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.12( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.13( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.15( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.10( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.11( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.e( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 
les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.f( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.c( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.b( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.8( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.2( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.1( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.7( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.4( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 
crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.3( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.a( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.d( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.5( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.6( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.18( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.1a( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.1d( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: 
transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.1b( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.1f( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.1e( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.9( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.1c( empty local-lis/les=22/23 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.0( empty local-lis/les=40/41 n=0 ec=22/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:02 localhost python3[55315]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 
pg_epoch: 41 pg[3.1b( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.18( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.17( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.19( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.16( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.15( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.14( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.1a( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 
0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.11( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.13( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.d( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.e( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.5( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.10( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.c( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.2( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.1( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.3( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.4( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.8( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.9( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.a( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.6( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.7( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.b( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.f( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.1d( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.1f( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.1e( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.1c( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:02 localhost ceph-osd[32506]: osd.5 pg_epoch: 41 pg[3.12( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=22/22 les/c/f=23/23/0 sis=40) [5,4,0] r=0 lpr=40 pi=[22,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:03 localhost ceph-osd[31557]: osd.2 pg_epoch: 42 pg[5.0( empty local-lis/les=26/27 n=0 ec=26/26 lis/c=26/26 les/c/f=27/27/0 sis=42 pruub=15.915790558s) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active pruub 1184.430175781s@ mbc={}] start_peering_interval up [2,3,4] -> [2,3,4], acting [2,3,4] -> [2,3,4], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:03 localhost ceph-osd[31557]: osd.2 pg_epoch: 42 pg[6.0( empty local-lis/les=32/33 n=0 ec=32/32 lis/c=32/32 les/c/f=33/33/0 sis=42 pruub=13.975545883s) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 active pruub 1182.489990234s@ mbc={}] start_peering_interval up [0,4,2] -> [0,4,2], acting [0,4,2] -> [0,4,2], acting_primary 0 -> 0, up_primary 0 -> 0, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:03 localhost ceph-osd[31557]: osd.2 pg_epoch: 42 pg[6.0( empty local-lis/les=32/33 n=0 ec=32/32 lis/c=32/32 les/c/f=33/33/0 sis=42 pruub=13.971534729s) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1182.489990234s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:03 localhost ceph-osd[31557]: osd.2 pg_epoch: 42 pg[5.0( empty local-lis/les=26/27 n=0 ec=26/26 lis/c=26/26 les/c/f=27/27/0 sis=42 pruub=15.915790558s) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.430175781s@ mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.1c( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.1f( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.1d( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.12( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.10( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.17( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.16( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.14( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.19( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.1a( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.11( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.17( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.16( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.15( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.14( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.15( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.13( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.1e( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.12( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.13( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.11( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.10( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.f( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.c( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.e( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.d( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.d( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.c( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.e( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.f( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.3( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.3( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.1( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.2( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.2( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.1( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.4( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.a( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.1b( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.b( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.b( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.7( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.1b( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.18( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.8( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.5( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.6( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.8( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.5( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.9( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.1a( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.6( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.7( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.4( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.19( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.a( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.9( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.1c( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.1f( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.1d( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.1e( empty local-lis/les=26/27 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[6.18( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [0,4,2] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.0( empty local-lis/les=42/43 n=0 ec=26/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.1( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.e( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.4( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.5( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.6( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.3( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.2( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.7( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.a( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.9( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.16( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.15( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.17( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.1e( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.d( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.c( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.f( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.8( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.11( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.1d( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.12( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.1c( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.14( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.1f( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.1b( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.19( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.b( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.13( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.18( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.1a( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost ceph-osd[31557]: osd.2 pg_epoch: 43 pg[5.10( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=26/26 les/c/f=27/27/0 sis=42) [2,3,4] r=0 lpr=42 pi=[26,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Nov 28 03:08:04 localhost python3[55331]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 03:08:04 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 5.0 scrub starts
Nov 28 03:08:04 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 5.0 scrub ok
Nov 28 03:08:05 localhost ceph-osd[32506]: osd.5 pg_epoch: 44 pg[7.0( v 36'39 (0'0,36'39] local-lis/les=33/34 n=22 ec=33/33 lis/c=33/33 les/c/f=34/34/0 sis=44 pruub=12.864492416s) [1,5,3] r=1 lpr=44 pi=[33,44)/1 luod=0'0 lua=36'37 crt=36'39 lcod 36'38 mlcod 0'0 active pruub 1178.443603516s@ mbc={}] start_peering_interval up [1,5,3] -> [1,5,3], acting [1,5,3] -> [1,5,3], acting_primary 1 -> 1, up_primary 1 -> 1, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:05 localhost ceph-osd[32506]: osd.5 pg_epoch: 44 pg[7.0( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=1 ec=33/33 lis/c=33/33 les/c/f=34/34/0 sis=44 pruub=12.862508774s) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 lcod 36'38 mlcod 0'0 unknown NOTIFY pruub 1178.443603516s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:05 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.0 scrub starts
Nov 28 03:08:05 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.0 scrub ok
Nov 28 03:08:06 localhost ceph-osd[32506]: osd.5 pg_epoch: 45 pg[7.d( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:06 localhost ceph-osd[32506]: osd.5 pg_epoch: 45 pg[7.2( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=2 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:06 localhost ceph-osd[32506]: osd.5 pg_epoch: 45 pg[7.7( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:06 localhost ceph-osd[32506]: osd.5 pg_epoch: 45 pg[7.3( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=2 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:06 localhost ceph-osd[32506]: osd.5 pg_epoch: 45 pg[7.4( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=2 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:06 localhost ceph-osd[32506]: osd.5 pg_epoch: 45 pg[7.5( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=2 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:06 localhost ceph-osd[32506]: osd.5 pg_epoch: 45 pg[7.6( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=2 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:06 localhost ceph-osd[32506]: osd.5 pg_epoch: 45 pg[7.c( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:06 localhost ceph-osd[32506]: osd.5 pg_epoch: 45 pg[7.f( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:06 localhost ceph-osd[32506]: osd.5 pg_epoch: 45 pg[7.e( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:06 localhost ceph-osd[32506]: osd.5 pg_epoch: 45 pg[7.9( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:06 localhost ceph-osd[32506]: osd.5 pg_epoch: 45 pg[7.8( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:06 localhost ceph-osd[32506]: osd.5 pg_epoch: 45 pg[7.b( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:06 localhost ceph-osd[32506]: osd.5 pg_epoch: 45 pg[7.a( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=33/34 n=1 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:06 localhost ceph-osd[32506]: osd.5 pg_epoch: 45 pg[7.1( v 36'39 (0'0,36'39] local-lis/les=33/34 n=2 ec=44/33 lis/c=33/33 les/c/f=34/34/0 sis=44) [1,5,3] r=1 lpr=44 pi=[33,44)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Nov 28 03:08:06 localhost python3[55347]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 03:08:07 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 5.6 scrub starts
Nov 28 03:08:07 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 5.6 scrub ok
Nov 28 03:08:08 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.1b scrub starts
Nov 28 03:08:08 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.1b scrub ok
Nov 28 03:08:08 localhost python3[55395]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 03:08:08 localhost python3[55438]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317288.1489336-92600-76918684453434/source dest=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring mode=600 _original_basename=ceph.client.openstack.keyring follow=False checksum=98ffd20e3b9db1cae39a950d9da1f69e92796658 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 03:08:10 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.18 deep-scrub starts
Nov 28 03:08:10 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.18 deep-scrub ok
Nov 28 03:08:11 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.17 scrub starts
Nov 28 03:08:11 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.17 scrub ok
Nov 28 03:08:13 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 5.3 scrub starts
Nov 28 03:08:13 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 5.3 scrub ok
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[3.15( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,1,0] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.839219093s) [0,1,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.589965820s@ mbc={}] start_peering_interval up [2,3,4] -> [0,1,5], acting [2,3,4] -> [0,1,5], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 
4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.836143494s) [3,4,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.587158203s@ mbc={}] start_peering_interval up [0,4,2] -> [3,4,5], acting [0,4,2] -> [3,4,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.14( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.836041451s) [3,4,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.587158203s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.837333679s) [1,3,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.588378906s@ mbc={}] start_peering_interval up [2,3,4] -> [1,3,2], acting [2,3,4] -> [1,3,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.19( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.839117050s) [0,1,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.589965820s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.1a( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.845196724s) [4,2,0] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.596191406s@ mbc={}] start_peering_interval up [0,4,2] -> [4,2,0], acting [0,4,2] -> [4,2,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 
pg_epoch: 46 pg[6.15( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.843434334s) [4,5,0] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.594848633s@ mbc={}] start_peering_interval up [0,4,2] -> [4,5,0], acting [0,4,2] -> [4,5,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.17( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.837383270s) [3,5,4] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.588867188s@ mbc={}] start_peering_interval up [2,3,4] -> [3,5,4], acting [2,3,4] -> [3,5,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.1a( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.844755173s) [4,2,0] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.596191406s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.15( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.843380928s) [4,5,0] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.594848633s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.17( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.837240219s) [3,5,4] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.588867188s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.16( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.837261200s) [1,3,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.588378906s@ mbc={}] state: 
transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.836681366s) [4,3,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.588378906s@ mbc={}] start_peering_interval up [2,3,4] -> [4,3,5], acting [2,3,4] -> [4,3,5], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.15( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.836624146s) [4,3,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.588378906s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.837795258s) [3,2,4] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.589721680s@ mbc={}] start_peering_interval up [2,3,4] -> [3,2,4], acting [2,3,4] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.14( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.837754250s) [3,2,4] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.589721680s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.17( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.835426331s) [5,0,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.587402344s@ mbc={}] start_peering_interval up [0,4,2] -> [5,0,1], acting [0,4,2] -> [5,0,1], acting_primary 0 -> 5, up_primary 0 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: 
osd.2 pg_epoch: 46 pg[6.17( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.835368156s) [5,0,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.587402344s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.838191032s) [5,0,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.590209961s@ mbc={}] start_peering_interval up [2,3,4] -> [5,0,1], acting [2,3,4] -> [5,0,1], acting_primary 2 -> 5, up_primary 2 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.10( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834941864s) [0,2,4] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.587158203s@ mbc={}] start_peering_interval up [0,4,2] -> [0,2,4], acting [0,4,2] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.10( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834882736s) [0,2,4] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.587158203s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.13( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.838109016s) [5,0,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.590209961s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834904671s) [3,5,4] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.587280273s@ mbc={}] 
start_peering_interval up [0,4,2] -> [3,5,4], acting [0,4,2] -> [3,5,4], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.837042809s) [5,1,3] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.589477539s@ mbc={}] start_peering_interval up [2,3,4] -> [5,1,3], acting [2,3,4] -> [5,1,3], acting_primary 2 -> 5, up_primary 2 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.16( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834480286s) [0,1,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.587036133s@ mbc={}] start_peering_interval up [0,4,2] -> [0,1,5], acting [0,4,2] -> [0,1,5], acting_primary 0 -> 0, up_primary 0 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.12( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.836954117s) [5,1,3] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.589477539s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.836494446s) [1,2,0] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.589233398s@ mbc={}] start_peering_interval up [2,3,4] -> [1,2,0], acting [2,3,4] -> [1,2,0], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.16( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834243774s) 
[0,1,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.587036133s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.12( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.833994865s) [5,4,0] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.586791992s@ mbc={}] start_peering_interval up [0,4,2] -> [5,4,0], acting [0,4,2] -> [5,4,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.11( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.836399078s) [1,2,0] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.589233398s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.12( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.833915710s) [5,4,0] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.586791992s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.833595276s) [3,2,1] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.586547852s@ mbc={}] start_peering_interval up [0,4,2] -> [3,2,1], acting [0,4,2] -> [3,2,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.11( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834869385s) [3,5,4] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.587280273s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.10( empty local-lis/les=42/43 
n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.837560654s) [4,5,0] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.590576172s@ mbc={}] start_peering_interval up [2,3,4] -> [4,5,0], acting [2,3,4] -> [4,5,0], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.10( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.837445259s) [4,5,0] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.590576172s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.c( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.842554092s) [3,1,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.595947266s@ mbc={}] start_peering_interval up [0,4,2] -> [3,1,5], acting [0,4,2] -> [3,1,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.13( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.833251953s) [3,2,1] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.586547852s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.835626602s) [4,2,3] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.588867188s@ mbc={}] start_peering_interval up [2,3,4] -> [4,2,3], acting [2,3,4] -> [4,2,3], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.c( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.842500687s) 
[3,1,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.595947266s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.f( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.835380554s) [4,2,3] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.588867188s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.d( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.841554642s) [1,3,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.595092773s@ mbc={}] start_peering_interval up [0,4,2] -> [1,3,2], acting [0,4,2] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.e( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.841343880s) [4,3,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.594970703s@ mbc={}] start_peering_interval up [0,4,2] -> [4,3,2], acting [0,4,2] -> [4,3,2], acting_primary 0 -> 4, up_primary 0 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.e( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.832862854s) [2,0,4] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.586547852s@ mbc={}] start_peering_interval up [2,3,4] -> [2,0,4], acting [2,3,4] -> [2,0,4], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.e( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.841270447s) [4,3,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 
1193.594970703s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.e( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.832862854s) [2,0,4] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1193.586547852s@ mbc={}] state: transitioning to Primary Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.d( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834959984s) [2,4,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.588867188s@ mbc={}] start_peering_interval up [2,3,4] -> [2,4,0], acting [2,3,4] -> [2,4,0], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.d( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.841462135s) [1,3,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.595092773s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.841094971s) [3,5,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.595092773s@ mbc={}] start_peering_interval up [0,4,2] -> [3,5,1], acting [0,4,2] -> [3,5,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.f( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.841048241s) [3,5,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.595092773s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834777832s) [3,4,2] r=2 
lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.588867188s@ mbc={}] start_peering_interval up [2,3,4] -> [3,4,2], acting [2,3,4] -> [3,4,2], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.d( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834959984s) [2,4,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1193.588867188s@ mbc={}] state: transitioning to Primary Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.3( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.841209412s) [4,5,0] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.595336914s@ mbc={}] start_peering_interval up [0,4,2] -> [4,5,0], acting [0,4,2] -> [4,5,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.c( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834661484s) [3,4,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.588867188s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.3( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.841119766s) [4,5,0] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.595336914s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.833509445s) [0,1,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.587768555s@ mbc={}] start_peering_interval up [2,3,4] -> [0,1,2], acting [2,3,4] -> [0,1,2], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> 2, features acting 
4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.3( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.833410263s) [0,1,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.587768555s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.2( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.840950966s) [1,3,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.595336914s@ mbc={}] start_peering_interval up [0,4,2] -> [1,3,2], acting [0,4,2] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831896782s) [4,3,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.586425781s@ mbc={}] start_peering_interval up [2,3,4] -> [4,3,5], acting [2,3,4] -> [4,3,5], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.1( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831829071s) [4,3,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.586425781s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.2( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.840855598s) [1,3,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.595336914s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.833280563s) [4,0,2] r=2 
lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.588012695s@ mbc={}] start_peering_interval up [2,3,4] -> [4,0,2], acting [2,3,4] -> [4,0,2], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.2( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.833249092s) [4,0,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.588012695s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.1( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.840673447s) [2,1,3] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.595458984s@ mbc={}] start_peering_interval up [0,4,2] -> [2,1,3], acting [0,4,2] -> [2,1,3], acting_primary 0 -> 2, up_primary 0 -> 2, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.1( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.840673447s) [2,1,3] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1193.595458984s@ mbc={}] state: transitioning to Primary Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.1b( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834815025s) [1,0,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.589965820s@ mbc={}] start_peering_interval up [2,3,4] -> [1,0,2], acting [2,3,4] -> [1,0,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.18( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.840824127s) [0,1,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.595947266s@ mbc={}] 
start_peering_interval up [0,4,2] -> [0,1,2], acting [0,4,2] -> [0,1,2], acting_primary 0 -> 0, up_primary 0 -> 0, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.7( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.840670586s) [4,3,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.595825195s@ mbc={}] start_peering_interval up [0,4,2] -> [4,3,2], acting [0,4,2] -> [4,3,2], acting_primary 0 -> 4, up_primary 0 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831583977s) [5,3,4] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.586669922s@ mbc={}] start_peering_interval up [2,3,4] -> [5,3,4], acting [2,3,4] -> [5,3,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.7( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.840625763s) [4,3,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.595825195s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.4( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831522942s) [5,3,4] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.586669922s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.18( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.840631485s) [0,1,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.595947266s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost 
ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.b( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834583282s) [5,0,4] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.589965820s@ mbc={}] start_peering_interval up [2,3,4] -> [5,0,4], acting [2,3,4] -> [5,0,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.839269638s) [1,2,3] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.594848633s@ mbc={}] start_peering_interval up [0,4,2] -> [1,2,3], acting [0,4,2] -> [1,2,3], acting_primary 0 -> 1, up_primary 0 -> 1, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.8( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.839148521s) [1,2,3] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.594848633s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.b( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834393501s) [5,0,4] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.589965820s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834494591s) [4,2,3] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.590209961s@ mbc={}] start_peering_interval up [2,3,4] -> [4,2,3], acting [2,3,4] -> [4,2,3], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.1b( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.840360641s) [5,1,0] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.596191406s@ mbc={}] start_peering_interval up [0,4,2] -> [5,1,0], acting [0,4,2] -> [5,1,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.1b( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834750175s) [1,0,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.589965820s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831048012s) [0,4,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.587036133s@ mbc={}] start_peering_interval up [2,3,4] -> [0,4,5], acting [2,3,4] -> [0,4,5], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.18( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.834384918s) [4,2,3] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.590209961s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.1b( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.840190887s) [5,1,0] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.596191406s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.5( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.830997467s) [0,4,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.587036133s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.5( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.838831902s) [4,2,0] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.595092773s@ mbc={}] start_peering_interval up [0,4,2] -> [4,2,0], acting [0,4,2] -> [4,2,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.5( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.838751793s) [4,2,0] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.595092773s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.6( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.839011192s) [3,4,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.595336914s@ mbc={}] start_peering_interval up [0,4,2] -> [3,4,5], acting [0,4,2] -> [3,4,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.b( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.837341309s) [3,1,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.593750000s@ mbc={}] start_peering_interval up [0,4,2] -> [3,1,2], acting [0,4,2] -> [3,1,2], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.6( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.838896751s) [3,4,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.595336914s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.833964348s) [2,4,3] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.590576172s@ mbc={}] start_peering_interval up [2,3,4] -> [2,4,3], acting [2,3,4] -> [2,4,3], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.b( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.837283134s) [3,1,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.593750000s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.1b( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,1,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.6( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.830921173s) [3,1,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.587524414s@ mbc={}] start_peering_interval up [2,3,4] -> [3,1,2], acting [2,3,4] -> [3,1,2], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.1a( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.833964348s) [2,4,3] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1193.590576172s@ mbc={}] state: transitioning to Primary
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.6( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.830874443s) [3,1,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.587524414s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.19( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.839150429s) [1,3,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.596069336s@ mbc={}] start_peering_interval up [0,4,2] -> [1,3,2], acting [0,4,2] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831374168s) [4,3,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.588256836s@ mbc={}] start_peering_interval up [2,3,4] -> [4,3,5], acting [2,3,4] -> [4,3,5], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.19( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.839101791s) [1,3,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.596069336s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.4( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.838333130s) [3,1,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.595214844s@ mbc={}] start_peering_interval up [0,4,2] -> [3,1,5], acting [0,4,2] -> [3,1,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.4( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.838268280s) [3,1,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.595214844s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.7( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831327438s) [4,3,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.588256836s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831075668s) [1,5,0] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.588256836s@ mbc={}] start_peering_interval up [2,3,4] -> [1,5,0], acting [2,3,4] -> [1,5,0], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.a( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.836670876s) [4,0,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.593872070s@ mbc={}] start_peering_interval up [0,4,2] -> [4,0,2], acting [0,4,2] -> [4,0,2], acting_primary 0 -> 4, up_primary 0 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.1f( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.981576920s) [0,1,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734985352s@ mbc={}] start_peering_interval up [5,4,0] -> [0,1,5], acting [5,4,0] -> [0,1,5], acting_primary 5 -> 0, up_primary 5 -> 0, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.9( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831021309s) [1,5,0] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.588256836s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.9( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.836584091s) [0,2,4] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.593872070s@ mbc={}] start_peering_interval up [0,4,2] -> [0,2,4], acting [0,4,2] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.a( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.836582184s) [4,0,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.593872070s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.1f( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.981451035s) [0,1,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734985352s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.9( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.836521149s) [0,2,4] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.593872070s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.838540077s) [3,5,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.596069336s@ mbc={}] start_peering_interval up [0,4,2] -> [3,5,1], acting [0,4,2] -> [3,5,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.1f( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.838509560s) [3,5,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.596069336s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.a( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.830760002s) [0,2,4] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.588256836s@ mbc={}] start_peering_interval up [2,3,4] -> [0,2,4], acting [2,3,4] -> [0,2,4], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.1c( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831855774s) [4,2,0] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.589477539s@ mbc={}] start_peering_interval up [2,3,4] -> [4,2,0], acting [2,3,4] -> [4,2,0], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.1e( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.838161469s) [5,1,3] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.595825195s@ mbc={}] start_peering_interval up [0,4,2] -> [5,1,3], acting [0,4,2] -> [5,1,3], acting_primary 0 -> 5, up_primary 0 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831745148s) [3,1,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.589477539s@ mbc={}] start_peering_interval up [2,3,4] -> [3,1,5], acting [2,3,4] -> [3,1,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.1d( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.980996132s) [5,4,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734741211s@ mbc={}] start_peering_interval up [5,4,0] -> [5,4,3], acting [5,4,0] -> [5,4,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.1e( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.838104248s) [5,1,3] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.595825195s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.1d( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831678391s) [3,1,5] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.589477539s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.8( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831061363s) [2,0,1] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.588867188s@ mbc={}] start_peering_interval up [2,3,4] -> [2,0,1], acting [2,3,4] -> [2,0,1], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.1d( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.980996132s) [5,4,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1186.734741211s@ mbc={}] state: transitioning to Primary
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.8( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831061363s) [2,0,1] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1193.588867188s@ mbc={}] state: transitioning to Primary
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.1d( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.837743759s) [3,5,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.595581055s@ mbc={}] start_peering_interval up [0,4,2] -> [3,5,1], acting [0,4,2] -> [3,5,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.830633163s) [0,1,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.588745117s@ mbc={}] start_peering_interval up [2,3,4] -> [0,1,2], acting [2,3,4] -> [0,1,2], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.1d( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.837689400s) [3,5,1] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.595581055s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.1a( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.978971481s) [5,3,4] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.733276367s@ mbc={}] start_peering_interval up [5,4,0] -> [5,3,4], acting [5,4,0] -> [5,3,4], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.1e( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.830577850s) [0,1,2] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.588745117s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.1a( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.978971481s) [5,3,4] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1186.733276367s@ mbc={}] state: transitioning to Primary
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.1b( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.975111961s) [5,4,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.729736328s@ mbc={}] start_peering_interval up [5,4,0] -> [5,4,3], acting [5,4,0] -> [5,4,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.a( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.830696106s) [0,2,4] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.588256836s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.1f( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831474304s) [4,5,3] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.589599609s@ mbc={}] start_peering_interval up [2,3,4] -> [4,5,3], acting [2,3,4] -> [4,5,3], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.1f( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831237793s) [4,5,3] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.589599609s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[3.e( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,4,0] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.1c( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.828446388s) [5,3,4] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active pruub 1193.586547852s@ mbc={}] start_peering_interval up [0,4,2] -> [5,3,4], acting [0,4,2] -> [5,3,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[6.1c( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.828015327s) [5,3,4] r=-1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.586547852s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[5.1c( empty local-lis/les=42/43 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=14.831803322s) [4,2,0] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.589477539s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[3.8( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,0,4] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.1b( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.975111961s) [5,4,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1186.729736328s@ mbc={}] state: transitioning to Primary
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.1e( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,1,3] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.9( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.979107857s) [5,1,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734375000s@ mbc={}] start_peering_interval up [5,4,0] -> [5,1,3], acting [5,4,0] -> [5,1,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.9( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.979107857s) [5,1,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1186.734375000s@ mbc={}] state: transitioning to Primary
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.6( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.979057312s) [0,4,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734375000s@ mbc={}] start_peering_interval up [5,4,0] -> [0,4,5], acting [5,4,0] -> [0,4,5], acting_primary 5 -> 0, up_primary 5 -> 0, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.1( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.978583336s) [0,4,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734008789s@ mbc={}] start_peering_interval up [5,4,0] -> [0,4,2], acting [5,4,0] -> [0,4,2], acting_primary 5 -> 0, up_primary 5 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.6( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.978943825s) [0,4,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734375000s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.1( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.978516579s) [0,4,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734008789s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[5.4( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,3,4] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[5.b( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,0,4] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.12( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.979392052s) [0,4,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.736083984s@ mbc={}] start_peering_interval up [5,4,0] -> [0,4,5], acting [5,4,0] -> [0,4,5], acting_primary 5 -> 0, up_primary 5 -> 0, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.12( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.979353905s) [0,4,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.736083984s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.17( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,0,1] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.15( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.975153923s) [2,1,0] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.732910156s@ mbc={}] start_peering_interval up [5,4,0] -> [2,1,0], acting [5,4,0] -> [2,1,0], acting_primary 5 -> 2, up_primary 5 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[5.13( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,0,1] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.15( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.975037575s) [2,1,0] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.732910156s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[5.12( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,1,3] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.17( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.974172592s) [0,4,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.732666016s@ mbc={}] start_peering_interval up [5,4,0] -> [0,4,5], acting [5,4,0] -> [0,4,5], acting_primary 5 -> 0, up_primary 5 -> 0, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.12( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,4,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.17( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.974113464s) [0,4,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.732666016s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.19( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.973925591s) [0,1,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.732788086s@ mbc={}] start_peering_interval up [5,4,0] -> [0,1,2], acting [5,4,0] -> [0,1,2], acting_primary 5 -> 0, up_primary 5 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.1c( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,3,4] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.1e( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.973986626s) [0,4,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.733398438s@ mbc={}] start_peering_interval up [3,4,5] -> [0,4,5], acting [3,4,5] -> [0,4,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.1e( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.973930359s) [0,4,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.733398438s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.11( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.973855972s) [3,4,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.733520508s@ mbc={}] start_peering_interval up [3,4,5] -> [3,4,2], acting [3,4,5] -> [3,4,2], acting_primary 3 -> 3, up_primary 3 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.18( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.026166916s) [5,1,3] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.785888672s@ mbc={}] start_peering_interval up [4,5,3] -> [5,1,3], acting [4,5,3] -> [5,1,3], acting_primary 4 -> 5, up_primary 4 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.11( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.973798752s) [3,4,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.733520508s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.18( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.026166916s) [5,1,3] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.785888672s@ mbc={}] state: transitioning to Primary
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.17( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.030043602s) [1,5,3] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.789794922s@ mbc={}] start_peering_interval up [4,5,3] -> [1,5,3], acting [4,5,3] -> [1,5,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.17( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.030009270s) [1,5,3] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.789794922s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.16( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.972895622s) [1,3,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.732788086s@ mbc={}] start_peering_interval up [5,4,0] -> [1,3,5], acting [5,4,0] -> [1,3,5], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.16( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.972863197s) [1,3,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.732788086s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.10( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.973284721s) [3,2,4] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.733398438s@ mbc={}] start_peering_interval up [3,4,5] -> [3,2,4], acting [3,4,5] -> [3,2,4], acting_primary 3 -> 3, up_primary 3 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.10( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.973226547s) [3,2,4] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.733398438s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.13( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.973079681s) [4,2,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.733398438s@ mbc={}] start_peering_interval up [3,4,5] -> [4,2,3], acting [3,4,5] -> [4,2,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.16( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.030028343s) [1,2,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.790405273s@ mbc={}] start_peering_interval up [4,5,3] -> [1,2,0], acting [4,5,3] -> [1,2,0], acting_primary 4 -> 1, up_primary 4 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.15( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.029592514s) [5,0,4] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.790039062s@ mbc={}] start_peering_interval up [4,5,3] -> [5,0,4], acting [4,5,3] -> [5,0,4], acting_primary 4 -> 5, up_primary 4 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.15( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.029592514s) [5,0,4] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.790039062s@ mbc={}] state: transitioning to Primary
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.16( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.029875755s) [1,2,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.790405273s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.13( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,4,3] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.13( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.972949028s) [4,2,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.733398438s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.14( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.972229958s) [1,2,0] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.732910156s@ mbc={}] start_peering_interval up [5,4,0] -> [1,2,0], acting [5,4,0] -> [1,2,0], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.14( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.972191811s) [1,2,0] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.732910156s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.19( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.973726273s) [0,1,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.732788086s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.12( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.972481728s) [0,5,4] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.733398438s@ mbc={}] start_peering_interval up [3,4,5] -> [0,5,4], acting [3,4,5] -> [0,5,4], acting_primary 3 -> 0, up_primary 3 -> 0, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[4.19( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,3,1] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.12( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.972426414s) [0,5,4] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.733398438s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.14( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.025187492s) [4,2,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.786132812s@ mbc={}] start_peering_interval up [4,5,3] -> [4,2,0], acting [4,5,3] -> [4,2,0], acting_primary 4 -> 4, up_primary 4 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.14( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.025151253s) [4,2,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.786132812s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.13( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.029186249s) [2,4,3] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.790527344s@ mbc={}] start_peering_interval up [4,5,3] -> [2,4,3], acting [4,5,3] -> [2,4,3], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.14( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.972023964s) [5,0,1] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.733398438s@ mbc={}] start_peering_interval up [3,4,5] -> [5,0,1], acting [3,4,5] -> [5,0,1], acting_primary 3 -> 5, up_primary 3 -> 5, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.10( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,0,4] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.13( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.029132843s) [2,4,3] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.790527344s@ mbc={}] state: transitioning to Stray
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.14( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.972023964s) [5,0,1] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1186.733398438s@ mbc={}] state: transitioning to Primary
Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.12( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.028998375s) [5,3,1] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.790527344s@ mbc={}] start_peering_interval up [4,5,3] -> [5,3,1], acting [4,5,3] -> [5,3,1], acting_primary 4 -> 5, up_primary 4 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.1d( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0
sis=46) [2,4,0] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.f( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,4,0] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.12( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.028998375s) [5,3,1] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.790527344s@ mbc={}] state: transitioning to Primary Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.1c( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,1,0] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.17( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.972898483s) [3,1,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734497070s@ mbc={}] start_peering_interval up [3,4,5] -> [3,1,5], acting [3,4,5] -> [3,1,5], acting_primary 3 -> 3, up_primary 3 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.13( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.971303940s) [1,3,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.732910156s@ mbc={}] start_peering_interval up [5,4,0] -> [1,3,2], acting [5,4,0] -> [1,3,2], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.17( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.972848892s) [3,1,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown 
NOTIFY pruub 1186.734497070s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.11( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.029056549s) [4,3,2] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.790893555s@ mbc={}] start_peering_interval up [4,5,3] -> [4,3,2], acting [4,5,3] -> [4,3,2], acting_primary 4 -> 4, up_primary 4 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.c( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.972228050s) [4,3,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734008789s@ mbc={}] start_peering_interval up [5,4,0] -> [4,3,5], acting [5,4,0] -> [4,3,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.11( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.029020309s) [4,3,2] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.790893555s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[4.1d( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,1,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.c( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.972180367s) [4,3,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734008789s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.10( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.972114563s) [1,5,3] r=1 lpr=46 pi=[40,46)/1 crt=0'0 
mlcod 0'0 active pruub 1186.734008789s@ mbc={}] start_peering_interval up [5,4,0] -> [1,5,3], acting [5,4,0] -> [1,5,3], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.16( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.971117973s) [0,4,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.733154297s@ mbc={}] start_peering_interval up [3,4,5] -> [0,4,5], acting [3,4,5] -> [0,4,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.10( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.972062111s) [1,5,3] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734008789s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.16( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.971092224s) [0,4,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.733154297s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.13( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.971239090s) [1,3,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.732910156s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[4.1c( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,3,4] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.10( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.029185295s) [2,0,4] r=-1 lpr=46 
pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.791259766s@ mbc={}] start_peering_interval up [4,5,3] -> [2,0,4], acting [4,5,3] -> [2,0,4], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.10( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.029110909s) [2,0,4] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.791259766s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.9( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.971055984s) [5,0,1] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.733398438s@ mbc={}] start_peering_interval up [3,4,5] -> [5,0,1], acting [3,4,5] -> [5,0,1], acting_primary 3 -> 5, up_primary 3 -> 5, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.9( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.971055984s) [5,0,1] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1186.733398438s@ mbc={}] state: transitioning to Primary Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[4.1f( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,4,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.f( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.028552055s) [2,4,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.791015625s@ mbc={}] start_peering_interval up [4,5,3] -> [2,4,0], acting [4,5,3] -> [2,4,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 
localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.f( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.028477669s) [2,4,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.791015625s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.e( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.971155167s) [2,4,0] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734008789s@ mbc={}] start_peering_interval up [5,4,0] -> [2,4,0], acting [5,4,0] -> [2,4,0], acting_primary 5 -> 2, up_primary 5 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[4.3( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,4,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.8( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.970267296s) [5,4,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.733154297s@ mbc={}] start_peering_interval up [3,4,5] -> [5,4,3], acting [3,4,5] -> [5,4,3], acting_primary 3 -> 5, up_primary 3 -> 5, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.e( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.028509140s) [3,2,4] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.791381836s@ mbc={}] start_peering_interval up [4,5,3] -> [3,2,4], acting [4,5,3] -> [3,2,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.e( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 
sis=46 pruub=12.971103668s) [2,4,0] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734008789s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.8( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.970267296s) [5,4,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1186.733154297s@ mbc={}] state: transitioning to Primary Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.e( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.028468132s) [3,2,4] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.791381836s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.c( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,0,1] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.b( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.862487793s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1182.625488281s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.b( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.862393379s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1182.625488281s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.5( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,0,1] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: 
transitioning to Primary Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.f( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.971565247s) [1,5,0] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734619141s@ mbc={}] start_peering_interval up [5,4,0] -> [1,5,0], acting [5,4,0] -> [1,5,0], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.a( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,3,1] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.f( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.971468925s) [1,5,0] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734619141s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.d( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.028242111s) [5,1,3] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.791503906s@ mbc={}] start_peering_interval up [4,5,3] -> [5,1,3], acting [4,5,3] -> [5,1,3], acting_primary 4 -> 5, up_primary 4 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.b( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.971169472s) [0,1,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734497070s@ mbc={}] start_peering_interval up [3,4,5] -> [0,1,5], acting [3,4,5] -> [0,1,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[4.2( empty local-lis/les=0/0 n=0 
ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,1,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.d( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.028242111s) [5,1,3] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.791503906s@ mbc={}] state: transitioning to Primary Nov 28 03:08:13 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[4.1( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,1,0] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.b( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.971089363s) [0,1,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734497070s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.9( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.861575127s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1182.625000000s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.9( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.861518860s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1182.625000000s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.a( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.969545364s) [1,0,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 
1186.733276367s@ mbc={}] start_peering_interval up [3,4,5] -> [1,0,2], acting [3,4,5] -> [1,0,2], acting_primary 3 -> 1, up_primary 3 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.d( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.970258713s) [1,2,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.733886719s@ mbc={}] start_peering_interval up [5,4,0] -> [1,2,3], acting [5,4,0] -> [1,2,3], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.c( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.027631760s) [2,0,1] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.791381836s@ mbc={}] start_peering_interval up [4,5,3] -> [2,0,1], acting [4,5,3] -> [2,0,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.c( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.027595520s) [2,0,1] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.791381836s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.a( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.969427109s) [1,0,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.733276367s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.d( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.970044136s) [1,2,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.733886719s@ mbc={}] state: transitioning to Stray Nov 28 
03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.d( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.968944550s) [4,2,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.732788086s@ mbc={}] start_peering_interval up [3,4,5] -> [4,2,3], acting [3,4,5] -> [4,2,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.b( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.027610779s) [5,1,0] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.791625977s@ mbc={}] start_peering_interval up [4,5,3] -> [5,1,0], acting [4,5,3] -> [5,1,0], acting_primary 4 -> 5, up_primary 4 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.d( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.968889236s) [4,2,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.732788086s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.b( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.027610779s) [5,1,0] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.791625977s@ mbc={}] state: transitioning to Primary Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.a( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.970264435s) [4,3,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734375000s@ mbc={}] start_peering_interval up [5,4,0] -> [4,3,5], acting [5,4,0] -> [4,3,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.a( empty 
local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.970205307s) [4,3,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734375000s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.860547066s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1182.624877930s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.860388756s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1182.624877930s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.15( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.968656540s) [5,3,1] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.733276367s@ mbc={}] start_peering_interval up [3,4,5] -> [5,3,1], acting [3,4,5] -> [5,3,1], acting_primary 3 -> 5, up_primary 3 -> 5, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.15( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.968656540s) [5,3,1] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1186.733276367s@ mbc={}] state: transitioning to Primary Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.a( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.026748657s) [2,3,1] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.791381836s@ mbc={}] 
start_peering_interval up [4,5,3] -> [2,3,1], acting [4,5,3] -> [2,3,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.a( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.026714325s) [2,3,1] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.791381836s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.c( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.968041420s) [4,3,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.732788086s@ mbc={}] start_peering_interval up [3,4,5] -> [4,3,2], acting [3,4,5] -> [4,3,2], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.c( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967896461s) [4,3,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.732788086s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.b( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.969691277s) [3,4,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734619141s@ mbc={}] start_peering_interval up [5,4,0] -> [3,4,5], acting [5,4,0] -> [3,4,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.f( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.968936920s) [3,2,4] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.733886719s@ mbc={}] start_peering_interval up [3,4,5] -> [3,2,4], acting [3,4,5] -> [3,2,4], 
acting_primary 3 -> 3, up_primary 3 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.9( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.027315140s) [3,4,5] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.792358398s@ mbc={}] start_peering_interval up [4,5,3] -> [3,4,5], acting [4,5,3] -> [3,4,5], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.b( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.969641685s) [3,4,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734619141s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.5( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967989922s) [1,5,0] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.733154297s@ mbc={}] start_peering_interval up [3,4,5] -> [1,5,0], acting [3,4,5] -> [1,5,0], acting_primary 3 -> 1, up_primary 3 -> 1, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.8( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.969229698s) [2,0,4] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734375000s@ mbc={}] start_peering_interval up [5,4,0] -> [2,0,4], acting [5,4,0] -> [2,0,4], acting_primary 5 -> 2, up_primary 5 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.9( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.027246475s) [3,4,5] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 
1184.792358398s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.5( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967956543s) [1,5,0] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.733154297s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.8( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.969168663s) [2,0,4] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734375000s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.3( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.026410103s) [4,3,5] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.791625977s@ mbc={}] start_peering_interval up [4,5,3] -> [4,3,5], acting [4,5,3] -> [4,3,5], acting_primary 4 -> 4, up_primary 4 -> 4, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.5( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.859489441s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1182.624877930s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.3( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.026229858s) [4,3,5] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.791625977s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.5( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 
les/c/f=45/45/0 sis=46 pruub=8.859434128s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1182.624877930s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.6( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.966850281s) [5,3,4] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.732421875s@ mbc={}] start_peering_interval up [3,4,5] -> [5,3,4], acting [3,4,5] -> [5,3,4], acting_primary 3 -> 5, up_primary 3 -> 5, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.6( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.966850281s) [5,3,4] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1186.732421875s@ mbc={}] state: transitioning to Primary Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.f( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.968319893s) [3,2,4] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.733886719s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.7( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.966467857s) [0,5,4] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.732177734s@ mbc={}] start_peering_interval up [3,4,5] -> [0,5,4], acting [3,4,5] -> [0,5,4], acting_primary 3 -> 0, up_primary 3 -> 0, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.7( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.966419220s) [0,5,4] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.732177734s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 
pg_epoch: 46 pg[2.1( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.025950432s) [3,5,4] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.791748047s@ mbc={}] start_peering_interval up [4,5,3] -> [3,5,4], acting [4,5,3] -> [3,5,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.1( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.025921822s) [3,5,4] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.791748047s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.3( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.855010033s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1182.621093750s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.7( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.968351364s) [3,5,4] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734375000s@ mbc={}] start_peering_interval up [5,4,0] -> [3,5,4], acting [5,4,0] -> [3,5,4], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.3( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.965216637s) [2,4,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.731445312s@ mbc={}] start_peering_interval up [3,4,5] -> [2,4,3], acting [3,4,5] -> [2,4,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> -1, 
features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.6( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.025517464s) [3,2,4] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.791625977s@ mbc={}] start_peering_interval up [4,5,3] -> [3,2,4], acting [4,5,3] -> [3,2,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.3( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.965181351s) [2,4,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.731445312s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.6( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.025468826s) [3,2,4] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.791625977s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.5( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.026338577s) [2,0,1] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.792724609s@ mbc={}] start_peering_interval up [4,5,3] -> [2,0,1], acting [4,5,3] -> [2,0,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.2( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967695236s) [3,5,1] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734008789s@ mbc={}] start_peering_interval up [5,4,0] -> [3,5,1], acting [5,4,0] -> [3,5,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 
03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.3( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.854744911s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1182.621093750s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.4( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967777252s) [3,2,1] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734252930s@ mbc={}] start_peering_interval up [5,4,0] -> [3,2,1], acting [5,4,0] -> [3,2,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.7( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967906952s) [3,5,4] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734375000s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.2( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967460632s) [3,5,1] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734008789s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.5( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.026021004s) [2,0,1] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.792724609s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.4( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967555046s) [3,2,1] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734252930s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.7( v 
36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.854517937s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1182.621337891s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.4( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.965051651s) [0,5,1] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.731933594s@ mbc={}] start_peering_interval up [3,4,5] -> [0,5,1], acting [3,4,5] -> [0,5,1], acting_primary 3 -> 0, up_primary 3 -> 0, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.7( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.854477882s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1182.621337891s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.4( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.965014458s) [0,5,1] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.731933594s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.2( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.025288582s) [1,0,2] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.792236328s@ mbc={}] start_peering_interval up [4,5,3] -> [1,0,2], acting [4,5,3] -> [1,0,2], acting_primary 4 -> 1, up_primary 4 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.3( empty 
local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967137337s) [4,0,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734130859s@ mbc={}] start_peering_interval up [5,4,0] -> [4,0,5], acting [5,4,0] -> [4,0,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.2( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.025241852s) [1,0,2] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.792236328s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.1( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.853694916s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1182.620727539s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.2( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.966079712s) [2,1,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.733154297s@ mbc={}] start_peering_interval up [3,4,5] -> [2,1,3], acting [3,4,5] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.4( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.025568962s) [3,2,1] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.792602539s@ mbc={}] start_peering_interval up [4,5,3] -> [3,2,1], acting [4,5,3] -> [3,2,1], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> -1, features acting 
4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.3( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.967079163s) [4,0,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734130859s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.2( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.966045380s) [2,1,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.733154297s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.4( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.025537491s) [3,2,1] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.792602539s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.1( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.853642464s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1182.620727539s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.5( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.966808319s) [4,3,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734008789s@ mbc={}] start_peering_interval up [5,4,0] -> [4,3,5], acting [5,4,0] -> [4,3,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.5( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.966762543s) [4,3,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.734008789s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 
localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.7( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.024511337s) [4,2,3] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.791870117s@ mbc={}] start_peering_interval up [4,5,3] -> [4,2,3], acting [4,5,3] -> [4,2,3], acting_primary 4 -> 4, up_primary 4 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.7( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.024435043s) [4,2,3] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.791870117s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.853452682s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1182.620971680s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.e( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.964606285s) [4,5,0] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.732177734s@ mbc={}] start_peering_interval up [3,4,5] -> [4,5,0], acting [3,4,5] -> [4,5,0], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46 pruub=8.853414536s) [4,2,3] r=-1 lpr=46 pi=[44,46)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1182.620971680s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost 
ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.8( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.024182320s) [1,2,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.791748047s@ mbc={}] start_peering_interval up [4,5,3] -> [1,2,0], acting [4,5,3] -> [1,2,0], acting_primary 4 -> 1, up_primary 4 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.e( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.964551926s) [4,5,0] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.732177734s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.8( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.024150848s) [1,2,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.791748047s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.1( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.964995384s) [2,1,0] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.732788086s@ mbc={}] start_peering_interval up [3,4,5] -> [2,1,0], acting [3,4,5] -> [2,1,0], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.18( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.964865685s) [3,2,1] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.732666016s@ mbc={}] start_peering_interval up [5,4,0] -> [3,2,1], acting [5,4,0] -> [3,2,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.1c( empty local-lis/les=40/41 
n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.964897156s) [2,3,4] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.732666016s@ mbc={}] start_peering_interval up [3,4,5] -> [2,3,4], acting [3,4,5] -> [2,3,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.1( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.964949608s) [2,1,0] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.732788086s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.19( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.024235725s) [3,4,2] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.792236328s@ mbc={}] start_peering_interval up [4,5,3] -> [3,4,2], acting [4,5,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.18( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.964830399s) [3,2,1] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.732666016s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.19( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.024185181s) [3,4,2] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.792236328s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.1c( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.964869499s) [2,3,4] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.732666016s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost 
ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.1f( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.963286400s) [2,4,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.731445312s@ mbc={}] start_peering_interval up [3,4,5] -> [2,4,3], acting [3,4,5] -> [2,4,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.1d( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.963726044s) [2,1,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.731933594s@ mbc={}] start_peering_interval up [3,4,5] -> [2,1,3], acting [3,4,5] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.1f( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.963196754s) [2,4,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.731445312s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.1a( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.024097443s) [1,5,3] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.792358398s@ mbc={}] start_peering_interval up [4,5,3] -> [1,5,3], acting [4,5,3] -> [1,5,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.1b( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.024379730s) [5,4,3] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.792724609s@ mbc={}] start_peering_interval up [4,5,3] -> [5,4,3], acting [4,5,3] -> [5,4,3], acting_primary 4 -> 5, up_primary 4 -> 5, role 1 -> 0, 
features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.1d( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.963662148s) [2,1,3] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.731933594s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.1a( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.962235451s) [4,3,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.730468750s@ mbc={}] start_peering_interval up [3,4,5] -> [4,3,5], acting [3,4,5] -> [4,3,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.1b( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.024379730s) [5,4,3] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.792724609s@ mbc={}] state: transitioning to Primary Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.1a( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.962183952s) [4,3,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.730468750s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.1c( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.024049759s) [2,1,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.792602539s@ mbc={}] start_peering_interval up [4,5,3] -> [2,1,0], acting [4,5,3] -> [2,1,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.1b( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.961515427s) 
[4,3,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.730224609s@ mbc={}] start_peering_interval up [3,4,5] -> [4,3,5], acting [3,4,5] -> [4,3,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.1c( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.023998260s) [2,1,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.792602539s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.1c( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.966394424s) [1,3,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.735107422s@ mbc={}] start_peering_interval up [5,4,0] -> [1,3,2], acting [5,4,0] -> [1,3,2], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.1a( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.023966789s) [1,5,3] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.792358398s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.1b( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.961445808s) [4,3,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.730224609s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.1d( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.023722649s) [2,4,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.792602539s@ mbc={}] start_peering_interval up [4,5,3] -> [2,4,0], acting [4,5,3] -> [2,4,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features 
acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.18( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.959779739s) [4,3,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.728759766s@ mbc={}] start_peering_interval up [3,4,5] -> [4,3,5], acting [3,4,5] -> [4,3,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.1c( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.966338158s) [1,3,2] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.735107422s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.18( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.959726334s) [4,3,5] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.728759766s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.1d( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.023674965s) [2,4,0] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.792602539s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.19( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.960644722s) [2,3,1] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.729858398s@ mbc={}] start_peering_interval up [3,4,5] -> [2,3,1], acting [3,4,5] -> [2,3,1], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.1e( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.023437500s) 
[3,5,4] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.792724609s@ mbc={}] start_peering_interval up [4,5,3] -> [3,5,4], acting [4,5,3] -> [3,5,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.1e( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.023385048s) [3,5,4] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.792724609s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[4.19( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.960531235s) [2,3,1] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1186.729858398s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.1e( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.965315819s) [3,2,4] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active pruub 1186.734985352s@ mbc={}] start_peering_interval up [5,4,0] -> [3,2,4], acting [5,4,0] -> [3,2,4], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.1f( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.022989273s) [0,4,2] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active pruub 1184.792724609s@ mbc={}] start_peering_interval up [4,5,3] -> [0,4,2], acting [4,5,3] -> [0,4,2], acting_primary 4 -> 0, up_primary 4 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[3.1e( empty local-lis/les=40/41 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46 pruub=12.965266228s) [3,2,4] r=-1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY 
pruub 1186.734985352s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[2.1f( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46 pruub=11.022940636s) [0,4,2] r=-1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.792724609s@ mbc={}] state: transitioning to Stray Nov 28 03:08:13 localhost python3[55500]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[7.b( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=1 lpr=46 pi=[44,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.e( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [3,2,4] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[3.1( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [0,4,2] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.1f( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [0,4,2] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[7.9( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=1 lpr=46 pi=[44,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.6( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [3,2,4] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown 
mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[4.f( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [3,2,4] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.14( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [4,2,0] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.11( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [4,3,2] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[4.d( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [4,2,3] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[7.f( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=1 lpr=46 pi=[44,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[3.4( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [3,2,1] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.1f( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,5,1] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[4.c( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [4,3,2] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: 
osd.2 pg_epoch: 46 pg[3.18( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [3,2,1] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[5.1d( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,1,5] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.19( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [3,4,2] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.7( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [4,2,3] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.16( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [0,1,5] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.4( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [3,2,1] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.1d( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,5,1] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[7.1( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=1 lpr=46 pi=[44,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.c( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 
les/c/f=43/43/0 sis=46) [3,1,5] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[3.1e( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [3,2,4] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[7.3( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=1 lpr=46 pi=[44,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[7.5( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=1 lpr=46 pi=[44,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.6( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,4,5] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost python3[55543]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317293.6414745-92600-22399435218611/source dest=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring mode=600 _original_basename=ceph.client.manila.keyring follow=False checksum=880d8421ed22fd6e089f5c7c842f51482074b0c0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[5.5( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [0,4,5] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[3.19( empty local-lis/les=0/0 
n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [0,1,2] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.3( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,5,0] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[7.7( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=1 lpr=46 pi=[44,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[5.1( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,3,5] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[7.d( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=46) [4,2,3] r=1 lpr=46 pi=[44,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[4.13( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [4,2,3] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[5.7( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,3,5] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.4( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,1,5] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[3.1c( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [1,3,2] r=2 lpr=46 pi=[40,46)/1 
crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[4.11( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [3,4,2] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.2( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [1,0,2] r=2 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.15( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,5,0] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[4.10( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [3,2,4] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[4.a( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [1,0,2] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[5.15( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,3,5] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[3.d( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [1,2,3] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.8( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [1,2,0] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 
localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[3.14( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [1,2,0] r=1 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[3.13( empty local-lis/les=0/0 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [1,3,2] r=2 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 46 pg[2.16( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [1,2,0] r=1 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[5.1f( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,5,3] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[5.10( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,5,0] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[5.19( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [0,1,5] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 47 pg[5.b( empty local-lis/les=46/47 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,0,4] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.f( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,5,1] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 47 pg[6.12( 
empty local-lis/les=46/47 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,4,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[5.17( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,5,4] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 47 pg[2.15( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [5,0,4] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.14( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,4,5] r=2 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[6.11( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [3,5,4] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 47 pg[4.8( empty local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [5,4,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 47 pg[5.4( empty local-lis/les=46/47 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,3,4] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 47 pg[4.6( empty local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [5,3,4] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost 
ceph-osd[32506]: osd.5 pg_epoch: 47 pg[2.1b( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [5,4,3] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 47 pg[3.1a( empty local-lis/les=46/47 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [5,3,4] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 47 pg[3.1d( empty local-lis/les=46/47 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [5,4,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 47 pg[3.1b( empty local-lis/les=46/47 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [5,4,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 47 pg[6.1c( empty local-lis/les=46/47 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,3,4] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 46 pg[5.9( empty local-lis/les=0/0 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [1,5,0] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 47 pg[6.1e( empty local-lis/les=46/47 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,1,3] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 47 pg[6.17( empty local-lis/les=46/47 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,0,1] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 
active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 47 pg[4.15( empty local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [5,3,1] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 47 pg[2.d( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [5,1,3] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 47 pg[3.9( empty local-lis/les=46/47 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [5,1,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 47 pg[2.12( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [5,3,1] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 47 pg[5.13( empty local-lis/les=46/47 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,0,1] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 47 pg[4.14( empty local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [5,0,1] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 47 pg[2.b( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [5,1,0] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 47 pg[4.9( empty 
local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [5,0,1] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 47 pg[6.1b( empty local-lis/les=46/47 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,1,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 47 pg[4.1c( empty local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,3,4] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 47 pg[5.12( empty local-lis/les=46/47 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [5,1,3] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 47 pg[4.1f( empty local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,4,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 47 pg[3.8( empty local-lis/les=46/47 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,0,4] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 47 pg[2.f( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,4,0] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 47 pg[2.10( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,0,4] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react 
AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 47 pg[3.e( empty local-lis/les=46/47 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,4,0] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 47 pg[5.e( empty local-lis/les=46/47 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [2,0,4] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 47 pg[2.13( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,4,3] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[32506]: osd.5 pg_epoch: 47 pg[2.18( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [5,1,3] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 47 pg[5.1a( empty local-lis/les=46/47 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [2,4,3] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 47 pg[5.d( empty local-lis/les=46/47 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [2,4,0] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 47 pg[2.1d( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,4,0] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 47 pg[4.3( empty local-lis/les=46/47 n=0 
ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,4,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 47 pg[4.1d( empty local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,1,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 47 pg[2.1c( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,1,0] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 47 pg[4.19( empty local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,3,1] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 47 pg[2.c( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,0,1] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 47 pg[2.5( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,0,1] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 47 pg[4.2( empty local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,1,3] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 47 pg[2.a( empty local-lis/les=46/47 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=46) [2,3,1] r=0 lpr=46 pi=[38,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated 
Activating complete Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 47 pg[3.15( empty local-lis/les=46/47 n=0 ec=40/22 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,1,0] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 47 pg[4.1( empty local-lis/les=46/47 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=46) [2,1,0] r=0 lpr=46 pi=[40,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 47 pg[5.8( empty local-lis/les=46/47 n=0 ec=42/26 lis/c=42/42 les/c/f=43/43/0 sis=46) [2,0,1] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:14 localhost ceph-osd[31557]: osd.2 pg_epoch: 47 pg[6.1( empty local-lis/les=46/47 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46) [2,1,3] r=0 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:15 localhost ceph-osd[32506]: osd.5 pg_epoch: 48 pg[7.2( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=14.795694351s) [3,5,1] r=1 lpr=48 pi=[44,48)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1190.620849609s@ mbc={}] start_peering_interval up [1,5,3] -> [3,5,1], acting [1,5,3] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:15 localhost ceph-osd[32506]: osd.5 pg_epoch: 48 pg[7.6( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=14.800373077s) [3,5,1] r=1 lpr=48 pi=[44,48)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1190.625488281s@ mbc={}] start_peering_interval up [1,5,3] -> [3,5,1], acting [1,5,3] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 1 -> 1, features acting 
4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:15 localhost ceph-osd[32506]: osd.5 pg_epoch: 48 pg[7.e( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=14.799404144s) [3,5,1] r=1 lpr=48 pi=[44,48)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1190.624633789s@ mbc={}] start_peering_interval up [1,5,3] -> [3,5,1], acting [1,5,3] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:15 localhost ceph-osd[32506]: osd.5 pg_epoch: 48 pg[7.6( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=14.800303459s) [3,5,1] r=1 lpr=48 pi=[44,48)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1190.625488281s@ mbc={}] state: transitioning to Stray Nov 28 03:08:15 localhost ceph-osd[32506]: osd.5 pg_epoch: 48 pg[7.e( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=14.799332619s) [3,5,1] r=1 lpr=48 pi=[44,48)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1190.624633789s@ mbc={}] state: transitioning to Stray Nov 28 03:08:15 localhost ceph-osd[32506]: osd.5 pg_epoch: 48 pg[7.a( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=14.799490929s) [3,5,1] r=1 lpr=48 pi=[44,48)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1190.624755859s@ mbc={}] start_peering_interval up [1,5,3] -> [3,5,1], acting [1,5,3] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:15 localhost ceph-osd[32506]: osd.5 pg_epoch: 48 pg[7.a( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=14.799461365s) [3,5,1] r=1 lpr=48 pi=[44,48)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1190.624755859s@ mbc={}] state: transitioning to Stray Nov 28 03:08:15 localhost ceph-osd[32506]: 
osd.5 pg_epoch: 48 pg[7.2( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=14.795141220s) [3,5,1] r=1 lpr=48 pi=[44,48)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1190.620849609s@ mbc={}] state: transitioning to Stray Nov 28 03:08:19 localhost python3[55606]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:08:19 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.11 deep-scrub starts Nov 28 03:08:19 localhost python3[55649]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317299.1397874-92600-218611359240169/source dest=/var/lib/tripleo-config/ceph/ceph.conf mode=644 _original_basename=ceph.conf follow=False checksum=3f1634d98b90f8c800fba4d3a33fb1546a043fff backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:08:22 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.1d scrub starts Nov 28 03:08:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:08:22 localhost systemd[1]: tmp-crun.bzP97W.mount: Deactivated successfully. 
Nov 28 03:08:22 localhost podman[55664]: 2025-11-28 08:08:22.852700776 +0000 UTC m=+0.090409105 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, distribution-scope=public, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, architecture=x86_64, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 28 03:08:23 localhost podman[55664]: 2025-11-28 08:08:23.046369528 +0000 UTC m=+0.284077857 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, version=17.1.12) Nov 28 03:08:23 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. 
Nov 28 03:08:23 localhost ceph-osd[31557]: osd.2 pg_epoch: 50 pg[7.b( v 36'39 (0'0,36'39] local-lis/les=46/47 n=1 ec=44/33 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=14.753958702s) [3,4,2] r=2 lpr=50 pi=[46,50)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1203.783691406s@ mbc={}] start_peering_interval up [4,2,3] -> [3,4,2], acting [4,2,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:23 localhost ceph-osd[31557]: osd.2 pg_epoch: 50 pg[7.3( v 36'39 (0'0,36'39] local-lis/les=46/47 n=2 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=50 pruub=14.758782387s) [3,4,2] r=2 lpr=50 pi=[46,50)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1203.788940430s@ mbc={}] start_peering_interval up [4,2,3] -> [3,4,2], acting [4,2,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:23 localhost ceph-osd[31557]: osd.2 pg_epoch: 50 pg[7.3( v 36'39 (0'0,36'39] local-lis/les=46/47 n=2 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=50 pruub=14.758728027s) [3,4,2] r=2 lpr=50 pi=[46,50)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1203.788940430s@ mbc={}] state: transitioning to Stray Nov 28 03:08:23 localhost ceph-osd[31557]: osd.2 pg_epoch: 50 pg[7.7( v 36'39 (0'0,36'39] local-lis/les=46/47 n=1 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=50 pruub=14.763359070s) [3,4,2] r=2 lpr=50 pi=[46,50)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1203.793457031s@ mbc={}] start_peering_interval up [4,2,3] -> [3,4,2], acting [4,2,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:23 localhost ceph-osd[31557]: osd.2 pg_epoch: 50 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=46/47 n=1 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=50 pruub=14.758196831s) [3,4,2] r=2 lpr=50 pi=[46,50)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1203.788574219s@ mbc={}] 
start_peering_interval up [4,2,3] -> [3,4,2], acting [4,2,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:23 localhost ceph-osd[31557]: osd.2 pg_epoch: 50 pg[7.7( v 36'39 (0'0,36'39] local-lis/les=46/47 n=1 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=50 pruub=14.763068199s) [3,4,2] r=2 lpr=50 pi=[46,50)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1203.793457031s@ mbc={}] state: transitioning to Stray Nov 28 03:08:23 localhost ceph-osd[31557]: osd.2 pg_epoch: 50 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=46/47 n=1 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=50 pruub=14.758165359s) [3,4,2] r=2 lpr=50 pi=[46,50)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1203.788574219s@ mbc={}] state: transitioning to Stray Nov 28 03:08:23 localhost ceph-osd[31557]: osd.2 pg_epoch: 50 pg[7.b( v 36'39 (0'0,36'39] local-lis/les=46/47 n=1 ec=44/33 lis/c=46/46 les/c/f=47/47/0 sis=50 pruub=14.753871918s) [3,4,2] r=2 lpr=50 pi=[46,50)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1203.783691406s@ mbc={}] state: transitioning to Stray Nov 28 03:08:25 localhost ceph-osd[32506]: osd.5 pg_epoch: 52 pg[7.c( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=52 pruub=12.537527084s) [0,1,2] r=-1 lpr=52 pi=[44,52)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1198.625610352s@ mbc={}] start_peering_interval up [1,5,3] -> [0,1,2], acting [1,5,3] -> [0,1,2], acting_primary 1 -> 0, up_primary 1 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:25 localhost ceph-osd[32506]: osd.5 pg_epoch: 52 pg[7.4( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=52 pruub=12.537282944s) [0,1,2] r=-1 lpr=52 pi=[44,52)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1198.625366211s@ mbc={}] start_peering_interval up [1,5,3] -> [0,1,2], acting [1,5,3] -> [0,1,2], acting_primary 1 -> 0, up_primary 1 -> 0, 
role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:25 localhost ceph-osd[32506]: osd.5 pg_epoch: 52 pg[7.c( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=52 pruub=12.537391663s) [0,1,2] r=-1 lpr=52 pi=[44,52)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1198.625610352s@ mbc={}] state: transitioning to Stray Nov 28 03:08:25 localhost ceph-osd[32506]: osd.5 pg_epoch: 52 pg[7.4( v 36'39 (0'0,36'39] local-lis/les=44/45 n=2 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=52 pruub=12.537203789s) [0,1,2] r=-1 lpr=52 pi=[44,52)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1198.625366211s@ mbc={}] state: transitioning to Stray Nov 28 03:08:26 localhost python3[55742]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:08:26 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.1a scrub starts Nov 28 03:08:26 localhost python3[55787]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317305.7807572-93166-33917102108375/source _original_basename=tmp8_keiueq follow=False checksum=f17091ee142621a3c8290c8c96b5b52d67b3a864 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:08:27 localhost ceph-osd[31557]: osd.2 pg_epoch: 52 pg[7.4( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=52) [0,1,2] r=2 lpr=52 pi=[44,52)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:27 localhost ceph-osd[31557]: osd.2 pg_epoch: 52 pg[7.c( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=52) [0,1,2] r=2 lpr=52 
pi=[44,52)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:27 localhost python3[55849]: ansible-ansible.legacy.stat Invoked with path=/usr/local/sbin/containers-tmpwatch follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:08:28 localhost python3[55892]: ansible-ansible.legacy.copy Invoked with dest=/usr/local/sbin/containers-tmpwatch group=root mode=493 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317307.4584112-93255-250670734165250/source _original_basename=tmpzgnqv8_t follow=False checksum=84397b037dad9813fed388c4bcdd4871f384cd22 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:08:29 localhost python3[55922]: ansible-cron Invoked with job=/usr/local/sbin/containers-tmpwatch name=Remove old logs special_time=daily user=root state=present backup=False minute=* hour=* day=* month=* weekday=* disabled=False env=False cron_file=None insertafter=None insertbefore=None Nov 28 03:08:29 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 2.1b scrub starts Nov 28 03:08:29 localhost python3[55940]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_2 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 03:08:29 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 2.1b scrub ok Nov 28 03:08:31 localhost ansible-async_wrapper.py[56112]: Invoked with 478729855719 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317310.48299-93338-213523086718522/AnsiballZ_command.py _ Nov 28 03:08:31 localhost ansible-async_wrapper.py[56115]: Starting module and watcher Nov 28 03:08:31 localhost ansible-async_wrapper.py[56115]: Start watching 56116 (3600) Nov 28 03:08:31 localhost 
ansible-async_wrapper.py[56116]: Start module (56116) Nov 28 03:08:31 localhost ansible-async_wrapper.py[56112]: Return async_wrapper task started. Nov 28 03:08:31 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 5.d scrub starts Nov 28 03:08:31 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 5.d scrub ok Nov 28 03:08:31 localhost python3[56136]: ansible-ansible.legacy.async_status Invoked with jid=478729855719.56112 mode=status _async_dir=/tmp/.ansible_async Nov 28 03:08:32 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 5.e scrub starts Nov 28 03:08:32 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 5.e scrub ok Nov 28 03:08:32 localhost ceph-osd[31557]: osd.2 pg_epoch: 54 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=46/47 n=2 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=54 pruub=13.465290070s) [2,0,4] r=0 lpr=54 pi=[46,54)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1211.793701172s@ mbc={}] start_peering_interval up [4,2,3] -> [2,0,4], acting [4,2,3] -> [2,0,4], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:32 localhost ceph-osd[31557]: osd.2 pg_epoch: 54 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=46/47 n=2 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=54 pruub=13.465290070s) [2,0,4] r=0 lpr=54 pi=[46,54)/1 crt=36'39 mlcod 0'0 unknown pruub 1211.793701172s@ mbc={}] state: transitioning to Primary Nov 28 03:08:32 localhost ceph-osd[31557]: osd.2 pg_epoch: 54 pg[7.5( v 36'39 (0'0,36'39] local-lis/les=46/47 n=2 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=54 pruub=13.464475632s) [2,0,4] r=0 lpr=54 pi=[46,54)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1211.793334961s@ mbc={}] start_peering_interval up [4,2,3] -> [2,0,4], acting [4,2,3] -> [2,0,4], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:32 localhost ceph-osd[31557]: osd.2 pg_epoch: 54 pg[7.5( v 36'39 
(0'0,36'39] local-lis/les=46/47 n=2 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=54 pruub=13.464475632s) [2,0,4] r=0 lpr=54 pi=[46,54)/1 crt=36'39 mlcod 0'0 unknown pruub 1211.793334961s@ mbc={}] state: transitioning to Primary Nov 28 03:08:33 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 4.6 scrub starts Nov 28 03:08:33 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 4.6 scrub ok Nov 28 03:08:34 localhost ceph-osd[31557]: osd.2 pg_epoch: 55 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=54/55 n=2 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=54) [2,0,4] r=0 lpr=54 pi=[46,54)/1 crt=36'39 mlcod 0'0 active+degraded mbc={255={(2+1)=2}}] state: react AllReplicasActivated Activating complete Nov 28 03:08:34 localhost ceph-osd[31557]: osd.2 pg_epoch: 55 pg[7.5( v 36'39 (0'0,36'39] local-lis/les=54/55 n=2 ec=44/33 lis/c=46/46 les/c/f=47/49/0 sis=54) [2,0,4] r=0 lpr=54 pi=[46,54)/1 crt=36'39 mlcod 0'0 active+degraded mbc={255={(2+1)=2}}] state: react AllReplicasActivated Activating complete Nov 28 03:08:34 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 5.4 scrub starts Nov 28 03:08:34 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 5.1a deep-scrub starts Nov 28 03:08:34 localhost puppet-user[56134]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Nov 28 03:08:34 localhost puppet-user[56134]: (file: /etc/puppet/hiera.yaml) Nov 28 03:08:34 localhost puppet-user[56134]: Warning: Undefined variable '::deploy_config_name'; Nov 28 03:08:34 localhost puppet-user[56134]: (file & line not available) Nov 28 03:08:34 localhost puppet-user[56134]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 28 03:08:34 localhost puppet-user[56134]: (file & line not available) Nov 28 03:08:34 localhost puppet-user[56134]: Warning: Unknown variable: '::deployment_type'. 
(file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Nov 28 03:08:34 localhost puppet-user[56134]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Nov 28 03:08:34 localhost puppet-user[56134]: Notice: Compiled catalog for np0005538513.localdomain in environment production in 0.10 seconds Nov 28 03:08:34 localhost puppet-user[56134]: Notice: Applied catalog in 0.03 seconds Nov 28 03:08:34 localhost puppet-user[56134]: Application: Nov 28 03:08:34 localhost puppet-user[56134]: Initial environment: production Nov 28 03:08:34 localhost puppet-user[56134]: Converged environment: production Nov 28 03:08:34 localhost puppet-user[56134]: Run mode: user Nov 28 03:08:34 localhost puppet-user[56134]: Changes: Nov 28 03:08:34 localhost puppet-user[56134]: Events: Nov 28 03:08:34 localhost puppet-user[56134]: Resources: Nov 28 03:08:34 localhost puppet-user[56134]: Total: 10 Nov 28 03:08:34 localhost puppet-user[56134]: Time: Nov 28 03:08:34 localhost puppet-user[56134]: Schedule: 0.00 Nov 28 03:08:34 localhost puppet-user[56134]: File: 0.00 Nov 28 03:08:34 localhost puppet-user[56134]: Exec: 0.01 Nov 28 03:08:34 localhost puppet-user[56134]: Augeas: 0.01 Nov 28 03:08:34 localhost puppet-user[56134]: Transaction evaluation: 0.03 Nov 28 03:08:34 localhost puppet-user[56134]: Catalog application: 0.03 Nov 28 03:08:34 localhost puppet-user[56134]: Config retrieval: 0.13 Nov 28 03:08:34 localhost puppet-user[56134]: Last run: 1764317314 Nov 28 03:08:34 localhost puppet-user[56134]: Filebucket: 0.00 Nov 28 03:08:34 localhost puppet-user[56134]: Total: 0.04 Nov 28 03:08:34 localhost puppet-user[56134]: Version: Nov 28 03:08:34 localhost puppet-user[56134]: Config: 1764317314 Nov 28 03:08:34 localhost puppet-user[56134]: Puppet: 7.10.0 Nov 28 03:08:34 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 5.1a deep-scrub ok Nov 28 03:08:34 localhost 
ansible-async_wrapper.py[56116]: Module complete (56116) Nov 28 03:08:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 28 03:08:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.1 total, 600.0 interval#012Cumulative writes: 4244 writes, 20K keys, 4244 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4244 writes, 311 syncs, 13.65 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 990 writes, 3824 keys, 990 commit groups, 1.0 writes per commit group, ingest: 1.67 MB, 0.00 MB/s#012Interval WAL: 990 writes, 168 syncs, 5.89 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.005 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.005 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 
0.0 0.0 0.0 0.2 0.01 0.00 1 0.005 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) 
Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memt Nov 28 03:08:34 localhost ceph-osd[32506]: osd.5 pg_epoch: 56 pg[7.6( v 36'39 (0'0,36'39] local-lis/les=48/49 n=2 ec=44/33 lis/c=48/48 les/c/f=49/49/0 sis=56 pruub=13.480182648s) [0,4,5] r=2 lpr=56 pi=[48,56)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1208.854370117s@ mbc={}] start_peering_interval up [3,5,1] -> [0,4,5], acting [3,5,1] -> [0,4,5], acting_primary 3 -> 0, 
up_primary 3 -> 0, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:34 localhost ceph-osd[32506]: osd.5 pg_epoch: 56 pg[7.6( v 36'39 (0'0,36'39] local-lis/les=48/49 n=2 ec=44/33 lis/c=48/48 les/c/f=49/49/0 sis=56 pruub=13.480103493s) [0,4,5] r=2 lpr=56 pi=[48,56)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1208.854370117s@ mbc={}] state: transitioning to Stray Nov 28 03:08:34 localhost ceph-osd[32506]: osd.5 pg_epoch: 56 pg[7.e( v 36'39 (0'0,36'39] local-lis/les=48/49 n=1 ec=44/33 lis/c=48/48 les/c/f=49/49/0 sis=56 pruub=13.480300903s) [0,4,5] r=2 lpr=56 pi=[48,56)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1208.854492188s@ mbc={}] start_peering_interval up [3,5,1] -> [0,4,5], acting [3,5,1] -> [0,4,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:34 localhost ceph-osd[32506]: osd.5 pg_epoch: 56 pg[7.e( v 36'39 (0'0,36'39] local-lis/les=48/49 n=1 ec=44/33 lis/c=48/48 les/c/f=49/49/0 sis=56 pruub=13.480031013s) [0,4,5] r=2 lpr=56 pi=[48,56)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1208.854492188s@ mbc={}] state: transitioning to Stray Nov 28 03:08:35 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 3.e deep-scrub starts Nov 28 03:08:35 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 3.e deep-scrub ok Nov 28 03:08:36 localhost ansible-async_wrapper.py[56115]: Done in kid B. 
Nov 28 03:08:36 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 3.8 scrub starts Nov 28 03:08:38 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 6.1 scrub starts Nov 28 03:08:38 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 2.b scrub starts Nov 28 03:08:38 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 6.1 scrub ok Nov 28 03:08:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 28 03:08:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.3 total, 600.0 interval#012Cumulative writes: 4737 writes, 21K keys, 4737 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4737 writes, 417 syncs, 11.36 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1354 writes, 5212 keys, 1354 commit groups, 1.0 writes per commit group, ingest: 2.02 MB, 0.00 MB/s#012Interval WAL: 1354 writes, 222 syncs, 6.10 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.008 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.008 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) 
CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.1 0.01 0.00 1 0.008 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.3 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.3 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) 
Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.3 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 m Nov 28 03:08:40 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 3.15 scrub starts Nov 28 
03:08:40 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 3.15 scrub ok Nov 28 03:08:41 localhost python3[56390]: ansible-ansible.legacy.async_status Invoked with jid=478729855719.56112 mode=status _async_dir=/tmp/.ansible_async Nov 28 03:08:42 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 6.1b scrub starts Nov 28 03:08:42 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 6.1b scrub ok Nov 28 03:08:42 localhost python3[56406]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Nov 28 03:08:42 localhost python3[56422]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 03:08:43 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 6.1e deep-scrub starts Nov 28 03:08:43 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 6.1e deep-scrub ok Nov 28 03:08:43 localhost ceph-osd[31557]: osd.2 pg_epoch: 58 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=50/51 n=1 ec=44/33 lis/c=50/50 les/c/f=51/51/0 sis=58 pruub=13.559302330s) [1,5,3] r=-1 lpr=58 pi=[50,58)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1222.071411133s@ mbc={}] start_peering_interval up [3,4,2] -> [1,5,3], acting [3,4,2] -> [1,5,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:43 localhost ceph-osd[31557]: osd.2 pg_epoch: 58 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=50/51 n=1 ec=44/33 lis/c=50/50 les/c/f=51/51/0 sis=58 pruub=13.558609962s) [1,5,3] r=-1 lpr=58 pi=[50,58)/1 crt=36'39 mlcod 
0'0 unknown NOTIFY pruub 1222.071411133s@ mbc={}] state: transitioning to Stray Nov 28 03:08:43 localhost ceph-osd[31557]: osd.2 pg_epoch: 58 pg[7.7( v 36'39 (0'0,36'39] local-lis/les=50/51 n=1 ec=44/33 lis/c=50/50 les/c/f=51/51/0 sis=58 pruub=13.554260254s) [1,5,3] r=-1 lpr=58 pi=[50,58)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1222.067749023s@ mbc={}] start_peering_interval up [3,4,2] -> [1,5,3], acting [3,4,2] -> [1,5,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:43 localhost ceph-osd[31557]: osd.2 pg_epoch: 58 pg[7.7( v 36'39 (0'0,36'39] local-lis/les=50/51 n=1 ec=44/33 lis/c=50/50 les/c/f=51/51/0 sis=58 pruub=13.554097176s) [1,5,3] r=-1 lpr=58 pi=[50,58)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1222.067749023s@ mbc={}] state: transitioning to Stray Nov 28 03:08:43 localhost python3[56472]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:08:43 localhost python3[56490]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpcfhnpj94 recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Nov 28 03:08:44 localhost python3[56520]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None 
seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:08:44 localhost ceph-osd[32506]: osd.5 pg_epoch: 58 pg[7.f( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=50/50 les/c/f=51/51/0 sis=58) [1,5,3] r=1 lpr=58 pi=[50,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:44 localhost ceph-osd[32506]: osd.5 pg_epoch: 58 pg[7.7( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=50/50 les/c/f=51/51/0 sis=58) [1,5,3] r=1 lpr=58 pi=[50,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:45 localhost python3[56623]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None Nov 28 03:08:45 localhost ceph-osd[32506]: osd.5 pg_epoch: 60 pg[7.8( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=60 pruub=9.058769226s) [3,4,5] r=2 lpr=60 pi=[44,60)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1214.625488281s@ mbc={}] start_peering_interval up [1,5,3] -> [3,4,5], acting [1,5,3] -> [3,4,5], acting_primary 1 -> 3, up_primary 1 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:45 localhost ceph-osd[32506]: osd.5 pg_epoch: 60 pg[7.8( v 36'39 (0'0,36'39] local-lis/les=44/45 n=1 ec=44/33 lis/c=44/44 les/c/f=45/45/0 sis=60 pruub=9.058665276s) [3,4,5] r=2 lpr=60 pi=[44,60)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1214.625488281s@ mbc={}] state: transitioning to Stray Nov 28 03:08:45 localhost 
python3[56642]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:08:46 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 5.8 scrub starts Nov 28 03:08:46 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 5.8 scrub ok Nov 28 03:08:46 localhost python3[56674]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 03:08:47 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 6.17 scrub starts Nov 28 03:08:47 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 6.17 scrub ok Nov 28 03:08:47 localhost ceph-osd[31557]: osd.2 pg_epoch: 62 pg[7.9( v 36'39 (0'0,36'39] local-lis/les=46/47 n=1 ec=44/33 lis/c=46/46 les/c/f=47/47/0 sis=62 pruub=15.181772232s) [0,2,4] r=1 lpr=62 pi=[46,62)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1227.784179688s@ mbc={}] start_peering_interval up [4,2,3] -> [0,2,4], acting [4,2,3] -> [0,2,4], acting_primary 4 -> 0, up_primary 4 -> 0, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:47 localhost ceph-osd[31557]: osd.2 pg_epoch: 62 pg[7.9( v 36'39 (0'0,36'39] local-lis/les=46/47 n=1 ec=44/33 lis/c=46/46 les/c/f=47/47/0 sis=62 pruub=15.181417465s) [0,2,4] r=1 lpr=62 pi=[46,62)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1227.784179688s@ mbc={}] state: transitioning to Stray Nov 28 03:08:47 localhost python3[56724]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True 
get_attributes=True Nov 28 03:08:47 localhost python3[56742]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:08:48 localhost python3[56804]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:08:48 localhost python3[56822]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:08:48 localhost python3[56884]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:08:49 localhost python3[56902]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:08:49 localhost ceph-osd[31557]: osd.2 pg_epoch: 64 pg[7.a( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=48/48 les/c/f=49/49/0 sis=64) [2,0,4] r=0 lpr=64 pi=[48,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Nov 28 03:08:49 localhost ceph-osd[32506]: osd.5 pg_epoch: 64 pg[7.a( v 36'39 (0'0,36'39] local-lis/les=48/49 n=1 ec=44/33 lis/c=48/48 les/c/f=49/49/0 sis=64 pruub=15.188872337s) [2,0,4] r=-1 lpr=64 pi=[48,64)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1224.851440430s@ mbc={}] start_peering_interval up [3,5,1] -> [2,0,4], acting [3,5,1] -> [2,0,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:49 localhost ceph-osd[32506]: osd.5 pg_epoch: 64 pg[7.a( v 36'39 (0'0,36'39] local-lis/les=48/49 n=1 ec=44/33 lis/c=48/48 les/c/f=49/49/0 sis=64 pruub=15.188785553s) [2,0,4] r=-1 lpr=64 pi=[48,64)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1224.851440430s@ mbc={}] state: transitioning to Stray Nov 28 03:08:49 localhost python3[56964]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:08:50 localhost python3[56982]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None 
setype=None attributes=None Nov 28 03:08:50 localhost ceph-osd[31557]: osd.2 pg_epoch: 65 pg[7.a( v 36'39 (0'0,36'39] local-lis/les=64/65 n=1 ec=44/33 lis/c=48/48 les/c/f=49/49/0 sis=64) [2,0,4] r=0 lpr=64 pi=[48,64)/1 crt=36'39 lcod 0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Nov 28 03:08:50 localhost python3[57012]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 03:08:50 localhost systemd[1]: Reloading. Nov 28 03:08:50 localhost systemd-rc-local-generator[57035]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:08:50 localhost systemd-sysv-generator[57041]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:08:50 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 28 03:08:51 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 2.d scrub starts Nov 28 03:08:51 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 2.d scrub ok Nov 28 03:08:51 localhost ceph-osd[31557]: osd.2 pg_epoch: 66 pg[7.b( v 36'39 (0'0,36'39] local-lis/les=50/51 n=1 ec=44/33 lis/c=50/50 les/c/f=51/51/0 sis=66 pruub=13.378162384s) [3,1,2] r=2 lpr=66 pi=[50,66)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1230.071655273s@ mbc={}] start_peering_interval up [3,4,2] -> [3,1,2], acting [3,4,2] -> [3,1,2], acting_primary 3 -> 3, up_primary 3 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:51 localhost ceph-osd[31557]: osd.2 pg_epoch: 66 pg[7.b( v 36'39 (0'0,36'39] local-lis/les=50/51 n=1 ec=44/33 lis/c=50/50 les/c/f=51/51/0 sis=66 pruub=13.378087044s) [3,1,2] r=2 lpr=66 pi=[50,66)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1230.071655273s@ mbc={}] state: transitioning to Stray Nov 28 03:08:51 localhost python3[57097]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:08:51 localhost python3[57115]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:08:52 localhost python3[57177]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 
03:08:52 localhost python3[57195]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:08:53 localhost python3[57225]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 03:08:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:08:53 localhost systemd[1]: Reloading. Nov 28 03:08:53 localhost systemd-sysv-generator[57267]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:08:53 localhost systemd-rc-local-generator[57264]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 28 03:08:53 localhost podman[57227]: 2025-11-28 08:08:53.191597834 +0000 UTC m=+0.108058007 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.component=openstack-qdrouterd-container) Nov 28 03:08:53 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:08:53 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 4.1c scrub starts Nov 28 03:08:53 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 4.1c scrub ok Nov 28 03:08:53 localhost ceph-osd[31557]: osd.2 pg_epoch: 68 pg[7.c( v 36'39 (0'0,36'39] local-lis/les=52/53 n=1 ec=44/33 lis/c=52/52 les/c/f=53/53/0 sis=68 pruub=13.965437889s) [1,3,2] r=2 lpr=68 pi=[52,68)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1232.697998047s@ mbc={}] start_peering_interval up [0,1,2] -> [1,3,2], acting [0,1,2] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:53 localhost ceph-osd[31557]: osd.2 pg_epoch: 68 pg[7.c( v 36'39 (0'0,36'39] local-lis/les=52/53 n=1 ec=44/33 lis/c=52/52 les/c/f=53/53/0 sis=68 pruub=13.965130806s) [1,3,2] r=2 lpr=68 pi=[52,68)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1232.697998047s@ mbc={}] state: transitioning to Stray Nov 28 03:08:53 localhost systemd[1]: Starting Create netns directory... 
Nov 28 03:08:53 localhost podman[57227]: 2025-11-28 08:08:53.438571377 +0000 UTC m=+0.355031590 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, vcs-type=git, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, architecture=x86_64, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, 
vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, release=1761123044) Nov 28 03:08:53 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Nov 28 03:08:53 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Nov 28 03:08:53 localhost systemd[1]: Finished Create netns directory. Nov 28 03:08:53 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:08:53 localhost python3[57314]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Nov 28 03:08:55 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 4.8 scrub starts Nov 28 03:08:55 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 4.8 scrub ok Nov 28 03:08:55 localhost ceph-osd[31557]: osd.2 pg_epoch: 70 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=54/55 n=2 ec=44/33 lis/c=54/54 les/c/f=55/55/0 sis=70 pruub=10.763648033s) [1,3,5] r=-1 lpr=70 pi=[54,70)/1 crt=36'39 mlcod 0'0 active pruub 1231.589477539s@ mbc={255={}}] start_peering_interval up [2,0,4] -> [1,3,5], acting [2,0,4] -> [1,3,5], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:08:55 localhost ceph-osd[31557]: osd.2 pg_epoch: 70 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=54/55 n=2 
ec=44/33 lis/c=54/54 les/c/f=55/55/0 sis=70 pruub=10.763505936s) [1,3,5] r=-1 lpr=70 pi=[54,70)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1231.589477539s@ mbc={}] state: transitioning to Stray Nov 28 03:08:56 localhost python3[57372]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step2 config_dir=/var/lib/tripleo-config/container-startup-config/step_2 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Nov 28 03:08:56 localhost podman[57448]: 2025-11-28 08:08:56.417422853 +0000 UTC m=+0.082531379 container create ecffbe7cad180bba146604f75d423ebc7fbe7687684a13ecf44cbde356aeddeb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, container_name=nova_virtqemud_init_logs, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-libvirt, vcs-type=git, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': 
'1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, config_id=tripleo_step2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=) Nov 28 03:08:56 localhost podman[57449]: 2025-11-28 08:08:56.446815398 +0000 UTC m=+0.104110048 container create ecd8620dfff7bb7b28b391e9a1ddd7c8109888d70f83e22569879e1b5ac9f23a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute_init_log, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, vcs-type=git, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, config_id=tripleo_step2, com.redhat.component=openstack-nova-compute-container, release=1761123044, version=17.1.12, 
architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team) Nov 28 03:08:56 localhost systemd[1]: Started libpod-conmon-ecffbe7cad180bba146604f75d423ebc7fbe7687684a13ecf44cbde356aeddeb.scope. Nov 28 03:08:56 localhost podman[57448]: 2025-11-28 08:08:56.372694435 +0000 UTC m=+0.037803051 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 28 03:08:56 localhost systemd[1]: Started libpod-conmon-ecd8620dfff7bb7b28b391e9a1ddd7c8109888d70f83e22569879e1b5ac9f23a.scope. Nov 28 03:08:56 localhost podman[57449]: 2025-11-28 08:08:56.375874291 +0000 UTC m=+0.033168971 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Nov 28 03:08:56 localhost systemd[1]: Started libcrun container. Nov 28 03:08:56 localhost systemd[1]: Started libcrun container. 
Nov 28 03:08:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1223dc0c4e426b2bfe915961a21ef517e6d9229ac1981a406c76a3ee9d07ac7d/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff) Nov 28 03:08:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc38caeb419d32ecfc51d7c0645586de2c8ada37c45a7d9cd6676aef2fc0ca1f/merged/var/log/nova supports timestamps until 2038 (0x7fffffff) Nov 28 03:08:56 localhost podman[57449]: 2025-11-28 08:08:56.497434454 +0000 UTC m=+0.154729104 container init ecd8620dfff7bb7b28b391e9a1ddd7c8109888d70f83e22569879e1b5ac9f23a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, release=1761123044, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, container_name=nova_compute_init_log, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step2, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:08:56 localhost podman[57449]: 2025-11-28 08:08:56.504684373 +0000 UTC m=+0.161979013 container start ecd8620dfff7bb7b28b391e9a1ddd7c8109888d70f83e22569879e1b5ac9f23a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, container_name=nova_compute_init_log, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step2, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container) Nov 28 03:08:56 localhost python3[57372]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute_init_log --conmon-pidfile /run/nova_compute_init_log.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1764316155 --label config_id=tripleo_step2 --label container_name=nova_compute_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute_init_log.log --network none --privileged=False --user root --volume /var/log/containers/nova:/var/log/nova:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /bin/bash -c chown -R nova:nova /var/log/nova Nov 28 03:08:56 localhost systemd[1]: libpod-ecd8620dfff7bb7b28b391e9a1ddd7c8109888d70f83e22569879e1b5ac9f23a.scope: Deactivated successfully. 
Nov 28 03:08:56 localhost podman[57448]: 2025-11-28 08:08:56.547218774 +0000 UTC m=+0.212327320 container init ecffbe7cad180bba146604f75d423ebc7fbe7687684a13ecf44cbde356aeddeb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, tcib_managed=true, name=rhosp17/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, config_id=tripleo_step2, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=nova_virtqemud_init_logs, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-type=git, version=17.1.12, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt) Nov 28 03:08:56 localhost podman[57485]: 
2025-11-28 08:08:56.563770633 +0000 UTC m=+0.041260114 container died ecd8620dfff7bb7b28b391e9a1ddd7c8109888d70f83e22569879e1b5ac9f23a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step2, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, container_name=nova_compute_init_log) Nov 28 03:08:56 localhost systemd[1]: libpod-ecffbe7cad180bba146604f75d423ebc7fbe7687684a13ecf44cbde356aeddeb.scope: Deactivated successfully. 
Nov 28 03:08:56 localhost podman[57485]: 2025-11-28 08:08:56.594125998 +0000 UTC m=+0.071615439 container cleanup ecd8620dfff7bb7b28b391e9a1ddd7c8109888d70f83e22569879e1b5ac9f23a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20251118.1, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, name=rhosp17/openstack-nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, container_name=nova_compute_init_log, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.12, managed_by=tripleo_ansible, config_id=tripleo_step2, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:08:56 localhost systemd[1]: libpod-conmon-ecd8620dfff7bb7b28b391e9a1ddd7c8109888d70f83e22569879e1b5ac9f23a.scope: Deactivated successfully. 
Nov 28 03:08:56 localhost podman[57448]: 2025-11-28 08:08:56.606567003 +0000 UTC m=+0.271675519 container start ecffbe7cad180bba146604f75d423ebc7fbe7687684a13ecf44cbde356aeddeb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.12, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, config_id=tripleo_step2, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, container_name=nova_virtqemud_init_logs, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, release=1761123044, build-date=2025-11-19T00:35:22Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt) Nov 28 03:08:56 localhost 
python3[57372]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud_init_logs --conmon-pidfile /run/nova_virtqemud_init_logs.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1764316155 --label config_id=tripleo_step2 --label container_name=nova_virtqemud_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud_init_logs.log --network none --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --user root --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /bin/bash -c chown -R tss:tss /var/log/swtpm Nov 28 03:08:56 localhost ceph-osd[32506]: osd.5 pg_epoch: 70 pg[7.d( empty local-lis/les=0/0 n=0 ec=44/33 lis/c=54/54 les/c/f=55/55/0 sis=70) [1,3,5] r=2 lpr=70 pi=[54,70)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Nov 28 03:08:56 localhost podman[57507]: 2025-11-28 08:08:56.64264168 +0000 UTC m=+0.064138004 container died ecffbe7cad180bba146604f75d423ebc7fbe7687684a13ecf44cbde356aeddeb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, release=1761123044, konflux.additional-tags=17.1.12 
17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_virtqemud_init_logs, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step2, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z) Nov 28 03:08:56 localhost podman[57507]: 2025-11-28 08:08:56.766155053 +0000 UTC m=+0.187651347 container cleanup ecffbe7cad180bba146604f75d423ebc7fbe7687684a13ecf44cbde356aeddeb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, name=rhosp17/openstack-nova-libvirt, 
com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step2, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, maintainer=OpenStack TripleO Team, container_name=nova_virtqemud_init_logs) Nov 28 03:08:56 localhost systemd[1]: libpod-conmon-ecffbe7cad180bba146604f75d423ebc7fbe7687684a13ecf44cbde356aeddeb.scope: Deactivated successfully. 
Nov 28 03:08:57 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 4.9 scrub starts Nov 28 03:08:57 localhost podman[57630]: 2025-11-28 08:08:57.165708354 +0000 UTC m=+0.087931580 container create ccbff20b3845342a68374757d2d5f6d53ef10d4fa6de90e093768667982a7531 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, architecture=x86_64, config_id=tripleo_step2, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, container_name=create_haproxy_wrapper, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., distribution-scope=public) Nov 28 03:08:57 localhost systemd[1]: Started libpod-conmon-ccbff20b3845342a68374757d2d5f6d53ef10d4fa6de90e093768667982a7531.scope. Nov 28 03:08:57 localhost systemd[1]: Started libcrun container. Nov 28 03:08:57 localhost podman[57629]: 2025-11-28 08:08:57.112872902 +0000 UTC m=+0.043504712 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 28 03:08:57 localhost podman[57630]: 2025-11-28 08:08:57.115750259 +0000 UTC m=+0.037973545 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Nov 28 03:08:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ffc9017578f246ce8b4d7e2db3f29d809acca5f23dc5c91669a411567041b2c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 03:08:57 localhost podman[57630]: 2025-11-28 08:08:57.224342372 +0000 UTC m=+0.146565638 container init ccbff20b3845342a68374757d2d5f6d53ef10d4fa6de90e093768667982a7531 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, release=1761123044, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, container_name=create_haproxy_wrapper, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, url=https://www.redhat.com, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true) Nov 28 03:08:57 localhost podman[57630]: 2025-11-28 08:08:57.233809507 +0000 UTC m=+0.156032743 container start ccbff20b3845342a68374757d2d5f6d53ef10d4fa6de90e093768667982a7531 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=create_haproxy_wrapper, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, config_id=tripleo_step2, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:08:57 localhost podman[57630]: 2025-11-28 08:08:57.234224769 +0000 UTC m=+0.156448025 container attach ccbff20b3845342a68374757d2d5f6d53ef10d4fa6de90e093768667982a7531 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step2, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, container_name=create_haproxy_wrapper, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, tcib_managed=true, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}) Nov 28 03:08:57 localhost podman[57629]: 2025-11-28 08:08:57.275614657 +0000 UTC m=+0.206246417 container create 685c3d45d3785e0e616411ce008e8b79b4b543eac2375733e152070c7aaf3d71 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, 
batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=create_virtlogd_wrapper, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, version=17.1.12, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, config_id=tripleo_step2) Nov 28 03:08:57 localhost systemd[1]: Started libpod-conmon-685c3d45d3785e0e616411ce008e8b79b4b543eac2375733e152070c7aaf3d71.scope. Nov 28 03:08:57 localhost systemd[1]: Started libcrun container. Nov 28 03:08:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e44c8d0954aa6a829c62ac9efb238e93c579afb851dd0e28c7fd1cbc63565274/merged/var/lib/container-config-scripts supports timestamps until 2038 (0x7fffffff) Nov 28 03:08:57 localhost podman[57629]: 2025-11-28 08:08:57.335870433 +0000 UTC m=+0.266502223 container init 685c3d45d3785e0e616411ce008e8b79b4b543eac2375733e152070c7aaf3d71 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=create_virtlogd_wrapper, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, config_id=tripleo_step2, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include 
::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z) Nov 28 03:08:57 localhost podman[57629]: 2025-11-28 08:08:57.344129722 +0000 UTC m=+0.274761502 container start 685c3d45d3785e0e616411ce008e8b79b4b543eac2375733e152070c7aaf3d71 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 
'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step2, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=create_virtlogd_wrapper, name=rhosp17/openstack-nova-libvirt, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:35:22Z) Nov 28 03:08:57 localhost podman[57629]: 2025-11-28 08:08:57.34440121 +0000 UTC m=+0.275033030 container attach 685c3d45d3785e0e616411ce008e8b79b4b543eac2375733e152070c7aaf3d71 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, container_name=create_virtlogd_wrapper, build-date=2025-11-19T00:35:22Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, release=1761123044, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_id=tripleo_step2, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com) Nov 28 03:08:57 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 4.9 scrub ok Nov 28 03:08:57 localhost systemd[1]: var-lib-containers-storage-overlay-cc38caeb419d32ecfc51d7c0645586de2c8ada37c45a7d9cd6676aef2fc0ca1f-merged.mount: Deactivated successfully. Nov 28 03:08:57 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ecd8620dfff7bb7b28b391e9a1ddd7c8109888d70f83e22569879e1b5ac9f23a-userdata-shm.mount: Deactivated successfully. Nov 28 03:08:57 localhost systemd[1]: var-lib-containers-storage-overlay-1223dc0c4e426b2bfe915961a21ef517e6d9229ac1981a406c76a3ee9d07ac7d-merged.mount: Deactivated successfully. Nov 28 03:08:57 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ecffbe7cad180bba146604f75d423ebc7fbe7687684a13ecf44cbde356aeddeb-userdata-shm.mount: Deactivated successfully. Nov 28 03:08:58 localhost ovs-vsctl[57756]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory) Nov 28 03:08:59 localhost systemd[1]: libpod-685c3d45d3785e0e616411ce008e8b79b4b543eac2375733e152070c7aaf3d71.scope: Deactivated successfully. 
Nov 28 03:08:59 localhost systemd[1]: libpod-685c3d45d3785e0e616411ce008e8b79b4b543eac2375733e152070c7aaf3d71.scope: Consumed 2.016s CPU time. Nov 28 03:08:59 localhost podman[57629]: 2025-11-28 08:08:59.353108928 +0000 UTC m=+2.283740708 container died 685c3d45d3785e0e616411ce008e8b79b4b543eac2375733e152070c7aaf3d71 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, url=https://www.redhat.com, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_id=tripleo_step2, tcib_managed=true, container_name=create_virtlogd_wrapper, architecture=x86_64, io.openshift.expose-services=, version=17.1.12, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, batch=17.1_20251118.1, distribution-scope=public) Nov 28 03:08:59 localhost systemd[1]: tmp-crun.rg4dLc.mount: Deactivated successfully. Nov 28 03:08:59 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-685c3d45d3785e0e616411ce008e8b79b4b543eac2375733e152070c7aaf3d71-userdata-shm.mount: Deactivated successfully. Nov 28 03:08:59 localhost systemd[1]: var-lib-containers-storage-overlay-e44c8d0954aa6a829c62ac9efb238e93c579afb851dd0e28c7fd1cbc63565274-merged.mount: Deactivated successfully. 
Nov 28 03:08:59 localhost podman[57883]: 2025-11-28 08:08:59.451138672 +0000 UTC m=+0.088473698 container cleanup 685c3d45d3785e0e616411ce008e8b79b4b543eac2375733e152070c7aaf3d71 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, 
container_name=create_virtlogd_wrapper, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, version=17.1.12, url=https://www.redhat.com, build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, architecture=x86_64, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., config_id=tripleo_step2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:08:59 localhost systemd[1]: libpod-conmon-685c3d45d3785e0e616411ce008e8b79b4b543eac2375733e152070c7aaf3d71.scope: Deactivated successfully. Nov 28 03:08:59 localhost python3[57372]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/create_virtlogd_wrapper.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1764316155 --label config_id=tripleo_step2 --label container_name=create_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_virtlogd_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::nova::virtlogd_wrapper Nov 28 03:09:00 localhost systemd[1]: libpod-ccbff20b3845342a68374757d2d5f6d53ef10d4fa6de90e093768667982a7531.scope: Deactivated successfully. Nov 28 03:09:00 localhost systemd[1]: libpod-ccbff20b3845342a68374757d2d5f6d53ef10d4fa6de90e093768667982a7531.scope: Consumed 2.132s CPU time. 
Nov 28 03:09:00 localhost podman[57630]: 2025-11-28 08:09:00.274673462 +0000 UTC m=+3.196896698 container died ccbff20b3845342a68374757d2d5f6d53ef10d4fa6de90e093768667982a7531 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step2, io.openshift.expose-services=, container_name=create_haproxy_wrapper, maintainer=OpenStack TripleO Team, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', 
'/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, managed_by=tripleo_ansible) Nov 28 03:09:00 localhost podman[57923]: 2025-11-28 08:09:00.347629291 +0000 UTC m=+0.063980260 container cleanup ccbff20b3845342a68374757d2d5f6d53ef10d4fa6de90e093768667982a7531 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, config_id=tripleo_step2, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, distribution-scope=public, release=1761123044, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, container_name=create_haproxy_wrapper, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, 
build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:09:00 localhost systemd[1]: libpod-conmon-ccbff20b3845342a68374757d2d5f6d53ef10d4fa6de90e093768667982a7531.scope: Deactivated successfully. 
Nov 28 03:09:00 localhost python3[57372]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_haproxy_wrapper --conmon-pidfile /run/create_haproxy_wrapper.pid --detach=False --label config_id=tripleo_step2 --label container_name=create_haproxy_wrapper --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_haproxy_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume 
/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers Nov 28 03:09:00 localhost systemd[1]: var-lib-containers-storage-overlay-0ffc9017578f246ce8b4d7e2db3f29d809acca5f23dc5c91669a411567041b2c-merged.mount: Deactivated successfully. Nov 28 03:09:00 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ccbff20b3845342a68374757d2d5f6d53ef10d4fa6de90e093768667982a7531-userdata-shm.mount: Deactivated successfully. Nov 28 03:09:01 localhost python3[57978]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks2.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:09:02 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 4.14 scrub starts Nov 28 03:09:02 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 4.14 scrub ok Nov 28 03:09:02 localhost python3[58099]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks2.json short_hostname=np0005538513 step=2 update_config_hash_only=False Nov 28 03:09:02 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 4.3 scrub starts Nov 28 03:09:02 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 
4.3 scrub ok Nov 28 03:09:02 localhost python3[58115]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:09:03 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 4.15 scrub starts Nov 28 03:09:03 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 4.15 scrub ok Nov 28 03:09:03 localhost python3[58131]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_2 config_pattern=container-puppet-*.json config_overrides={} debug=True Nov 28 03:09:04 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 2.12 scrub starts Nov 28 03:09:04 localhost ceph-osd[32506]: osd.5 pg_epoch: 72 pg[7.e( v 36'39 (0'0,36'39] local-lis/les=56/57 n=1 ec=44/33 lis/c=56/56 les/c/f=57/57/0 sis=72 pruub=12.112273216s) [3,5,1] r=1 lpr=72 pi=[56,72)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1236.578369141s@ mbc={}] start_peering_interval up [0,4,5] -> [3,5,1], acting [0,4,5] -> [3,5,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:09:04 localhost ceph-osd[32506]: osd.5 pg_epoch: 72 pg[7.e( v 36'39 (0'0,36'39] local-lis/les=56/57 n=1 ec=44/33 lis/c=56/56 les/c/f=57/57/0 sis=72 pruub=12.112176895s) [3,5,1] r=1 lpr=72 pi=[56,72)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1236.578369141s@ mbc={}] state: transitioning to Stray Nov 28 03:09:04 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 2.12 scrub ok Nov 28 03:09:05 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 4.1f scrub starts Nov 28 03:09:05 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] 
: 4.1f scrub ok Nov 28 03:09:05 localhost ceph-osd[32506]: osd.5 pg_epoch: 74 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=58/59 n=1 ec=44/33 lis/c=58/58 les/c/f=59/59/0 sis=74 pruub=10.864244461s) [0,5,1] r=1 lpr=74 pi=[58,74)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1236.879638672s@ mbc={}] start_peering_interval up [1,5,3] -> [0,5,1], acting [1,5,3] -> [0,5,1], acting_primary 1 -> 0, up_primary 1 -> 0, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Nov 28 03:09:05 localhost ceph-osd[32506]: osd.5 pg_epoch: 74 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=58/59 n=1 ec=44/33 lis/c=58/58 les/c/f=59/59/0 sis=74 pruub=10.864125252s) [0,5,1] r=1 lpr=74 pi=[58,74)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1236.879638672s@ mbc={}] state: transitioning to Stray Nov 28 03:09:06 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.1d scrub starts Nov 28 03:09:06 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.1d scrub ok Nov 28 03:09:08 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.9 scrub starts Nov 28 03:09:08 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.9 scrub ok Nov 28 03:09:10 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 2.15 scrub starts Nov 28 03:09:10 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 2.15 scrub ok Nov 28 03:09:11 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 5.b scrub starts Nov 28 03:09:11 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 5.b scrub ok Nov 28 03:09:12 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.1c scrub starts Nov 28 03:09:12 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.1c scrub ok Nov 28 03:09:13 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.5 scrub starts Nov 28 03:09:13 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.5 scrub ok Nov 28 03:09:14 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 5.12 scrub starts Nov 28 
03:09:14 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 5.12 scrub ok Nov 28 03:09:16 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.a scrub starts Nov 28 03:09:16 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.a scrub ok Nov 28 03:09:19 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.c scrub starts Nov 28 03:09:19 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.c scrub ok Nov 28 03:09:20 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 5.13 scrub starts Nov 28 03:09:20 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 5.13 scrub ok Nov 28 03:09:22 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 2.18 scrub starts Nov 28 03:09:22 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 2.18 scrub ok Nov 28 03:09:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:09:23 localhost podman[58132]: 2025-11-28 08:09:23.851616382 +0000 UTC m=+0.084286301 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, distribution-scope=public, container_name=metrics_qdr, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:09:24 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 6.12 scrub starts Nov 28 03:09:24 localhost podman[58132]: 2025-11-28 08:09:24.060560209 +0000 UTC m=+0.293230108 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, 
distribution-scope=public, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, version=17.1.12, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Nov 28 03:09:24 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 6.12 scrub ok Nov 28 03:09:24 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:09:24 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 4.2 scrub starts Nov 28 03:09:24 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 4.2 scrub ok Nov 28 03:09:25 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 4.1d scrub starts Nov 28 03:09:26 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.13 scrub starts Nov 28 03:09:26 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.13 scrub ok Nov 28 03:09:27 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.f scrub starts Nov 28 03:09:27 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.f scrub ok Nov 28 03:09:28 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 4.1 scrub starts Nov 28 03:09:28 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 4.1 scrub ok Nov 28 03:09:29 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.10 scrub starts Nov 28 03:09:29 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 2.10 scrub ok Nov 28 03:09:30 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.11 scrub starts Nov 28 03:09:30 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.11 scrub ok Nov 28 03:09:30 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 7.5 scrub starts Nov 28 03:09:30 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 7.5 scrub ok Nov 28 03:09:31 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 7.a 
deep-scrub starts Nov 28 03:09:31 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 7.a deep-scrub ok Nov 28 03:09:32 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 3.8 scrub starts Nov 28 03:09:32 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 3.8 scrub ok Nov 28 03:09:33 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.1d scrub starts Nov 28 03:09:33 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.1d scrub ok Nov 28 03:09:35 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.1a scrub starts Nov 28 03:09:35 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 3.1a scrub ok Nov 28 03:09:36 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 4.1d scrub starts Nov 28 03:09:36 localhost ceph-osd[31557]: log_channel(cluster) log [DBG] : 4.1d scrub ok Nov 28 03:09:38 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 5.4 scrub starts Nov 28 03:09:38 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 5.4 scrub ok Nov 28 03:09:41 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 2.b scrub starts Nov 28 03:09:41 localhost ceph-osd[32506]: log_channel(cluster) log [DBG] : 2.b scrub ok Nov 28 03:09:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:09:54 localhost podman[58238]: 2025-11-28 08:09:54.8428472 +0000 UTC m=+0.080390804 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, batch=17.1_20251118.1, container_name=metrics_qdr, 
name=rhosp17/openstack-qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, release=1761123044, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 28 03:09:55 localhost podman[58238]: 2025-11-28 08:09:55.075494622 +0000 UTC m=+0.313038276 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, vcs-type=git, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1) Nov 28 03:09:55 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:10:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:10:25 localhost podman[58267]: 2025-11-28 08:10:25.831164212 +0000 UTC m=+0.071056002 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git) Nov 28 03:10:26 localhost podman[58267]: 2025-11-28 08:10:26.025054267 +0000 UTC m=+0.264946017 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, container_name=metrics_qdr, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc.) Nov 28 03:10:26 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:10:42 localhost systemd[1]: tmp-crun.k5fw0k.mount: Deactivated successfully. 
Nov 28 03:10:42 localhost podman[58396]: 2025-11-28 08:10:42.130077347 +0000 UTC m=+0.097372170 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, description=Red Hat Ceph Storage 7, name=rhceph, GIT_BRANCH=main, distribution-scope=public, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, architecture=x86_64, release=553, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, RELEASE=main, ceph=True, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, io.openshift.expose-services=) Nov 28 03:10:42 localhost podman[58396]: 2025-11-28 08:10:42.242461058 +0000 UTC m=+0.209755921 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, architecture=x86_64, ceph=True, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, RELEASE=main, release=553, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_CLEAN=True, CEPH_POINT_RELEASE=, name=rhceph, vendor=Red Hat, Inc., GIT_BRANCH=main, build-date=2025-09-24T08:57:55, distribution-scope=public) Nov 28 03:10:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:10:56 localhost podman[58542]: 2025-11-28 08:10:56.839001578 +0000 UTC m=+0.077449695 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, container_name=metrics_qdr, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Nov 28 03:10:57 localhost podman[58542]: 2025-11-28 08:10:57.029214463 +0000 UTC m=+0.267662580 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
managed_by=tripleo_ansible, container_name=metrics_qdr, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:10:57 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:11:03 localhost sshd[58571]: main: sshd: ssh-rsa algorithm is disabled Nov 28 03:11:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:11:27 localhost podman[58573]: 2025-11-28 08:11:27.845790017 +0000 UTC m=+0.079321142 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step1, url=https://www.redhat.com, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 28 03:11:28 localhost podman[58573]: 2025-11-28 08:11:28.05943892 +0000 UTC m=+0.292970085 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1) Nov 28 03:11:28 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:11:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:11:58 localhost podman[58680]: 2025-11-28 08:11:58.843821895 +0000 UTC m=+0.086025084 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, managed_by=tripleo_ansible) Nov 28 03:11:59 localhost podman[58680]: 2025-11-28 08:11:59.033789312 +0000 UTC m=+0.275992551 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git) Nov 28 03:11:59 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:12:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:12:29 localhost podman[58707]: 2025-11-28 08:12:29.843325468 +0000 UTC m=+0.079154438 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, container_name=metrics_qdr, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, 
tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044) Nov 28 03:12:30 localhost podman[58707]: 2025-11-28 08:12:30.016936732 +0000 UTC m=+0.252765712 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, container_name=metrics_qdr) Nov 28 03:12:30 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:13:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:13:00 localhost systemd[1]: tmp-crun.9wsZIe.mount: Deactivated successfully. 
Nov 28 03:13:00 localhost podman[58813]: 2025-11-28 08:13:00.848829917 +0000 UTC m=+0.079963962 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, batch=17.1_20251118.1, 
url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-qdrouterd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z) Nov 28 03:13:01 localhost podman[58813]: 2025-11-28 08:13:01.062631506 +0000 UTC m=+0.293765531 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible) Nov 28 03:13:01 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:13:17 localhost sshd[58845]: main: sshd: ssh-rsa algorithm is disabled Nov 28 03:13:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:13:31 localhost systemd[1]: tmp-crun.aRlRUu.mount: Deactivated successfully. 
Nov 28 03:13:31 localhost podman[58847]: 2025-11-28 08:13:31.856500614 +0000 UTC m=+0.091045058 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, build-date=2025-11-18T22:49:46Z, release=1761123044, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.buildah.version=1.41.4) Nov 28 03:13:32 localhost podman[58847]: 2025-11-28 08:13:32.075971074 +0000 UTC m=+0.310515508 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git) Nov 28 03:13:32 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. 
Nov 28 03:13:33 localhost python3[58924]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:13:34 localhost python3[58969]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317613.451003-99573-86393723704210/source _original_basename=tmpok_1_q9e follow=False checksum=62439dd24dde40c90e7a39f6a1b31cc6061fe59b backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:13:35 localhost python3[58999]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 03:13:37 localhost ansible-async_wrapper.py[59171]: Invoked with 906942319155 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317616.509703-99749-159616059690828/AnsiballZ_command.py _ Nov 28 03:13:37 localhost ansible-async_wrapper.py[59174]: Starting module and watcher Nov 28 03:13:37 localhost ansible-async_wrapper.py[59174]: Start watching 59175 (3600) Nov 28 03:13:37 localhost ansible-async_wrapper.py[59175]: Start module (59175) Nov 28 03:13:37 localhost ansible-async_wrapper.py[59171]: Return async_wrapper task started. Nov 28 03:13:37 localhost python3[59195]: ansible-ansible.legacy.async_status Invoked with jid=906942319155.59171 mode=status _async_dir=/tmp/.ansible_async Nov 28 03:13:40 localhost puppet-user[59194]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. 
It should be converted to version 5 Nov 28 03:13:40 localhost puppet-user[59194]: (file: /etc/puppet/hiera.yaml) Nov 28 03:13:40 localhost puppet-user[59194]: Warning: Undefined variable '::deploy_config_name'; Nov 28 03:13:40 localhost puppet-user[59194]: (file & line not available) Nov 28 03:13:40 localhost puppet-user[59194]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 28 03:13:40 localhost puppet-user[59194]: (file & line not available) Nov 28 03:13:40 localhost puppet-user[59194]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Nov 28 03:13:40 localhost puppet-user[59194]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Nov 28 03:13:40 localhost puppet-user[59194]: Notice: Compiled catalog for np0005538513.localdomain in environment production in 0.11 seconds Nov 28 03:13:40 localhost puppet-user[59194]: Notice: Applied catalog in 0.04 seconds Nov 28 03:13:40 localhost puppet-user[59194]: Application: Nov 28 03:13:40 localhost puppet-user[59194]: Initial environment: production Nov 28 03:13:40 localhost puppet-user[59194]: Converged environment: production Nov 28 03:13:40 localhost puppet-user[59194]: Run mode: user Nov 28 03:13:40 localhost puppet-user[59194]: Changes: Nov 28 03:13:40 localhost puppet-user[59194]: Events: Nov 28 03:13:40 localhost puppet-user[59194]: Resources: Nov 28 03:13:40 localhost puppet-user[59194]: Total: 10 Nov 28 03:13:40 localhost puppet-user[59194]: Time: Nov 28 03:13:40 localhost puppet-user[59194]: Schedule: 0.00 Nov 28 03:13:40 localhost puppet-user[59194]: File: 0.00 Nov 28 03:13:40 localhost puppet-user[59194]: Exec: 0.01 Nov 28 03:13:40 localhost puppet-user[59194]: Augeas: 0.01 Nov 28 03:13:40 localhost puppet-user[59194]: Transaction evaluation: 
0.03 Nov 28 03:13:40 localhost puppet-user[59194]: Catalog application: 0.04 Nov 28 03:13:40 localhost puppet-user[59194]: Config retrieval: 0.15 Nov 28 03:13:40 localhost puppet-user[59194]: Last run: 1764317620 Nov 28 03:13:40 localhost puppet-user[59194]: Filebucket: 0.00 Nov 28 03:13:40 localhost puppet-user[59194]: Total: 0.04 Nov 28 03:13:40 localhost puppet-user[59194]: Version: Nov 28 03:13:40 localhost puppet-user[59194]: Config: 1764317620 Nov 28 03:13:40 localhost puppet-user[59194]: Puppet: 7.10.0 Nov 28 03:13:40 localhost ansible-async_wrapper.py[59175]: Module complete (59175) Nov 28 03:13:42 localhost ansible-async_wrapper.py[59174]: Done in kid B. Nov 28 03:13:47 localhost python3[59371]: ansible-ansible.legacy.async_status Invoked with jid=906942319155.59171 mode=status _async_dir=/tmp/.ansible_async Nov 28 03:13:48 localhost python3[59414]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Nov 28 03:13:48 localhost python3[59430]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 03:13:49 localhost python3[59480]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:13:49 localhost python3[59498]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmp8xnmvue_ recurse=False state=file 
path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Nov 28 03:13:49 localhost python3[59528]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:13:51 localhost python3[59631]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None Nov 28 03:13:51 localhost python3[59650]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:13:52 localhost python3[59682]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 
Nov 28 03:13:53 localhost python3[59732]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:13:54 localhost python3[59750]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:13:54 localhost python3[59812]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:13:55 localhost python3[59830]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:13:55 localhost python3[59892]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:13:55 localhost python3[59910]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service 
_original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:13:56 localhost python3[59972]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:13:56 localhost python3[59990]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:13:57 localhost python3[60020]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 03:13:57 localhost systemd[1]: Reloading. Nov 28 03:13:57 localhost systemd-rc-local-generator[60038]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:13:57 localhost systemd-sysv-generator[60046]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 28 03:13:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:13:57 localhost python3[60106]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:13:58 localhost python3[60124]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:13:58 localhost python3[60186]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:13:59 localhost python3[60204]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:13:59 localhost python3[60234]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None 
masked=None Nov 28 03:13:59 localhost systemd[1]: Reloading. Nov 28 03:13:59 localhost systemd-rc-local-generator[60258]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:13:59 localhost systemd-sysv-generator[60262]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:13:59 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:13:59 localhost systemd[1]: Starting Create netns directory... Nov 28 03:13:59 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Nov 28 03:13:59 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Nov 28 03:13:59 localhost systemd[1]: Finished Create netns directory. Nov 28 03:14:00 localhost python3[60291]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Nov 28 03:14:02 localhost python3[60350]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step3 config_dir=/var/lib/tripleo-config/container-startup-config/step_3 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Nov 28 03:14:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:14:02 localhost podman[60379]: 2025-11-28 08:14:02.43925302 +0000 UTC m=+0.131910227 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.openshift.expose-services=, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:14:02 localhost podman[60526]: 2025-11-28 08:14:02.552186627 +0000 UTC m=+0.067797203 container create d527d6e05b826bf85108ac755f1e40f44588049a9c22eb8644f76dede82580c1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, config_id=tripleo_step3, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, release=1761123044, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155', '__OS_DEBUG': 'true'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_statedir_owner, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:14:02 localhost podman[60548]: 2025-11-28 08:14:02.583079977 +0000 UTC m=+0.076139330 container create 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, container_name=collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container) Nov 28 03:14:02 localhost systemd[1]: Started libpod-conmon-d527d6e05b826bf85108ac755f1e40f44588049a9c22eb8644f76dede82580c1.scope. 
Nov 28 03:14:02 localhost podman[60552]: 2025-11-28 08:14:02.597506989 +0000 UTC m=+0.086307165 container create 8e7c684118204ac89680ace1a0889960b1955fa40b4d50e655f226c9c1f115be (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step3, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:35:22Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, container_name=nova_virtlogd_wrapper, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044) Nov 28 03:14:02 localhost systemd[1]: Started libcrun container. 
Nov 28 03:14:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/171c9fab2d92ad957aca0c5cf0ee4736f8cd2abe2dd09f25e9d89f04a4353dd2/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/171c9fab2d92ad957aca0c5cf0ee4736f8cd2abe2dd09f25e9d89f04a4353dd2/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/171c9fab2d92ad957aca0c5cf0ee4736f8cd2abe2dd09f25e9d89f04a4353dd2/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:02 localhost systemd[1]: Started libpod-conmon-2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.scope. Nov 28 03:14:02 localhost podman[60563]: 2025-11-28 08:14:02.618301555 +0000 UTC m=+0.098036391 container create 33a8f7b4baabd0da2b2d44596de4ba044dd80b4c1cbf36f390043c4e7ddd116e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, container_name=ceilometer_init_log, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step3) Nov 28 03:14:02 localhost podman[60526]: 2025-11-28 08:14:02.523364414 +0000 UTC m=+0.038975030 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Nov 28 03:14:02 localhost podman[60526]: 2025-11-28 08:14:02.625538596 +0000 UTC m=+0.141149182 container init d527d6e05b826bf85108ac755f1e40f44588049a9c22eb8644f76dede82580c1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_statedir_owner, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step3, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 03:14:02 localhost systemd[1]: Started libcrun container. 
Nov 28 03:14:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/483382bf68693c05a65aded8bd4df683c0e0d8870bcc7441be08a396c5476266/merged/scripts supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/483382bf68693c05a65aded8bd4df683c0e0d8870bcc7441be08a396c5476266/merged/var/log/collectd supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:02 localhost podman[60564]: 2025-11-28 08:14:02.63159123 +0000 UTC m=+0.105755588 container create 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, architecture=x86_64, 
io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, com.redhat.component=openstack-rsyslog-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044) Nov 28 03:14:02 localhost podman[60548]: 2025-11-28 08:14:02.544784789 +0000 UTC m=+0.037844142 image pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Nov 28 03:14:02 localhost podman[60552]: 2025-11-28 08:14:02.543924602 +0000 UTC m=+0.032724798 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 28 03:14:02 localhost podman[60563]: 2025-11-28 08:14:02.551242157 +0000 UTC m=+0.030976983 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 Nov 28 03:14:02 localhost podman[60564]: 2025-11-28 08:14:02.574689598 +0000 UTC m=+0.048853976 image pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Nov 28 03:14:02 localhost systemd[1]: Started libpod-conmon-8e7c684118204ac89680ace1a0889960b1955fa40b4d50e655f226c9c1f115be.scope. Nov 28 03:14:02 localhost systemd[1]: Started libpod-conmon-9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f.scope. 
Nov 28 03:14:02 localhost systemd[1]: Started libpod-conmon-33a8f7b4baabd0da2b2d44596de4ba044dd80b4c1cbf36f390043c4e7ddd116e.scope. Nov 28 03:14:02 localhost systemd[1]: Started libcrun container. Nov 28 03:14:02 localhost systemd[1]: libpod-d527d6e05b826bf85108ac755f1e40f44588049a9c22eb8644f76dede82580c1.scope: Deactivated successfully. Nov 28 03:14:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/870580852a6f869c68321019e7d77d0da890e64d97007d93989d967a33c02183/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/870580852a6f869c68321019e7d77d0da890e64d97007d93989d967a33c02183/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/870580852a6f869c68321019e7d77d0da890e64d97007d93989d967a33c02183/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/870580852a6f869c68321019e7d77d0da890e64d97007d93989d967a33c02183/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:02 localhost systemd[1]: Started libcrun container. 
Nov 28 03:14:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/870580852a6f869c68321019e7d77d0da890e64d97007d93989d967a33c02183/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/870580852a6f869c68321019e7d77d0da890e64d97007d93989d967a33c02183/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/870580852a6f869c68321019e7d77d0da890e64d97007d93989d967a33c02183/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78b48dbff0a2fe76e018f9048f1970d44652db7588437c79ac71691eb45c0ad0/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:02 localhost systemd[1]: Started libcrun container. 
Nov 28 03:14:02 localhost podman[60379]: 2025-11-28 08:14:02.704403493 +0000 UTC m=+0.397060700 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:14:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:14:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f82224ba72e48969d33ced9a8f63115983b371ba24f399dbf64e0b6c5b984d17/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:02 localhost podman[60548]: 2025-11-28 08:14:02.70682296 +0000 UTC m=+0.199882313 container init 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, name=rhosp17/openstack-collectd, version=17.1.12, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, container_name=collectd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc.) 
Nov 28 03:14:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f82224ba72e48969d33ced9a8f63115983b371ba24f399dbf64e0b6c5b984d17/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:02 localhost podman[60564]: 2025-11-28 08:14:02.714174716 +0000 UTC m=+0.188339084 container init 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-rsyslog-container, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, container_name=rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, summary=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, build-date=2025-11-18T22:49:49Z, name=rhosp17/openstack-rsyslog, config_id=tripleo_step3) Nov 28 03:14:02 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. 
Nov 28 03:14:02 localhost podman[60564]: 2025-11-28 08:14:02.72117744 +0000 UTC m=+0.195341808 container start 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-type=git, name=rhosp17/openstack-rsyslog, tcib_managed=true, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_id=tripleo_step3, com.redhat.component=openstack-rsyslog-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:49Z, container_name=rsyslog, managed_by=tripleo_ansible, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 rsyslog) Nov 28 03:14:02 localhost python3[60350]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name rsyslog --conmon-pidfile /run/rsyslog.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=138ccb6252fd89d73a6c37a3f993f3eb --label config_id=tripleo_step3 --label container_name=rsyslog --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', 
'/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/rsyslog.log --network host --privileged=True --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:ro --volume /var/log/containers/rsyslog:/var/log/rsyslog:rw,z --volume /var/log:/var/log/host:ro --volume /var/lib/rsyslog.container:/var/lib/rsyslog:rw,z registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Nov 28 03:14:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:14:02 localhost systemd-logind[764]: Existing logind session ID 28 used by new audit session, ignoring. 
Nov 28 03:14:02 localhost podman[60548]: 2025-11-28 08:14:02.739532448 +0000 UTC m=+0.232591781 container start 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, tcib_managed=true, io.openshift.expose-services=, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Nov 28 03:14:02 localhost systemd[1]: Created slice User Slice of UID 0. Nov 28 03:14:02 localhost python3[60350]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name collectd --cap-add IPC_LOCK --conmon-pidfile /run/collectd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=da9a0dc7b40588672419e3ce10063e21 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=collectd --label managed_by=tripleo_ansible --label config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/collectd.log --memory 512m --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro --volume /var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/collectd:/var/log/collectd:rw,z --volume /var/lib/container-config-scripts:/config-scripts:ro --volume /var/lib/container-user-scripts:/scripts:z --volume /run:/run:rw --volume /sys/fs/cgroup:/sys/fs/cgroup:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Nov 28 03:14:02 localhost 
systemd[1]: Starting User Runtime Directory /run/user/0... Nov 28 03:14:02 localhost podman[60563]: 2025-11-28 08:14:02.763954911 +0000 UTC m=+0.243689777 container init 33a8f7b4baabd0da2b2d44596de4ba044dd80b4c1cbf36f390043c4e7ddd116e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, batch=17.1_20251118.1, config_id=tripleo_step3, container_name=ceilometer_init_log, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, url=https://www.redhat.com, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:14:02 localhost systemd[1]: Finished User Runtime Directory /run/user/0. 
Nov 28 03:14:02 localhost podman[60563]: 2025-11-28 08:14:02.771952227 +0000 UTC m=+0.251687063 container start 33a8f7b4baabd0da2b2d44596de4ba044dd80b4c1cbf36f390043c4e7ddd116e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, container_name=ceilometer_init_log, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 28 03:14:02 localhost systemd[1]: Starting User Manager for UID 0... 
Nov 28 03:14:02 localhost python3[60350]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_init_log --conmon-pidfile /run/ceilometer_init_log.pid --detach=True --label config_id=tripleo_step3 --label container_name=ceilometer_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_init_log.log --network none --user root --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 /bin/bash -c chown -R ceilometer:ceilometer /var/log/ceilometer Nov 28 03:14:02 localhost systemd[1]: libpod-33a8f7b4baabd0da2b2d44596de4ba044dd80b4c1cbf36f390043c4e7ddd116e.scope: Deactivated successfully. 
Nov 28 03:14:02 localhost podman[60526]: 2025-11-28 08:14:02.787887697 +0000 UTC m=+0.303498283 container start d527d6e05b826bf85108ac755f1e40f44588049a9c22eb8644f76dede82580c1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step3, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
container_name=nova_statedir_owner, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:14:02 localhost podman[60526]: 2025-11-28 08:14:02.792298319 +0000 UTC m=+0.307908895 container attach d527d6e05b826bf85108ac755f1e40f44588049a9c22eb8644f76dede82580c1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, container_name=nova_statedir_owner, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, tcib_managed=true, version=17.1.12, release=1761123044, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:14:02 localhost systemd[1]: libpod-9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f.scope: Deactivated successfully. Nov 28 03:14:02 localhost podman[60552]: 2025-11-28 08:14:02.812971491 +0000 UTC m=+0.301771667 container init 8e7c684118204ac89680ace1a0889960b1955fa40b4d50e655f226c9c1f115be (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, distribution-scope=public, build-date=2025-11-19T00:35:22Z, container_name=nova_virtlogd_wrapper, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_id=tripleo_step3, io.buildah.version=1.41.4, managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12) Nov 28 03:14:02 localhost podman[60689]: 2025-11-28 08:14:02.832450805 +0000 UTC m=+0.038538766 container died 33a8f7b4baabd0da2b2d44596de4ba044dd80b4c1cbf36f390043c4e7ddd116e 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, container_name=ceilometer_init_log, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044) Nov 28 03:14:02 localhost podman[60526]: 2025-11-28 08:14:02.845211994 +0000 UTC m=+0.360822580 container died d527d6e05b826bf85108ac755f1e40f44588049a9c22eb8644f76dede82580c1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, distribution-scope=public, build-date=2025-11-19T00:36:58Z, 
managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_statedir_owner, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64) Nov 28 03:14:02 localhost systemd-logind[764]: Existing logind session ID 28 used by new audit session, ignoring. 
Nov 28 03:14:02 localhost podman[60707]: 2025-11-28 08:14:02.860281706 +0000 UTC m=+0.048436992 container died 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., container_name=rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, name=rhosp17/openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:49Z, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step3, io.openshift.expose-services=, batch=17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog) Nov 28 03:14:02 localhost podman[60552]: 2025-11-28 08:14:02.871543567 +0000 UTC m=+0.360343743 container start 8e7c684118204ac89680ace1a0889960b1955fa40b4d50e655f226c9c1f115be (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', 
'/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, io.openshift.expose-services=, release=1761123044, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., build-date=2025-11-19T00:35:22Z, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, container_name=nova_virtlogd_wrapper, version=17.1.12, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:14:02 localhost python3[60350]: 
ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/nova_virtlogd_wrapper.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=0f0904943dda1bf1d123bdf96d71020f --label config_id=tripleo_step3 --label container_name=nova_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtlogd_wrapper.log --network host --pid host --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro --volume 
/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 28 03:14:02 localhost podman[60707]: 2025-11-28 08:14:02.89003885 +0000 UTC m=+0.078194106 container cleanup 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.component=openstack-rsyslog-container, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, name=rhosp17/openstack-rsyslog, architecture=x86_64, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, vendor=Red Hat, Inc., distribution-scope=public) Nov 28 03:14:02 localhost systemd[1]: libpod-conmon-9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f.scope: Deactivated successfully. Nov 28 03:14:02 localhost systemd[60679]: Queued start job for default target Main User Target. Nov 28 03:14:02 localhost systemd[60679]: Created slice User Application Slice. Nov 28 03:14:02 localhost systemd[60679]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Nov 28 03:14:02 localhost systemd[60679]: Started Daily Cleanup of User's Temporary Directories. Nov 28 03:14:02 localhost systemd[60679]: Reached target Paths. Nov 28 03:14:02 localhost systemd[60679]: Reached target Timers. Nov 28 03:14:02 localhost systemd[60679]: Starting D-Bus User Message Bus Socket... Nov 28 03:14:02 localhost systemd[60679]: Starting Create User's Volatile Files and Directories... Nov 28 03:14:02 localhost systemd[60679]: Finished Create User's Volatile Files and Directories. Nov 28 03:14:02 localhost systemd[60679]: Listening on D-Bus User Message Bus Socket. Nov 28 03:14:02 localhost systemd[60679]: Reached target Sockets. Nov 28 03:14:02 localhost systemd[60679]: Reached target Basic System. 
Nov 28 03:14:02 localhost systemd[60679]: Reached target Main User Target. Nov 28 03:14:02 localhost systemd[60679]: Startup finished in 142ms. Nov 28 03:14:02 localhost systemd[1]: Started User Manager for UID 0. Nov 28 03:14:02 localhost systemd[1]: Started Session c1 of User root. Nov 28 03:14:02 localhost systemd[1]: Started Session c2 of User root. Nov 28 03:14:02 localhost podman[60630]: 2025-11-28 08:14:02.967991017 +0000 UTC m=+0.261692304 container cleanup d527d6e05b826bf85108ac755f1e40f44588049a9c22eb8644f76dede82580c1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, managed_by=tripleo_ansible, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, container_name=nova_statedir_owner, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, 
com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, tcib_managed=true, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z) Nov 28 03:14:02 localhost systemd[1]: libpod-conmon-d527d6e05b826bf85108ac755f1e40f44588049a9c22eb8644f76dede82580c1.scope: Deactivated successfully. Nov 28 03:14:02 localhost python3[60350]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_statedir_owner --conmon-pidfile /run/nova_statedir_owner.pid --detach=False --env NOVA_STATEDIR_OWNERSHIP_SKIP=triliovault-mounts --env TRIPLEO_DEPLOY_IDENTIFIER=1764316155 --env __OS_DEBUG=true --label config_id=tripleo_step3 --label container_name=nova_statedir_owner --label managed_by=tripleo_ansible --label config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_statedir_owner.log --network none --privileged=False --security-opt label=disable --user root --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume 
/var/lib/container-config-scripts:/container-config-scripts:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py Nov 28 03:14:03 localhost podman[60689]: 2025-11-28 08:14:03.013385221 +0000 UTC m=+0.219473182 container cleanup 33a8f7b4baabd0da2b2d44596de4ba044dd80b4c1cbf36f390043c4e7ddd116e (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, tcib_managed=true, container_name=ceilometer_init_log, build-date=2025-11-19T00:12:45Z, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, 
config_id=tripleo_step3, distribution-scope=public) Nov 28 03:14:03 localhost systemd[1]: libpod-conmon-33a8f7b4baabd0da2b2d44596de4ba044dd80b4c1cbf36f390043c4e7ddd116e.scope: Deactivated successfully. Nov 28 03:14:03 localhost podman[60649]: 2025-11-28 08:14:02.981051745 +0000 UTC m=+0.237521630 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, release=1761123044, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, batch=17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:14:03 localhost systemd[1]: session-c2.scope: Deactivated successfully. Nov 28 03:14:03 localhost systemd[1]: session-c1.scope: Deactivated successfully. 
Nov 28 03:14:03 localhost podman[60649]: 2025-11-28 08:14:03.067422342 +0000 UTC m=+0.323892237 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, summary=Red 
Hat OpenStack Platform 17.1 collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, architecture=x86_64) Nov 28 03:14:03 localhost podman[60649]: unhealthy Nov 28 03:14:03 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:14:03 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Failed with result 'exit-code'. 
Nov 28 03:14:03 localhost podman[60903]: 2025-11-28 08:14:03.298948788 +0000 UTC m=+0.079831378 container create 4ec37aacc81c30d8f8c1872da98bbae1184ac4d79565b18c976c336f5619bd9a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, tcib_managed=true) Nov 28 03:14:03 localhost systemd[1]: Started libpod-conmon-4ec37aacc81c30d8f8c1872da98bbae1184ac4d79565b18c976c336f5619bd9a.scope. Nov 28 03:14:03 localhost podman[60903]: 2025-11-28 08:14:03.255815326 +0000 UTC m=+0.036697926 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 28 03:14:03 localhost systemd[1]: Started libcrun container. 
Nov 28 03:14:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2373ee3a3140a9b51206ab4b5b4d03e16415bec87f47ab14239ebc6234afc02f/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2373ee3a3140a9b51206ab4b5b4d03e16415bec87f47ab14239ebc6234afc02f/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2373ee3a3140a9b51206ab4b5b4d03e16415bec87f47ab14239ebc6234afc02f/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2373ee3a3140a9b51206ab4b5b4d03e16415bec87f47ab14239ebc6234afc02f/merged/var/log/swtpm/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:03 localhost podman[60903]: 2025-11-28 08:14:03.369771467 +0000 UTC m=+0.150654017 container init 4ec37aacc81c30d8f8c1872da98bbae1184ac4d79565b18c976c336f5619bd9a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack 
Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:14:03 localhost podman[60903]: 2025-11-28 08:14:03.379424365 +0000 UTC m=+0.160306905 container start 4ec37aacc81c30d8f8c1872da98bbae1184ac4d79565b18c976c336f5619bd9a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, distribution-scope=public) Nov 28 03:14:03 localhost systemd[1]: var-lib-containers-storage-overlay-171c9fab2d92ad957aca0c5cf0ee4736f8cd2abe2dd09f25e9d89f04a4353dd2-merged.mount: Deactivated successfully. 
Nov 28 03:14:03 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d527d6e05b826bf85108ac755f1e40f44588049a9c22eb8644f76dede82580c1-userdata-shm.mount: Deactivated successfully. Nov 28 03:14:03 localhost podman[60955]: 2025-11-28 08:14:03.452210667 +0000 UTC m=+0.070528700 container create 2f34ac593fdce9350ee198d200f72a44d8feebd361e95898260c904680cf8f76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-nova-libvirt, vcs-type=git, release=1761123044, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtsecretd, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 
'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container) Nov 28 03:14:03 localhost systemd[1]: Started libpod-conmon-2f34ac593fdce9350ee198d200f72a44d8feebd361e95898260c904680cf8f76.scope. Nov 28 03:14:03 localhost systemd[1]: Started libcrun container. 
Nov 28 03:14:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e56fe7373565106f757a890fd2f61ff50243f466d810e377560c4fab1b4ea8c4/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e56fe7373565106f757a890fd2f61ff50243f466d810e377560c4fab1b4ea8c4/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e56fe7373565106f757a890fd2f61ff50243f466d810e377560c4fab1b4ea8c4/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e56fe7373565106f757a890fd2f61ff50243f466d810e377560c4fab1b4ea8c4/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e56fe7373565106f757a890fd2f61ff50243f466d810e377560c4fab1b4ea8c4/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e56fe7373565106f757a890fd2f61ff50243f466d810e377560c4fab1b4ea8c4/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e56fe7373565106f757a890fd2f61ff50243f466d810e377560c4fab1b4ea8c4/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:03 localhost podman[60955]: 2025-11-28 08:14:03.419251032 +0000 UTC m=+0.037569095 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 28 03:14:03 localhost podman[60955]: 2025-11-28 08:14:03.520239687 +0000 UTC m=+0.138557720 container init 
2f34ac593fdce9350ee198d200f72a44d8feebd361e95898260c904680cf8f76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, config_id=tripleo_step3, version=17.1.12, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_virtsecretd, name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 03:14:03 localhost podman[60955]: 2025-11-28 08:14:03.52969682 +0000 UTC m=+0.148014863 container start 2f34ac593fdce9350ee198d200f72a44d8feebd361e95898260c904680cf8f76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, architecture=x86_64, version=17.1.12, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, release=1761123044, vendor=Red Hat, Inc., vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtsecretd, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, 
com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt) Nov 28 03:14:03 localhost python3[60350]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtsecretd --cgroupns=host --conmon-pidfile /run/nova_virtsecretd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=0f0904943dda1bf1d123bdf96d71020f --label config_id=tripleo_step3 --label container_name=nova_virtsecretd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtsecretd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume 
/run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 28 03:14:03 localhost systemd-logind[764]: Existing logind session ID 28 used by new audit session, ignoring. Nov 28 03:14:03 localhost systemd[1]: Started Session c3 of User root. Nov 28 03:14:03 localhost systemd[1]: session-c3.scope: Deactivated successfully. Nov 28 03:14:03 localhost podman[61099]: 2025-11-28 08:14:03.974451927 +0000 UTC m=+0.077147693 container create 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-iscsid-container, container_name=iscsid, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Nov 28 03:14:04 localhost podman[61112]: 2025-11-28 08:14:04.011961968 +0000 UTC m=+0.084827658 container create 6c759ff14c93ee6f1945ac1deeb1ee24de00cb02b75f2d36309b911d2c2ef265 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, version=17.1.12, distribution-scope=public, container_name=nova_virtnodedevd, vendor=Red Hat, Inc., 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, vcs-type=git, architecture=x86_64, release=1761123044, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible) Nov 28 03:14:04 localhost systemd[1]: Started libpod-conmon-08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.scope. Nov 28 03:14:04 localhost podman[61099]: 2025-11-28 08:14:03.928740283 +0000 UTC m=+0.031436079 image pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Nov 28 03:14:04 localhost systemd[1]: Started libcrun container. Nov 28 03:14:04 localhost systemd[1]: Started libpod-conmon-6c759ff14c93ee6f1945ac1deeb1ee24de00cb02b75f2d36309b911d2c2ef265.scope. Nov 28 03:14:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b21fd5920b03309569c61b370f896baf7b2149eb0e6261b2fb5b94e6a082fed/merged/etc/target supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b21fd5920b03309569c61b370f896baf7b2149eb0e6261b2fb5b94e6a082fed/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:04 localhost systemd[1]: Started libcrun container. 
Nov 28 03:14:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/025413710f75ace93305dda8659754012c7e6b0ed71f82e0bba1b040134d7ebe/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/025413710f75ace93305dda8659754012c7e6b0ed71f82e0bba1b040134d7ebe/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/025413710f75ace93305dda8659754012c7e6b0ed71f82e0bba1b040134d7ebe/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/025413710f75ace93305dda8659754012c7e6b0ed71f82e0bba1b040134d7ebe/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/025413710f75ace93305dda8659754012c7e6b0ed71f82e0bba1b040134d7ebe/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/025413710f75ace93305dda8659754012c7e6b0ed71f82e0bba1b040134d7ebe/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/025413710f75ace93305dda8659754012c7e6b0ed71f82e0bba1b040134d7ebe/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:04 localhost podman[61112]: 2025-11-28 08:14:03.967827264 +0000 UTC m=+0.040693024 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 28 03:14:04 localhost podman[61112]: 2025-11-28 08:14:04.073108797 +0000 UTC m=+0.145974487 container init 
6c759ff14c93ee6f1945ac1deeb1ee24de00cb02b75f2d36309b911d2c2ef265 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, vcs-type=git, config_id=tripleo_step3, version=17.1.12, container_name=nova_virtnodedevd, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-libvirt-container, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044) Nov 28 03:14:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. 
Nov 28 03:14:04 localhost podman[61099]: 2025-11-28 08:14:04.078266322 +0000 UTC m=+0.180962088 container init 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, vcs-type=git, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 28 03:14:04 localhost podman[61112]: 2025-11-28 08:14:04.082440565 +0000 UTC m=+0.155306275 container start 6c759ff14c93ee6f1945ac1deeb1ee24de00cb02b75f2d36309b911d2c2ef265 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 
'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, config_id=tripleo_step3, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, architecture=x86_64, container_name=nova_virtnodedevd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git) Nov 28 03:14:04 localhost python3[60350]: ansible-tripleo_container_manage 
PODMAN-CONTAINER-DEBUG: podman run --name nova_virtnodedevd --cgroupns=host --conmon-pidfile /run/nova_virtnodedevd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=0f0904943dda1bf1d123bdf96d71020f --label config_id=tripleo_step3 --label container_name=nova_virtnodedevd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtnodedevd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 28 03:14:04 localhost systemd[1]: Started /usr/bin/podman 
healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:14:04 localhost systemd-logind[764]: Existing logind session ID 28 used by new audit session, ignoring. Nov 28 03:14:04 localhost podman[61099]: 2025-11-28 08:14:04.11598144 +0000 UTC m=+0.218677206 container start 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, version=17.1.12, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-iscsid, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
tcib_managed=true, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1) Nov 28 03:14:04 localhost python3[60350]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name iscsid --conmon-pidfile /run/iscsid.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=c9c242145d21d40ef98889981c05ca84 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=iscsid --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/iscsid.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Nov 28 03:14:04 localhost systemd-logind[764]: Existing logind session ID 28 used by new audit session, ignoring. Nov 28 03:14:04 localhost systemd[1]: Started Session c4 of User root. Nov 28 03:14:04 localhost systemd[1]: Started Session c5 of User root. 
Nov 28 03:14:04 localhost podman[61149]: 2025-11-28 08:14:04.234111585 +0000 UTC m=+0.095837922 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=starting, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, release=1761123044, batch=17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.12, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 
iscsid, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, distribution-scope=public) Nov 28 03:14:04 localhost systemd[1]: session-c5.scope: Deactivated successfully. Nov 28 03:14:04 localhost systemd[1]: session-c4.scope: Deactivated successfully. Nov 28 03:14:04 localhost kernel: Loading iSCSI transport class v2.0-870. Nov 28 03:14:04 localhost podman[61149]: 2025-11-28 08:14:04.286471291 +0000 UTC m=+0.148197658 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, vcs-type=git, release=1761123044, architecture=x86_64, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:14:04 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 03:14:04 localhost podman[61274]: 2025-11-28 08:14:04.736149756 +0000 UTC m=+0.082520285 container create 635237f1fe33e7fa6bf9dc6e26e9b13e7bb3027f3e4c594c17ae527cf8241951 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.12, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', 
'/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, container_name=nova_virtstoraged, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, batch=17.1_20251118.1, url=https://www.redhat.com, release=1761123044, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true) Nov 28 03:14:04 localhost systemd[1]: Started libpod-conmon-635237f1fe33e7fa6bf9dc6e26e9b13e7bb3027f3e4c594c17ae527cf8241951.scope. Nov 28 03:14:04 localhost systemd[1]: Started libcrun container. 
Nov 28 03:14:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc6f1986e221e102f7feefd3b96110a2bdd081d61203281438672df579c3b8f4/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc6f1986e221e102f7feefd3b96110a2bdd081d61203281438672df579c3b8f4/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc6f1986e221e102f7feefd3b96110a2bdd081d61203281438672df579c3b8f4/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc6f1986e221e102f7feefd3b96110a2bdd081d61203281438672df579c3b8f4/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc6f1986e221e102f7feefd3b96110a2bdd081d61203281438672df579c3b8f4/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc6f1986e221e102f7feefd3b96110a2bdd081d61203281438672df579c3b8f4/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc6f1986e221e102f7feefd3b96110a2bdd081d61203281438672df579c3b8f4/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:04 localhost podman[61274]: 2025-11-28 08:14:04.696865448 +0000 UTC m=+0.043235957 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 28 03:14:04 localhost podman[61274]: 2025-11-28 08:14:04.800336272 +0000 UTC m=+0.146706771 container init 
635237f1fe33e7fa6bf9dc6e26e9b13e7bb3027f3e4c594c17ae527cf8241951 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-type=git, container_name=nova_virtstoraged, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, architecture=x86_64, release=1761123044, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:14:04 localhost podman[61274]: 2025-11-28 08:14:04.814030141 +0000 UTC m=+0.160400640 container start 635237f1fe33e7fa6bf9dc6e26e9b13e7bb3027f3e4c594c17ae527cf8241951 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack 
Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, batch=17.1_20251118.1, 
tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtstoraged, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12) Nov 28 03:14:04 localhost python3[60350]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtstoraged --cgroupns=host --conmon-pidfile /run/nova_virtstoraged.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=0f0904943dda1bf1d123bdf96d71020f --label config_id=tripleo_step3 --label container_name=nova_virtstoraged --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', 
'/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtstoraged.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume 
/var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 28 03:14:04 localhost systemd-logind[764]: Existing logind session ID 28 used by new audit session, ignoring. Nov 28 03:14:04 localhost systemd[1]: Started Session c6 of User root. Nov 28 03:14:04 localhost systemd[1]: session-c6.scope: Deactivated successfully. Nov 28 03:14:05 localhost podman[61378]: 2025-11-28 08:14:05.288981405 +0000 UTC m=+0.085195340 container create 60f2b639dc4dc451fa73648b6521061f232e851646b473a91ae4d921fb334057 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, build-date=2025-11-19T00:35:22Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step3, container_name=nova_virtqemud, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12) Nov 28 03:14:05 localhost systemd[1]: Started libpod-conmon-60f2b639dc4dc451fa73648b6521061f232e851646b473a91ae4d921fb334057.scope. Nov 28 03:14:05 localhost systemd[1]: Started libcrun container. Nov 28 03:14:05 localhost podman[61378]: 2025-11-28 08:14:05.247877088 +0000 UTC m=+0.044091063 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 28 03:14:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/850a5494078c53945d7c69a5c914ed859c31bc07523103c44dd52329f5c128d7/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/850a5494078c53945d7c69a5c914ed859c31bc07523103c44dd52329f5c128d7/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/850a5494078c53945d7c69a5c914ed859c31bc07523103c44dd52329f5c128d7/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/850a5494078c53945d7c69a5c914ed859c31bc07523103c44dd52329f5c128d7/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/850a5494078c53945d7c69a5c914ed859c31bc07523103c44dd52329f5c128d7/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:05 localhost kernel: xfs filesystem being remounted at 
/var/lib/containers/storage/overlay/850a5494078c53945d7c69a5c914ed859c31bc07523103c44dd52329f5c128d7/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/850a5494078c53945d7c69a5c914ed859c31bc07523103c44dd52329f5c128d7/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/850a5494078c53945d7c69a5c914ed859c31bc07523103c44dd52329f5c128d7/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:05 localhost podman[61378]: 2025-11-28 08:14:05.360531347 +0000 UTC m=+0.156745322 container init 60f2b639dc4dc451fa73648b6521061f232e851646b473a91ae4d921fb334057 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, vendor=Red Hat, Inc., release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, url=https://www.redhat.com, name=rhosp17/openstack-nova-libvirt, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, config_id=tripleo_step3, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtqemud, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, vcs-type=git, 
io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:14:05 localhost podman[61378]: 2025-11-28 08:14:05.370534657 +0000 UTC m=+0.166748612 container start 60f2b639dc4dc451fa73648b6521061f232e851646b473a91ae4d921fb334057 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-nova-libvirt, batch=17.1_20251118.1, config_id=tripleo_step3, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtqemud, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, url=https://www.redhat.com) Nov 28 03:14:05 localhost python3[60350]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud --cgroupns=host --conmon-pidfile /run/nova_virtqemud.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=0f0904943dda1bf1d123bdf96d71020f --label config_id=tripleo_step3 --label container_name=nova_virtqemud --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 
--volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 28 03:14:05 localhost systemd-logind[764]: Existing logind session ID 28 used by new audit session, ignoring. Nov 28 03:14:05 localhost systemd[1]: Started Session c7 of User root. Nov 28 03:14:05 localhost systemd[1]: session-c7.scope: Deactivated successfully. 
Nov 28 03:14:05 localhost podman[61486]: 2025-11-28 08:14:05.771739078 +0000 UTC m=+0.069586819 container create 76872bed08c91831e19aa7b39a2f6d28609fb77cf6edfbfc03abad69a6c5be50 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_virtproxyd, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, release=1761123044, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 03:14:05 localhost systemd[1]: Started libpod-conmon-76872bed08c91831e19aa7b39a2f6d28609fb77cf6edfbfc03abad69a6c5be50.scope. Nov 28 03:14:05 localhost systemd[1]: Started libcrun container. 
Nov 28 03:14:05 localhost podman[61486]: 2025-11-28 08:14:05.73431287 +0000 UTC m=+0.032160661 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 28 03:14:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2ed6a910a473599a3c0eb40fb417f8a3b4e8b49460b9c3a2f878bde6830d5976/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2ed6a910a473599a3c0eb40fb417f8a3b4e8b49460b9c3a2f878bde6830d5976/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2ed6a910a473599a3c0eb40fb417f8a3b4e8b49460b9c3a2f878bde6830d5976/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2ed6a910a473599a3c0eb40fb417f8a3b4e8b49460b9c3a2f878bde6830d5976/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2ed6a910a473599a3c0eb40fb417f8a3b4e8b49460b9c3a2f878bde6830d5976/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2ed6a910a473599a3c0eb40fb417f8a3b4e8b49460b9c3a2f878bde6830d5976/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2ed6a910a473599a3c0eb40fb417f8a3b4e8b49460b9c3a2f878bde6830d5976/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:05 localhost podman[61486]: 2025-11-28 08:14:05.844333414 +0000 UTC m=+0.142181185 container init 
76872bed08c91831e19aa7b39a2f6d28609fb77cf6edfbfc03abad69a6c5be50 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:35:22Z, vendor=Red Hat, Inc., container_name=nova_virtproxyd, config_id=tripleo_step3, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, release=1761123044, distribution-scope=public, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, vcs-type=git, name=rhosp17/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt) Nov 28 03:14:05 localhost podman[61486]: 2025-11-28 08:14:05.856191385 +0000 UTC m=+0.154039156 container start 76872bed08c91831e19aa7b39a2f6d28609fb77cf6edfbfc03abad69a6c5be50 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, config_id=tripleo_step3, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, container_name=nova_virtproxyd, com.redhat.component=openstack-nova-libvirt-container, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt) Nov 28 03:14:05 localhost python3[60350]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtproxyd --cgroupns=host --conmon-pidfile /run/nova_virtproxyd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=0f0904943dda1bf1d123bdf96d71020f --label config_id=tripleo_step3 --label container_name=nova_virtproxyd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtproxyd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume 
/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 28 03:14:05 localhost systemd-logind[764]: Existing logind session ID 28 used by new audit session, ignoring. Nov 28 03:14:05 localhost systemd[1]: Started Session c8 of User root. Nov 28 03:14:06 localhost systemd[1]: session-c8.scope: Deactivated successfully. Nov 28 03:14:06 localhost python3[61567]: ansible-file Invoked with path=/etc/systemd/system/tripleo_collectd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:14:06 localhost python3[61583]: ansible-file Invoked with path=/etc/systemd/system/tripleo_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:14:06 localhost python3[61599]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.requires state=absent recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:14:07 localhost python3[61615]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:14:07 localhost python3[61631]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:14:07 localhost python3[61647]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:14:08 localhost python3[61663]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None 
access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:14:08 localhost python3[61679]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:14:08 localhost python3[61695]: ansible-file Invoked with path=/etc/systemd/system/tripleo_rsyslog.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:14:08 localhost python3[61711]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_collectd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 03:14:09 localhost python3[61727]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_iscsid_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 03:14:09 localhost python3[61743]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 03:14:09 localhost python3[61759]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 03:14:09 localhost 
python3[61775]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 03:14:09 localhost python3[61791]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 03:14:10 localhost python3[61807]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 03:14:10 localhost python3[61823]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 03:14:10 localhost python3[61839]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_rsyslog_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 03:14:11 localhost python3[61900]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317650.7635152-101024-248536602527523/source dest=/etc/systemd/system/tripleo_collectd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:14:11 localhost python3[61930]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317650.7635152-101024-248536602527523/source dest=/etc/systemd/system/tripleo_iscsid.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None 
content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:14:12 localhost python3[61959]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317650.7635152-101024-248536602527523/source dest=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:14:12 localhost python3[61988]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317650.7635152-101024-248536602527523/source dest=/etc/systemd/system/tripleo_nova_virtnodedevd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:14:13 localhost python3[62017]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317650.7635152-101024-248536602527523/source dest=/etc/systemd/system/tripleo_nova_virtproxyd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:14:13 localhost python3[62046]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317650.7635152-101024-248536602527523/source dest=/etc/systemd/system/tripleo_nova_virtqemud.service mode=0644 owner=root group=root backup=False 
force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:14:14 localhost python3[62075]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317650.7635152-101024-248536602527523/source dest=/etc/systemd/system/tripleo_nova_virtsecretd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:14:14 localhost python3[62104]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317650.7635152-101024-248536602527523/source dest=/etc/systemd/system/tripleo_nova_virtstoraged.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:14:15 localhost python3[62133]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317650.7635152-101024-248536602527523/source dest=/etc/systemd/system/tripleo_rsyslog.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:14:15 localhost python3[62149]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 28 03:14:15 localhost 
systemd[1]: Reloading. Nov 28 03:14:15 localhost systemd-sysv-generator[62179]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:14:15 localhost systemd-rc-local-generator[62176]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:14:15 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:14:16 localhost systemd[1]: Stopping User Manager for UID 0... Nov 28 03:14:16 localhost systemd[60679]: Activating special unit Exit the Session... Nov 28 03:14:16 localhost systemd[60679]: Stopped target Main User Target. Nov 28 03:14:16 localhost systemd[60679]: Stopped target Basic System. Nov 28 03:14:16 localhost systemd[60679]: Stopped target Paths. Nov 28 03:14:16 localhost systemd[60679]: Stopped target Sockets. Nov 28 03:14:16 localhost systemd[60679]: Stopped target Timers. Nov 28 03:14:16 localhost systemd[60679]: Stopped Daily Cleanup of User's Temporary Directories. Nov 28 03:14:16 localhost systemd[60679]: Closed D-Bus User Message Bus Socket. Nov 28 03:14:16 localhost systemd[60679]: Stopped Create User's Volatile Files and Directories. Nov 28 03:14:16 localhost systemd[60679]: Removed slice User Application Slice. Nov 28 03:14:16 localhost systemd[60679]: Reached target Shutdown. Nov 28 03:14:16 localhost systemd[60679]: Finished Exit the Session. Nov 28 03:14:16 localhost systemd[60679]: Reached target Exit the Session. Nov 28 03:14:16 localhost systemd[1]: user@0.service: Deactivated successfully. Nov 28 03:14:16 localhost systemd[1]: Stopped User Manager for UID 0. Nov 28 03:14:16 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... 
Nov 28 03:14:16 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Nov 28 03:14:16 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Nov 28 03:14:16 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Nov 28 03:14:16 localhost systemd[1]: Removed slice User Slice of UID 0. Nov 28 03:14:16 localhost python3[62202]: ansible-systemd Invoked with state=restarted name=tripleo_collectd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 03:14:16 localhost systemd[1]: Reloading. Nov 28 03:14:16 localhost systemd-sysv-generator[62231]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:14:16 localhost systemd-rc-local-generator[62227]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:14:16 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:14:17 localhost systemd[1]: Starting collectd container... Nov 28 03:14:17 localhost systemd[1]: Started collectd container. Nov 28 03:14:17 localhost python3[62269]: ansible-systemd Invoked with state=restarted name=tripleo_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 03:14:17 localhost systemd[1]: Reloading. Nov 28 03:14:17 localhost systemd-sysv-generator[62298]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 28 03:14:17 localhost systemd-rc-local-generator[62294]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:14:17 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:14:18 localhost systemd[1]: Starting iscsid container... Nov 28 03:14:18 localhost systemd[1]: Started iscsid container. Nov 28 03:14:18 localhost python3[62336]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtlogd_wrapper.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 03:14:18 localhost systemd[1]: Reloading. Nov 28 03:14:18 localhost systemd-sysv-generator[62364]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:14:18 localhost systemd-rc-local-generator[62360]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:14:18 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:14:19 localhost systemd[1]: Starting nova_virtlogd_wrapper container... Nov 28 03:14:19 localhost systemd[1]: Started nova_virtlogd_wrapper container. Nov 28 03:14:19 localhost python3[62403]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtnodedevd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 03:14:20 localhost systemd[1]: Reloading. Nov 28 03:14:20 localhost systemd-sysv-generator[62436]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:14:20 localhost systemd-rc-local-generator[62432]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:14:20 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:14:21 localhost systemd[1]: Starting nova_virtnodedevd container... Nov 28 03:14:21 localhost tripleo-start-podman-container[62443]: Creating additional drop-in dependency for "nova_virtnodedevd" (6c759ff14c93ee6f1945ac1deeb1ee24de00cb02b75f2d36309b911d2c2ef265) Nov 28 03:14:21 localhost systemd[1]: Reloading. Nov 28 03:14:21 localhost systemd-rc-local-generator[62501]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:14:21 localhost systemd-sysv-generator[62504]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:14:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:14:21 localhost systemd[1]: Started nova_virtnodedevd container. Nov 28 03:14:22 localhost python3[62528]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtproxyd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 03:14:22 localhost systemd[1]: Reloading. Nov 28 03:14:22 localhost systemd-rc-local-generator[62553]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:14:22 localhost systemd-sysv-generator[62556]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:14:22 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:14:22 localhost systemd[1]: Starting nova_virtproxyd container... Nov 28 03:14:22 localhost tripleo-start-podman-container[62567]: Creating additional drop-in dependency for "nova_virtproxyd" (76872bed08c91831e19aa7b39a2f6d28609fb77cf6edfbfc03abad69a6c5be50) Nov 28 03:14:22 localhost systemd[1]: Reloading. Nov 28 03:14:22 localhost systemd-sysv-generator[62628]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:14:22 localhost systemd-rc-local-generator[62625]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:14:22 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:14:22 localhost systemd[1]: Started nova_virtproxyd container. Nov 28 03:14:23 localhost python3[62652]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtqemud.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 03:14:23 localhost systemd[1]: Reloading. Nov 28 03:14:23 localhost systemd-sysv-generator[62682]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:14:23 localhost systemd-rc-local-generator[62677]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 28 03:14:23 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:14:23 localhost systemd[1]: Starting nova_virtqemud container... Nov 28 03:14:23 localhost tripleo-start-podman-container[62692]: Creating additional drop-in dependency for "nova_virtqemud" (60f2b639dc4dc451fa73648b6521061f232e851646b473a91ae4d921fb334057) Nov 28 03:14:23 localhost systemd[1]: Reloading. Nov 28 03:14:24 localhost systemd-sysv-generator[62753]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:14:24 localhost systemd-rc-local-generator[62749]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:14:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:14:24 localhost systemd[1]: Started nova_virtqemud container. Nov 28 03:14:25 localhost python3[62775]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtsecretd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 03:14:25 localhost systemd[1]: Reloading. Nov 28 03:14:25 localhost systemd-sysv-generator[62806]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:14:25 localhost systemd-rc-local-generator[62800]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 28 03:14:25 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:14:25 localhost systemd[1]: Starting nova_virtsecretd container... Nov 28 03:14:25 localhost tripleo-start-podman-container[62814]: Creating additional drop-in dependency for "nova_virtsecretd" (2f34ac593fdce9350ee198d200f72a44d8feebd361e95898260c904680cf8f76) Nov 28 03:14:25 localhost systemd[1]: Reloading. Nov 28 03:14:25 localhost systemd-rc-local-generator[62868]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:14:25 localhost systemd-sysv-generator[62874]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:14:25 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:14:25 localhost systemd[1]: Started nova_virtsecretd container. Nov 28 03:14:26 localhost python3[62897]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtstoraged.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 03:14:26 localhost systemd[1]: Reloading. Nov 28 03:14:26 localhost systemd-sysv-generator[62925]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:14:26 localhost systemd-rc-local-generator[62921]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 28 03:14:26 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:14:26 localhost systemd[1]: Starting nova_virtstoraged container... Nov 28 03:14:27 localhost tripleo-start-podman-container[62937]: Creating additional drop-in dependency for "nova_virtstoraged" (635237f1fe33e7fa6bf9dc6e26e9b13e7bb3027f3e4c594c17ae527cf8241951) Nov 28 03:14:27 localhost systemd[1]: Reloading. Nov 28 03:14:27 localhost systemd-rc-local-generator[62995]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:14:27 localhost systemd-sysv-generator[62998]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:14:27 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:14:27 localhost systemd[1]: Started nova_virtstoraged container. Nov 28 03:14:27 localhost python3[63022]: ansible-systemd Invoked with state=restarted name=tripleo_rsyslog.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 03:14:27 localhost systemd[1]: Reloading. Nov 28 03:14:28 localhost systemd-sysv-generator[63053]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:14:28 localhost systemd-rc-local-generator[63049]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 28 03:14:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:14:28 localhost systemd[1]: Starting rsyslog container... Nov 28 03:14:28 localhost systemd[1]: Started libcrun container. Nov 28 03:14:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f82224ba72e48969d33ced9a8f63115983b371ba24f399dbf64e0b6c5b984d17/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f82224ba72e48969d33ced9a8f63115983b371ba24f399dbf64e0b6c5b984d17/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:28 localhost podman[63061]: 2025-11-28 08:14:28.454550936 +0000 UTC m=+0.118281470 container init 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, container_name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, distribution-scope=public, build-date=2025-11-18T22:49:49Z, name=rhosp17/openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, release=1761123044, vcs-type=git, tcib_managed=true, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, config_id=tripleo_step3) Nov 28 03:14:28 localhost podman[63061]: 2025-11-28 08:14:28.465828027 +0000 UTC m=+0.129558551 container start 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, build-date=2025-11-18T22:49:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, container_name=rsyslog, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.buildah.version=1.41.4, maintainer=OpenStack 
TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-rsyslog, com.redhat.component=openstack-rsyslog-container, config_id=tripleo_step3, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git) Nov 28 03:14:28 localhost 
podman[63061]: rsyslog Nov 28 03:14:28 localhost systemd[1]: Started rsyslog container. Nov 28 03:14:28 localhost systemd[1]: libpod-9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f.scope: Deactivated successfully. Nov 28 03:14:28 localhost podman[63098]: 2025-11-28 08:14:28.634159219 +0000 UTC m=+0.050897231 container died 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, version=17.1.12, build-date=2025-11-18T22:49:49Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, com.redhat.component=openstack-rsyslog-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', 
'/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, vcs-type=git) Nov 28 03:14:28 localhost podman[63098]: 2025-11-28 08:14:28.660991279 +0000 UTC m=+0.077729261 container cleanup 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, tcib_managed=true, container_name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-rsyslog-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:49Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, name=rhosp17/openstack-rsyslog, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}) Nov 28 03:14:28 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:14:28 localhost podman[63110]: 2025-11-28 08:14:28.747010214 +0000 UTC m=+0.061713607 container cleanup 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, name=rhosp17/openstack-rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-18T22:49:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, container_name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, tcib_managed=true, batch=17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-rsyslog-container) Nov 28 03:14:28 localhost podman[63110]: rsyslog Nov 28 03:14:28 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Nov 28 03:14:28 localhost python3[63137]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks3.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:14:28 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 1. Nov 28 03:14:28 localhost systemd[1]: Stopped rsyslog container. Nov 28 03:14:29 localhost systemd[1]: Starting rsyslog container... Nov 28 03:14:29 localhost systemd[1]: Started libcrun container. 
Nov 28 03:14:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f82224ba72e48969d33ced9a8f63115983b371ba24f399dbf64e0b6c5b984d17/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f82224ba72e48969d33ced9a8f63115983b371ba24f399dbf64e0b6c5b984d17/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:29 localhost podman[63138]: 2025-11-28 08:14:29.094445474 +0000 UTC m=+0.083399033 container init 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.buildah.version=1.41.4, vcs-type=git, container_name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, build-date=2025-11-18T22:49:49Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-rsyslog, batch=17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64) Nov 28 03:14:29 localhost podman[63138]: 2025-11-28 08:14:29.10306956 +0000 UTC m=+0.092023159 container start 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, name=rhosp17/openstack-rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 
'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.buildah.version=1.41.4, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=rsyslog, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, build-date=2025-11-18T22:49:49Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog) Nov 28 03:14:29 localhost podman[63138]: rsyslog Nov 28 03:14:29 localhost systemd[1]: Started rsyslog container. Nov 28 03:14:29 localhost systemd[1]: libpod-9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f.scope: Deactivated successfully. 
Nov 28 03:14:29 localhost podman[63161]: 2025-11-28 08:14:29.2619651 +0000 UTC m=+0.050376124 container died 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=rsyslog, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-rsyslog, name=rhosp17/openstack-rsyslog, build-date=2025-11-18T22:49:49Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:14:29 localhost podman[63161]: 2025-11-28 08:14:29.289973197 +0000 UTC m=+0.078384191 container cleanup 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
vendor=Red Hat, Inc., container_name=rsyslog, build-date=2025-11-18T22:49:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, name=rhosp17/openstack-rsyslog, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, release=1761123044, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:14:29 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:14:29 localhost podman[63187]: 2025-11-28 08:14:29.374640579 +0000 UTC m=+0.058084941 container cleanup 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, container_name=rsyslog, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-rsyslog-container, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, 
build-date=2025-11-18T22:49:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, release=1761123044, name=rhosp17/openstack-rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git) Nov 28 03:14:29 localhost podman[63187]: rsyslog Nov 28 03:14:29 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 
'exit-code'. Nov 28 03:14:29 localhost systemd[1]: var-lib-containers-storage-overlay-f82224ba72e48969d33ced9a8f63115983b371ba24f399dbf64e0b6c5b984d17-merged.mount: Deactivated successfully. Nov 28 03:14:29 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f-userdata-shm.mount: Deactivated successfully. Nov 28 03:14:29 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 2. Nov 28 03:14:29 localhost systemd[1]: Stopped rsyslog container. Nov 28 03:14:29 localhost systemd[1]: Starting rsyslog container... Nov 28 03:14:29 localhost systemd[1]: Started libcrun container. Nov 28 03:14:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f82224ba72e48969d33ced9a8f63115983b371ba24f399dbf64e0b6c5b984d17/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f82224ba72e48969d33ced9a8f63115983b371ba24f399dbf64e0b6c5b984d17/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:29 localhost podman[63234]: 2025-11-28 08:14:29.624627857 +0000 UTC m=+0.108969382 container init 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-rsyslog-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-18T22:49:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog) Nov 28 03:14:29 localhost podman[63234]: 2025-11-28 08:14:29.633172891 +0000 UTC m=+0.117514456 container start 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f 
(image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, version=17.1.12, release=1761123044, name=rhosp17/openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-18T22:49:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=rsyslog, distribution-scope=public, com.redhat.component=openstack-rsyslog-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git) Nov 28 03:14:29 localhost podman[63234]: rsyslog Nov 28 03:14:29 localhost systemd[1]: Started rsyslog container. Nov 28 03:14:29 localhost systemd[1]: libpod-9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f.scope: Deactivated successfully. Nov 28 03:14:29 localhost podman[63280]: 2025-11-28 08:14:29.776507522 +0000 UTC m=+0.050921682 container died 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20251118.1, version=17.1.12, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, container_name=rsyslog, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:49Z, name=rhosp17/openstack-rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:14:29 localhost podman[63280]: 2025-11-28 08:14:29.799346673 +0000 UTC m=+0.073760793 container cleanup 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:49Z, io.buildah.version=1.41.4, container_name=rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:14:29 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE 
Nov 28 03:14:29 localhost podman[63312]: 2025-11-28 08:14:29.894724659 +0000 UTC m=+0.063242037 container cleanup 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, architecture=x86_64, container_name=rsyslog, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-type=git, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:49Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-rsyslog, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.expose-services=) Nov 28 03:14:29 localhost podman[63312]: rsyslog Nov 28 03:14:29 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Nov 28 03:14:30 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 3. Nov 28 03:14:30 localhost systemd[1]: Stopped rsyslog container. Nov 28 03:14:30 localhost systemd[1]: Starting rsyslog container... Nov 28 03:14:30 localhost systemd[1]: Started libcrun container. 
Nov 28 03:14:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f82224ba72e48969d33ced9a8f63115983b371ba24f399dbf64e0b6c5b984d17/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f82224ba72e48969d33ced9a8f63115983b371ba24f399dbf64e0b6c5b984d17/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:30 localhost podman[63356]: 2025-11-28 08:14:30.366316996 +0000 UTC m=+0.111553305 container init 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, config_id=tripleo_step3, name=rhosp17/openstack-rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=rsyslog, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4) Nov 28 03:14:30 localhost podman[63356]: 2025-11-28 08:14:30.375127128 +0000 UTC m=+0.120363427 container start 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, build-date=2025-11-18T22:49:49Z, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 
'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, name=rhosp17/openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, com.redhat.component=openstack-rsyslog-container) Nov 28 03:14:30 localhost podman[63356]: rsyslog Nov 28 03:14:30 localhost systemd[1]: Started rsyslog container. 
Nov 28 03:14:30 localhost python3[63355]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks3.json short_hostname=np0005538513 step=3 update_config_hash_only=False Nov 28 03:14:30 localhost systemd[1]: libpod-9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f.scope: Deactivated successfully. Nov 28 03:14:30 localhost podman[63379]: 2025-11-28 08:14:30.516700842 +0000 UTC m=+0.040275870 container died 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, config_id=tripleo_step3, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-rsyslog-container, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-rsyslog, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-18T22:49:49Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}) Nov 28 03:14:30 localhost systemd[1]: tmp-crun.t8EM5O.mount: Deactivated successfully. Nov 28 03:14:30 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f-userdata-shm.mount: Deactivated successfully. Nov 28 03:14:30 localhost systemd[1]: var-lib-containers-storage-overlay-f82224ba72e48969d33ced9a8f63115983b371ba24f399dbf64e0b6c5b984d17-merged.mount: Deactivated successfully. 
Nov 28 03:14:30 localhost podman[63379]: 2025-11-28 08:14:30.549441851 +0000 UTC m=+0.073016829 container cleanup 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, container_name=rsyslog, name=rhosp17/openstack-rsyslog, version=17.1.12, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-rsyslog-container, build-date=2025-11-18T22:49:49Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, 
konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1) Nov 28 03:14:30 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:14:30 localhost podman[63393]: 2025-11-28 08:14:30.627552454 +0000 UTC m=+0.054713043 container cleanup 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, batch=17.1_20251118.1, io.openshift.expose-services=, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, container_name=rsyslog, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-rsyslog, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, 
com.redhat.component=openstack-rsyslog-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, version=17.1.12, build-date=2025-11-18T22:49:49Z) Nov 28 03:14:30 localhost podman[63393]: rsyslog Nov 28 03:14:30 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Nov 28 03:14:30 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 4. Nov 28 03:14:30 localhost systemd[1]: Stopped rsyslog container. Nov 28 03:14:30 localhost systemd[1]: Starting rsyslog container... Nov 28 03:14:30 localhost systemd[1]: Started libcrun container. 
Nov 28 03:14:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f82224ba72e48969d33ced9a8f63115983b371ba24f399dbf64e0b6c5b984d17/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f82224ba72e48969d33ced9a8f63115983b371ba24f399dbf64e0b6c5b984d17/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Nov 28 03:14:30 localhost podman[63405]: 2025-11-28 08:14:30.848321086 +0000 UTC m=+0.093311720 container init 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:49Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, container_name=rsyslog, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, name=rhosp17/openstack-rsyslog, config_id=tripleo_step3, distribution-scope=public, com.redhat.component=openstack-rsyslog-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, release=1761123044) Nov 28 03:14:30 localhost podman[63405]: 2025-11-28 08:14:30.857290703 +0000 UTC m=+0.102281327 container start 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, url=https://www.redhat.com, io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, distribution-scope=public, name=rhosp17/openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, container_name=rsyslog) Nov 28 03:14:30 localhost podman[63405]: rsyslog Nov 28 03:14:30 localhost systemd[1]: Started rsyslog container. Nov 28 03:14:30 localhost systemd[1]: libpod-9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f.scope: Deactivated successfully. 
Nov 28 03:14:31 localhost podman[63440]: 2025-11-28 08:14:31.009942223 +0000 UTC m=+0.054200398 container died 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.buildah.version=1.41.4, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, maintainer=OpenStack TripleO Team, version=17.1.12, 
io.openshift.expose-services=, url=https://www.redhat.com, container_name=rsyslog, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-rsyslog-container, description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:49Z, release=1761123044) Nov 28 03:14:31 localhost podman[63440]: 2025-11-28 08:14:31.035429039 +0000 UTC m=+0.079687184 container cleanup 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:49Z, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, managed_by=tripleo_ansible, name=rhosp17/openstack-rsyslog, batch=17.1_20251118.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-rsyslog-container, version=17.1.12, release=1761123044, url=https://www.redhat.com, container_name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true) Nov 28 03:14:31 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:14:31 localhost podman[63456]: 2025-11-28 08:14:31.145859016 +0000 UTC m=+0.078742563 container cleanup 9173091271b5f42db5122870fff24ef0773893527458e03bbea4ff51c03efe4f (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '138ccb6252fd89d73a6c37a3f993f3eb'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, container_name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, name=rhosp17/openstack-rsyslog, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:49Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-type=git, distribution-scope=public) Nov 28 03:14:31 localhost podman[63456]: rsyslog Nov 28 03:14:31 localhost systemd[1]: 
tripleo_rsyslog.service: Failed with result 'exit-code'. Nov 28 03:14:31 localhost python3[63454]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:14:31 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 5. Nov 28 03:14:31 localhost systemd[1]: Stopped rsyslog container. Nov 28 03:14:31 localhost systemd[1]: tripleo_rsyslog.service: Start request repeated too quickly. Nov 28 03:14:31 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Nov 28 03:14:31 localhost systemd[1]: Failed to start rsyslog container. Nov 28 03:14:31 localhost python3[63483]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_3 config_pattern=container-puppet-*.json config_overrides={} debug=True Nov 28 03:14:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
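The sequence above shows `tripleo_rsyslog.service` cycling through start/died/cleanup until systemd gives up ("Start request repeated too quickly", restart counter at 5). When triaging a log like this, a small parser can pull out which units are flapping and how far their restart counters got. This is a minimal sketch, assuming the journald phrasing seen in these lines ("Scheduled restart job, restart counter is at N" and "Failed with result '...'"); it is not part of any TripleO tooling.

```python
import re

# Patterns matching the systemd messages observed in the log above.
RESTART_RE = re.compile(r"(\S+\.service): Scheduled restart job, restart counter is at (\d+)")
FAILED_RE = re.compile(r"(\S+\.service): Failed with result '([^']+)'")

def summarize(lines):
    """Return (max restart counter per unit, last failure result per unit)."""
    counters, results = {}, {}
    for line in lines:
        m = RESTART_RE.search(line)
        if m:
            unit, n = m.group(1), int(m.group(2))
            counters[unit] = max(counters.get(unit, 0), n)
        m = FAILED_RE.search(line)
        if m:
            results[m.group(1)] = m.group(2)
    return counters, results

# Sample lines taken from the journal entries above.
sample = [
    "Nov 28 03:14:31 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.",
    "Nov 28 03:14:31 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 5.",
]
counters, results = summarize(sample)
print(counters, results)
```

Feeding it the full journal (e.g. `journalctl -u tripleo_rsyslog` piped in) would show the counter climbing 4, 5 before systemd stops retrying; the actual exit reason still has to come from the container's own stdout under /var/log/containers/stdouts.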
Nov 28 03:14:32 localhost podman[63484]: 2025-11-28 08:14:32.83641877 +0000 UTC m=+0.070350255 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, architecture=x86_64, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack 
Platform 17.1 qdrouterd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:14:33 localhost podman[63484]: 2025-11-28 08:14:33.031154368 +0000 UTC m=+0.265085823 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, vcs-type=git, version=17.1.12, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, tcib_managed=true, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, release=1761123044) Nov 28 03:14:33 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:14:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. 
Nov 28 03:14:33 localhost podman[63513]: 2025-11-28 08:14:33.843738807 +0000 UTC m=+0.082619797 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, architecture=x86_64) Nov 28 03:14:33 localhost podman[63513]: 2025-11-28 08:14:33.85941667 +0000 UTC m=+0.098297660 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true) Nov 28 03:14:33 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:14:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:14:34 localhost systemd[1]: tmp-crun.yo1hfP.mount: Deactivated successfully. 
Nov 28 03:14:34 localhost podman[63533]: 2025-11-28 08:14:34.879357891 +0000 UTC m=+0.086201652 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:14:34 localhost podman[63533]: 2025-11-28 08:14:34.920409536 +0000 UTC m=+0.127253257 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Nov 28 03:14:34 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:15:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:15:03 localhost systemd[1]: tmp-crun.PhwKBI.mount: Deactivated successfully. 
Nov 28 03:15:03 localhost podman[63631]: 2025-11-28 08:15:03.863100305 +0000 UTC m=+0.096959789 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, 
name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.buildah.version=1.41.4, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public) Nov 28 03:15:04 localhost podman[63631]: 2025-11-28 08:15:04.073683399 +0000 UTC m=+0.307542903 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, io.buildah.version=1.41.4, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=) Nov 28 03:15:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:15:04 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:15:04 localhost systemd[1]: tmp-crun.1kaJ0p.mount: Deactivated successfully. 
Nov 28 03:15:04 localhost podman[63660]: 2025-11-28 08:15:04.193255481 +0000 UTC m=+0.087343560 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, version=17.1.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, build-date=2025-11-18T22:51:28Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container) Nov 28 03:15:04 localhost podman[63660]: 2025-11-28 08:15:04.205405578 +0000 UTC m=+0.099493647 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, url=https://www.redhat.com, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd) Nov 28 03:15:04 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:15:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. 
Nov 28 03:15:05 localhost podman[63681]: 2025-11-28 08:15:05.841343462 +0000 UTC m=+0.079284788 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, vcs-type=git, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:15:05 localhost podman[63681]: 2025-11-28 08:15:05.878471437 +0000 UTC m=+0.116412733 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_id=tripleo_step3, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, version=17.1.12) Nov 28 03:15:05 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:15:24 localhost sshd[63699]: main: sshd: ssh-rsa algorithm is disabled Nov 28 03:15:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:15:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:15:35 localhost podman[63701]: 2025-11-28 08:15:35.378177356 +0000 UTC m=+0.066781229 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, architecture=x86_64, distribution-scope=public, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:15:35 localhost podman[63701]: 2025-11-28 08:15:35.387983041 +0000 UTC m=+0.076586884 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, version=17.1.12, release=1761123044) Nov 28 03:15:35 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. 
Nov 28 03:15:35 localhost podman[63702]: 2025-11-28 08:15:35.392815371 +0000 UTC m=+0.073895800 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step1, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=metrics_qdr, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044) Nov 28 03:15:35 localhost podman[63702]: 2025-11-28 08:15:35.585682744 +0000 UTC m=+0.266763233 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, 
io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, release=1761123044) Nov 28 03:15:35 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:15:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:15:36 localhost systemd[1]: tmp-crun.j22Ket.mount: Deactivated successfully. 
Nov 28 03:15:36 localhost podman[63752]: 2025-11-28 08:15:36.845215272 +0000 UTC m=+0.080153705 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, tcib_managed=true, container_name=iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Nov 28 03:15:36 localhost podman[63752]: 2025-11-28 08:15:36.852916872 +0000 UTC m=+0.087855315 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible) Nov 28 03:15:36 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:16:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:16:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:16:05 localhost podman[63849]: 2025-11-28 08:16:05.829801189 +0000 UTC m=+0.073801366 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, 
batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public, release=1761123044, vendor=Red Hat, Inc.) Nov 28 03:16:05 localhost systemd[1]: tmp-crun.VJV5i6.mount: Deactivated successfully. Nov 28 03:16:05 localhost podman[63850]: 2025-11-28 08:16:05.864390036 +0000 UTC m=+0.099409994 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, release=1761123044, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team) Nov 28 03:16:05 localhost podman[63849]: 2025-11-28 08:16:05.869364051 +0000 UTC m=+0.113364188 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, url=https://www.redhat.com, 
build-date=2025-11-18T22:51:28Z, release=1761123044, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, summary=Red Hat OpenStack Platform 
17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:16:05 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:16:06 localhost podman[63850]: 2025-11-28 08:16:06.057618269 +0000 UTC m=+0.292638197 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
config_id=tripleo_step1, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc.) Nov 28 03:16:06 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:16:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. 
Nov 28 03:16:07 localhost podman[63897]: 2025-11-28 08:16:07.845664226 +0000 UTC m=+0.084280823 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, version=17.1.12, architecture=x86_64, config_id=tripleo_step3, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:16:07 localhost podman[63897]: 2025-11-28 08:16:07.857485545 +0000 UTC m=+0.096102142 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, container_name=iscsid, release=1761123044, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public) Nov 28 03:16:07 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:16:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:16:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:16:36 localhost podman[63918]: 2025-11-28 08:16:36.8410722 +0000 UTC m=+0.075998535 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, 
com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, version=17.1.12, vcs-type=git, url=https://www.redhat.com) Nov 28 03:16:36 localhost podman[63917]: 2025-11-28 08:16:36.896985521 +0000 UTC m=+0.132928449 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-collectd, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:16:36 localhost podman[63917]: 2025-11-28 08:16:36.90565693 +0000 UTC m=+0.141599888 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-type=git, container_name=collectd, 
architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, 
url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container) Nov 28 03:16:36 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:16:37 localhost podman[63918]: 2025-11-28 08:16:37.032369794 +0000 UTC m=+0.267296129 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:16:37 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:16:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. 
Nov 28 03:16:38 localhost podman[63966]: 2025-11-28 08:16:38.824987433 +0000 UTC m=+0.066447699 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_id=tripleo_step3) Nov 28 03:16:38 localhost podman[63966]: 2025-11-28 08:16:38.862377687 +0000 UTC m=+0.103837943 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:16:38 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:17:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:17:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:17:07 localhost systemd[1]: tmp-crun.AjsOuO.mount: Deactivated successfully. 
Nov 28 03:17:07 localhost podman[64062]: 2025-11-28 08:17:07.857637266 +0000 UTC m=+0.095167523 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=collectd, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Nov 28 03:17:07 localhost podman[64062]: 2025-11-28 08:17:07.866853253 +0000 UTC m=+0.104383460 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public) Nov 28 03:17:07 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. 
Nov 28 03:17:07 localhost podman[64063]: 2025-11-28 08:17:07.958122674 +0000 UTC m=+0.194538616 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, 
maintainer=OpenStack TripleO Team, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, io.openshift.expose-services=) Nov 28 03:17:08 localhost podman[64063]: 2025-11-28 08:17:08.184443456 +0000 UTC m=+0.420859388 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.4, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public) Nov 28 03:17:08 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:17:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. 
Nov 28 03:17:09 localhost podman[64111]: 2025-11-28 08:17:09.851404256 +0000 UTC m=+0.081061574 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, 
maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, version=17.1.12, distribution-scope=public, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:17:09 localhost podman[64111]: 2025-11-28 08:17:09.8643839 +0000 UTC m=+0.094041198 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, distribution-scope=public, tcib_managed=true, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': 
True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4) Nov 28 03:17:09 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:17:28 localhost sshd[64130]: main: sshd: ssh-rsa algorithm is disabled Nov 28 03:17:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:17:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:17:38 localhost podman[64132]: 2025-11-28 08:17:38.8297433 +0000 UTC m=+0.065817659 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, container_name=collectd, maintainer=OpenStack TripleO Team, 
batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, release=1761123044, version=17.1.12) Nov 28 03:17:38 localhost podman[64132]: 2025-11-28 08:17:38.842339821 +0000 UTC m=+0.078414170 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, release=1761123044, io.openshift.expose-services=, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 
'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc.) Nov 28 03:17:38 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. 
Nov 28 03:17:38 localhost podman[64133]: 2025-11-28 08:17:38.896148345 +0000 UTC m=+0.128919632 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, vcs-type=git, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 28 03:17:39 localhost podman[64133]: 2025-11-28 08:17:39.114310345 +0000 UTC m=+0.347081602 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step1, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, vcs-type=git, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com) Nov 28 03:17:39 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:17:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. 
Nov 28 03:17:40 localhost podman[64179]: 2025-11-28 08:17:40.849631582 +0000 UTC m=+0.087791934 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, summary=Red Hat OpenStack 
Platform 17.1 iscsid, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3) Nov 28 03:17:40 localhost podman[64179]: 2025-11-28 08:17:40.859364665 +0000 UTC m=+0.097525007 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, tcib_managed=true, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 28 03:17:40 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:18:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:18:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:18:09 localhost podman[64325]: 2025-11-28 08:18:09.839867725 +0000 UTC m=+0.077106810 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, architecture=x86_64, config_id=tripleo_step1, url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 
17.1 qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4) Nov 28 03:18:09 localhost systemd[1]: tmp-crun.USzITd.mount: Deactivated successfully. Nov 28 03:18:09 localhost podman[64324]: 2025-11-28 08:18:09.873722758 +0000 UTC m=+0.109864590 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=collectd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public) Nov 28 03:18:09 localhost podman[64324]: 2025-11-28 08:18:09.914647022 +0000 UTC m=+0.150788904 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, container_name=collectd, version=17.1.12, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vcs-type=git) Nov 28 03:18:09 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:18:10 localhost podman[64325]: 2025-11-28 08:18:10.039446606 +0000 UTC m=+0.276685671 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:18:10 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:18:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:18:11 localhost systemd[1]: tmp-crun.1VIiWK.mount: Deactivated successfully. 
Nov 28 03:18:11 localhost podman[64374]: 2025-11-28 08:18:11.823724416 +0000 UTC m=+0.062878118 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, architecture=x86_64, managed_by=tripleo_ansible, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, vcs-type=git, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044) Nov 28 03:18:11 localhost podman[64374]: 2025-11-28 08:18:11.858129067 +0000 UTC m=+0.097282839 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Nov 28 03:18:11 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 03:18:28 localhost python3[64441]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:18:28 localhost python3[64486]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317908.1401827-108167-57177670405838/source _original_basename=tmpq7dykwvm follow=False checksum=ee48fb03297eb703b1954c8852d0f67fab51dac1 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:18:30 localhost python3[64548]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/recover_tripleo_nova_virtqemud.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:18:30 localhost python3[64591]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/recover_tripleo_nova_virtqemud.sh mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317909.744724-108363-151236680935108/source _original_basename=tmpymj_pfvr follow=False checksum=922b8aa8342176110bffc2e39abdccc2b39e53a9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:18:31 localhost python3[64653]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:18:31 localhost python3[64696]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.service mode=0644 
src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317910.7036264-108421-207814106630004/source _original_basename=tmp3_ou5n1o follow=False checksum=92f73544b703afc85885fa63ab07bdf8f8671554 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:18:32 localhost python3[64758]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:18:32 localhost python3[64801]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317911.7271552-108483-211511023349089/source _original_basename=tmpa1i1_8_v follow=False checksum=c6e5f76a53c0d6ccaf46c4b48d813dc2891ad8e9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:18:33 localhost python3[64831]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.service daemon_reexec=False scope=system no_block=False state=None force=None masked=None Nov 28 03:18:33 localhost systemd[1]: Reloading. Nov 28 03:18:33 localhost systemd-sysv-generator[64861]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:18:33 localhost systemd-rc-local-generator[64858]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 28 03:18:33 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:18:33 localhost systemd[1]: Reloading. Nov 28 03:18:33 localhost systemd-sysv-generator[64898]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:18:33 localhost systemd-rc-local-generator[64894]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:18:33 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:18:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 28 03:18:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 4541 writes, 20K keys, 4541 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4541 writes, 459 syncs, 9.89 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 297 writes, 643 keys, 297 commit groups, 1.0 writes per commit group, ingest: 0.53 MB, 0.00 MB/s#012Interval WAL: 297 writes, 148 syncs, 2.01 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 28 03:18:34 localhost python3[64920]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.timer state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 03:18:34 localhost systemd[1]: Reloading. 
Nov 28 03:18:34 localhost systemd-rc-local-generator[64943]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:18:34 localhost systemd-sysv-generator[64946]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:18:34 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:18:35 localhost systemd[1]: Reloading. Nov 28 03:18:35 localhost systemd-sysv-generator[64987]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:18:35 localhost systemd-rc-local-generator[64984]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:18:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:18:35 localhost systemd[1]: Started Check and recover tripleo_nova_virtqemud every 10m. Nov 28 03:18:35 localhost python3[65010]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl enable --now tripleo_nova_virtqemud_recover.timer _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 03:18:35 localhost systemd[1]: Reloading. Nov 28 03:18:35 localhost systemd-rc-local-generator[65035]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:18:35 localhost systemd-sysv-generator[65038]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:18:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:18:36 localhost python3[65094]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:18:36 localhost python3[65137]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_libvirt.target group=root mode=0644 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317916.1270037-108624-93954046964457/source _original_basename=tmp4murbgv2 follow=False checksum=c064b4a8e7d3d1d7c62d1f80a09e350659996afd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:18:37 localhost python3[65167]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 03:18:37 localhost systemd[1]: Reloading. Nov 28 03:18:37 localhost systemd-rc-local-generator[65192]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:18:37 localhost systemd-sysv-generator[65197]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:18:37 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 28 03:18:37 localhost systemd[1]: Reached target tripleo_nova_libvirt.target. Nov 28 03:18:38 localhost python3[65222]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 03:18:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 28 03:18:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.3 total, 600.0 interval#012Cumulative writes: 5030 writes, 22K keys, 5030 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 5030 writes, 563 syncs, 8.93 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 293 writes, 666 keys, 293 commit groups, 1.0 writes per commit group, ingest: 0.52 MB, 0.00 MB/s#012Interval WAL: 293 writes, 146 syncs, 2.01 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 28 03:18:39 localhost ansible-async_wrapper.py[65394]: Invoked with 771929046149 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317919.2468598-108755-100979962503665/AnsiballZ_command.py _ Nov 28 03:18:39 localhost ansible-async_wrapper.py[65397]: Starting module and watcher Nov 28 03:18:39 localhost ansible-async_wrapper.py[65397]: Start watching 65398 (3600) Nov 28 03:18:39 localhost ansible-async_wrapper.py[65398]: Start module (65398) Nov 28 03:18:39 localhost ansible-async_wrapper.py[65394]: Return async_wrapper task started. Nov 28 03:18:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:18:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:18:40 localhost podman[65418]: 2025-11-28 08:18:40.10168311 +0000 UTC m=+0.111343666 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, config_id=tripleo_step3, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, container_name=collectd, io.openshift.expose-services=, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.buildah.version=1.41.4) Nov 28 03:18:40 localhost podman[65418]: 2025-11-28 08:18:40.113416345 +0000 UTC m=+0.123076951 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_id=tripleo_step3, io.openshift.expose-services=, version=17.1.12, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd) Nov 28 03:18:40 localhost python3[65419]: ansible-ansible.legacy.async_status Invoked with jid=771929046149.65394 mode=status _async_dir=/tmp/.ansible_async Nov 28 03:18:40 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. 
Nov 28 03:18:40 localhost podman[65438]: 2025-11-28 08:18:40.194611722 +0000 UTC m=+0.083380046 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step1, description=Red Hat OpenStack Platform 
17.1 qdrouterd, distribution-scope=public, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:18:40 localhost podman[65438]: 2025-11-28 08:18:40.386336059 +0000 UTC m=+0.275104383 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, vcs-type=git, build-date=2025-11-18T22:49:46Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com) Nov 28 03:18:40 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:18:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. 
Nov 28 03:18:42 localhost podman[65521]: 2025-11-28 08:18:42.462359489 +0000 UTC m=+0.078136163 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:18:42 localhost podman[65521]: 2025-11-28 08:18:42.47654839 +0000 UTC m=+0.092325134 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, config_id=tripleo_step3, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vendor=Red Hat, Inc., container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Nov 28 03:18:42 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:18:43 localhost puppet-user[65404]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Nov 28 03:18:43 localhost puppet-user[65404]: (file: /etc/puppet/hiera.yaml) Nov 28 03:18:43 localhost puppet-user[65404]: Warning: Undefined variable '::deploy_config_name'; Nov 28 03:18:43 localhost puppet-user[65404]: (file & line not available) Nov 28 03:18:43 localhost puppet-user[65404]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. 
See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 28 03:18:43 localhost puppet-user[65404]: (file & line not available) Nov 28 03:18:43 localhost puppet-user[65404]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Nov 28 03:18:43 localhost puppet-user[65404]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 28 03:18:43 localhost puppet-user[65404]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 28 03:18:43 localhost puppet-user[65404]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Nov 28 03:18:43 localhost puppet-user[65404]: with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 28 03:18:43 localhost puppet-user[65404]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 28 03:18:43 localhost puppet-user[65404]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Nov 28 03:18:43 localhost puppet-user[65404]: with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. 
at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 28 03:18:43 localhost puppet-user[65404]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 28 03:18:43 localhost puppet-user[65404]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Nov 28 03:18:43 localhost puppet-user[65404]: with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 28 03:18:43 localhost puppet-user[65404]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 28 03:18:43 localhost puppet-user[65404]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Nov 28 03:18:43 localhost puppet-user[65404]: with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 28 03:18:43 localhost puppet-user[65404]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 28 03:18:43 localhost puppet-user[65404]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Nov 28 03:18:43 localhost puppet-user[65404]: with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 28 03:18:43 localhost puppet-user[65404]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 28 03:18:43 localhost puppet-user[65404]: Warning: Unknown variable: '::deployment_type'. 
(file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Nov 28 03:18:43 localhost puppet-user[65404]: Notice: Compiled catalog for np0005538513.localdomain in environment production in 0.20 seconds Nov 28 03:18:44 localhost ansible-async_wrapper.py[65397]: 65398 still running (3600) Nov 28 03:18:49 localhost ansible-async_wrapper.py[65397]: 65398 still running (3595) Nov 28 03:18:50 localhost python3[65689]: ansible-ansible.legacy.async_status Invoked with jid=771929046149.65394 mode=status _async_dir=/tmp/.ansible_async Nov 28 03:18:51 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Nov 28 03:18:51 localhost systemd[1]: Starting man-db-cache-update.service... Nov 28 03:18:51 localhost systemd[1]: Reloading. Nov 28 03:18:51 localhost systemd-sysv-generator[65837]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:18:51 localhost systemd-rc-local-generator[65833]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:18:51 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:18:51 localhost systemd[1]: Queuing reload/restart jobs for marked units… Nov 28 03:18:52 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Nov 28 03:18:52 localhost systemd[1]: Finished man-db-cache-update.service. Nov 28 03:18:52 localhost systemd[1]: man-db-cache-update.service: Consumed 1.155s CPU time. Nov 28 03:18:52 localhost systemd[1]: run-rc7fa4f6dd09f43ee9e8d0ddd10c187d0.service: Deactivated successfully. 
Nov 28 03:18:52 localhost puppet-user[65404]: Notice: /Stage[main]/Snmp/Package[snmpd]/ensure: created Nov 28 03:18:52 localhost puppet-user[65404]: Notice: /Stage[main]/Snmp/File[snmpd.conf]/content: content changed '{sha256}2b743f970e80e2150759bfc66f2d8d0fbd8b31624f79e2991248d1a5ac57494e' to '{sha256}2ccb0e8433cdc7a879c38994ca54c4dea57b8ee81d75485f6ddcf513209d34ed' Nov 28 03:18:52 localhost puppet-user[65404]: Notice: /Stage[main]/Snmp/File[snmpd.sysconfig]/content: content changed '{sha256}b63afb2dee7419b6834471f88581d981c8ae5c8b27b9d329ba67a02f3ddd8221' to '{sha256}3917ee8bbc680ad50d77186ad4a1d2705c2025c32fc32f823abbda7f2328dfbd' Nov 28 03:18:52 localhost puppet-user[65404]: Notice: /Stage[main]/Snmp/File[snmptrapd.conf]/content: content changed '{sha256}2e1ca894d609ef337b6243909bf5623c87fd5df98ecbd00c7d4c12cf12f03c4e' to '{sha256}3ecf18da1ba84ea3932607f2b903ee6a038b6f9ac4e1e371e48f3ef61c5052ea' Nov 28 03:18:52 localhost puppet-user[65404]: Notice: /Stage[main]/Snmp/File[snmptrapd.sysconfig]/content: content changed '{sha256}86ee5797ad10cb1ea0f631e9dfa6ae278ecf4f4d16f4c80f831cdde45601b23c' to '{sha256}2244553364afcca151958f8e2003e4c182f5e2ecfbe55405cec73fd818581e97' Nov 28 03:18:52 localhost puppet-user[65404]: Notice: /Stage[main]/Snmp/Service[snmptrapd]: Triggered 'refresh' from 2 events Nov 28 03:18:54 localhost ansible-async_wrapper.py[65397]: 65398 still running (3590) Nov 28 03:18:57 localhost puppet-user[65404]: Notice: /Stage[main]/Tripleo::Profile::Base::Snmp/Snmp::Snmpv3_user[ro_snmp_user]/Exec[create-snmpv3-user-ro_snmp_user]/returns: executed successfully Nov 28 03:18:58 localhost systemd[1]: Reloading. Nov 28 03:18:58 localhost systemd-sysv-generator[66824]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 28 03:18:58 localhost systemd-rc-local-generator[66821]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 03:18:58 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 03:18:58 localhost systemd[1]: Starting Simple Network Management Protocol (SNMP) Daemon....
Nov 28 03:18:58 localhost snmpd[66832]: Can't find directory of RPM packages
Nov 28 03:18:58 localhost snmpd[66832]: Duplicate IPv4 address detected, some interfaces may not be visible in IP-MIB
Nov 28 03:18:58 localhost systemd[1]: Started Simple Network Management Protocol (SNMP) Daemon..
Nov 28 03:18:58 localhost systemd[1]: Reloading.
Nov 28 03:18:58 localhost systemd-rc-local-generator[66859]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 03:18:58 localhost systemd-sysv-generator[66863]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 03:18:58 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 03:18:58 localhost systemd[1]: Reloading.
Nov 28 03:18:58 localhost systemd-rc-local-generator[66891]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 03:18:58 localhost systemd-sysv-generator[66896]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 03:18:58 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 03:18:59 localhost puppet-user[65404]: Notice: /Stage[main]/Snmp/Service[snmpd]/ensure: ensure changed 'stopped' to 'running'
Nov 28 03:18:59 localhost puppet-user[65404]: Notice: Applied catalog in 15.38 seconds
Nov 28 03:18:59 localhost puppet-user[65404]: Application:
Nov 28 03:18:59 localhost puppet-user[65404]: Initial environment: production
Nov 28 03:18:59 localhost puppet-user[65404]: Converged environment: production
Nov 28 03:18:59 localhost puppet-user[65404]: Run mode: user
Nov 28 03:18:59 localhost puppet-user[65404]: Changes:
Nov 28 03:18:59 localhost puppet-user[65404]: Total: 8
Nov 28 03:18:59 localhost puppet-user[65404]: Events:
Nov 28 03:18:59 localhost puppet-user[65404]: Success: 8
Nov 28 03:18:59 localhost puppet-user[65404]: Total: 8
Nov 28 03:18:59 localhost puppet-user[65404]: Resources:
Nov 28 03:18:59 localhost puppet-user[65404]: Restarted: 1
Nov 28 03:18:59 localhost puppet-user[65404]: Changed: 8
Nov 28 03:18:59 localhost puppet-user[65404]: Out of sync: 8
Nov 28 03:18:59 localhost puppet-user[65404]: Total: 19
Nov 28 03:18:59 localhost puppet-user[65404]: Time:
Nov 28 03:18:59 localhost puppet-user[65404]: Filebucket: 0.00
Nov 28 03:18:59 localhost puppet-user[65404]: Schedule: 0.00
Nov 28 03:18:59 localhost puppet-user[65404]: Augeas: 0.01
Nov 28 03:18:59 localhost puppet-user[65404]: File: 0.08
Nov 28 03:18:59 localhost puppet-user[65404]: Config retrieval: 0.26
Nov 28 03:18:59 localhost puppet-user[65404]: Service: 1.22
Nov 28 03:18:59 localhost puppet-user[65404]: Transaction evaluation: 15.37
Nov 28 03:18:59 localhost puppet-user[65404]: Catalog application: 15.38
Nov 28 03:18:59 localhost puppet-user[65404]: Last run: 1764317939
Nov 28 03:18:59 localhost puppet-user[65404]: Exec: 5.07
Nov 28 03:18:59 localhost puppet-user[65404]: Package: 8.82
Nov 28 03:18:59 localhost puppet-user[65404]: Total: 15.38
Nov 28 03:18:59 localhost puppet-user[65404]: Version:
Nov 28 03:18:59 localhost puppet-user[65404]: Config: 1764317923
Nov 28 03:18:59 localhost puppet-user[65404]: Puppet: 7.10.0
Nov 28 03:18:59 localhost ansible-async_wrapper.py[65398]: Module complete (65398)
Nov 28 03:18:59 localhost ansible-async_wrapper.py[65397]: Done in kid B.
Nov 28 03:19:00 localhost python3[66920]: ansible-ansible.legacy.async_status Invoked with jid=771929046149.65394 mode=status _async_dir=/tmp/.ansible_async
Nov 28 03:19:01 localhost python3[66936]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 03:19:01 localhost python3[66952]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 03:19:02 localhost python3[67002]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 03:19:02 localhost python3[67020]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpa0718uv6 recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 03:19:02 localhost python3[67067]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 03:19:04 localhost python3[67253]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Nov 28 03:19:04 localhost python3[67341]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 03:19:04 localhost podman[67356]:
Nov 28 03:19:04 localhost podman[67356]: 2025-11-28 08:19:04.889490985 +0000 UTC m=+0.080224158 container create b29cd3eb307babfe96d1ed2b77c0a52a232ce75b4a7b93eba93d24e201112e9f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_blackwell, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_BRANCH=main, io.openshift.expose-services=, maintainer=Guillaume Abrioux , GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, ceph=True, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, build-date=2025-09-24T08:57:55)
Nov 28 03:19:04 localhost systemd[1]: Started libpod-conmon-b29cd3eb307babfe96d1ed2b77c0a52a232ce75b4a7b93eba93d24e201112e9f.scope.
Nov 28 03:19:04 localhost systemd[1]: Started libcrun container.
Nov 28 03:19:04 localhost podman[67356]: 2025-11-28 08:19:04.855490787 +0000 UTC m=+0.046224010 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 03:19:04 localhost podman[67356]: 2025-11-28 08:19:04.964378306 +0000 UTC m=+0.155111489 container init b29cd3eb307babfe96d1ed2b77c0a52a232ce75b4a7b93eba93d24e201112e9f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_blackwell, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vcs-type=git, vendor=Red Hat, Inc., release=553, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, ceph=True, build-date=2025-09-24T08:57:55, distribution-scope=public, io.buildah.version=1.33.12, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, maintainer=Guillaume Abrioux , io.openshift.expose-services=)
Nov 28 03:19:04 localhost systemd[1]: tmp-crun.fWxGSB.mount: Deactivated successfully.
Nov 28 03:19:04 localhost podman[67356]: 2025-11-28 08:19:04.977787802 +0000 UTC m=+0.168520975 container start b29cd3eb307babfe96d1ed2b77c0a52a232ce75b4a7b93eba93d24e201112e9f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_blackwell, io.openshift.expose-services=, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, name=rhceph, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, GIT_CLEAN=True, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, CEPH_POINT_RELEASE=, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, distribution-scope=public)
Nov 28 03:19:04 localhost podman[67356]: 2025-11-28 08:19:04.978206476 +0000 UTC m=+0.168939659 container attach b29cd3eb307babfe96d1ed2b77c0a52a232ce75b4a7b93eba93d24e201112e9f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_blackwell, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, name=rhceph, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, release=553, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, CEPH_POINT_RELEASE=)
Nov 28 03:19:04 localhost sleepy_blackwell[67372]: 167 167
Nov 28 03:19:04 localhost systemd[1]: libpod-b29cd3eb307babfe96d1ed2b77c0a52a232ce75b4a7b93eba93d24e201112e9f.scope: Deactivated successfully.
Nov 28 03:19:04 localhost podman[67356]: 2025-11-28 08:19:04.982837979 +0000 UTC m=+0.173571162 container died b29cd3eb307babfe96d1ed2b77c0a52a232ce75b4a7b93eba93d24e201112e9f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_blackwell, distribution-scope=public, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , ceph=True, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, architecture=x86_64, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, RELEASE=main, GIT_BRANCH=main, name=rhceph, version=7, vcs-type=git, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc.)
Nov 28 03:19:05 localhost podman[67377]: 2025-11-28 08:19:05.083212124 +0000 UTC m=+0.088480196 container remove b29cd3eb307babfe96d1ed2b77c0a52a232ce75b4a7b93eba93d24e201112e9f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_blackwell, name=rhceph, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_CLEAN=True, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, architecture=x86_64, io.buildah.version=1.33.12, vcs-type=git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph)
Nov 28 03:19:05 localhost systemd[1]: libpod-conmon-b29cd3eb307babfe96d1ed2b77c0a52a232ce75b4a7b93eba93d24e201112e9f.scope: Deactivated successfully.
Nov 28 03:19:05 localhost podman[67416]:
Nov 28 03:19:05 localhost podman[67416]: 2025-11-28 08:19:05.300768345 +0000 UTC m=+0.083646815 container create a0cab37cd83a6c205cdd1bb3e282524e6395b285e88bbff75b00001bf72f6313 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_mclaren, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, version=7, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, RELEASE=main, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, vcs-type=git, io.buildah.version=1.33.12, ceph=True, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 03:19:05 localhost systemd[1]: Started libpod-conmon-a0cab37cd83a6c205cdd1bb3e282524e6395b285e88bbff75b00001bf72f6313.scope.
Nov 28 03:19:05 localhost systemd[1]: Started libcrun container.
Nov 28 03:19:05 localhost podman[67416]: 2025-11-28 08:19:05.264283389 +0000 UTC m=+0.047161929 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 03:19:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11cf5150f1f463467335df2b9dbb597480c9cf3d5415c339e42bc32ca62bee13/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 03:19:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11cf5150f1f463467335df2b9dbb597480c9cf3d5415c339e42bc32ca62bee13/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 03:19:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11cf5150f1f463467335df2b9dbb597480c9cf3d5415c339e42bc32ca62bee13/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 03:19:05 localhost podman[67416]: 2025-11-28 08:19:05.371668481 +0000 UTC m=+0.154546951 container init a0cab37cd83a6c205cdd1bb3e282524e6395b285e88bbff75b00001bf72f6313 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_mclaren, io.openshift.expose-services=, ceph=True, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_BRANCH=main, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, architecture=x86_64, io.buildah.version=1.33.12, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 03:19:05 localhost podman[67416]: 2025-11-28 08:19:05.383446397 +0000 UTC m=+0.166324877 container start a0cab37cd83a6c205cdd1bb3e282524e6395b285e88bbff75b00001bf72f6313 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_mclaren, RELEASE=main, build-date=2025-09-24T08:57:55, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, GIT_CLEAN=True, architecture=x86_64, ceph=True, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=)
Nov 28 03:19:05 localhost podman[67416]: 2025-11-28 08:19:05.383767327 +0000 UTC m=+0.166645817 container attach a0cab37cd83a6c205cdd1bb3e282524e6395b285e88bbff75b00001bf72f6313 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_mclaren, io.buildah.version=1.33.12, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, maintainer=Guillaume Abrioux , ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, version=7, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, distribution-scope=public, com.redhat.component=rhceph-container, RELEASE=main, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7)
Nov 28 03:19:05 localhost python3[67453]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 03:19:05 localhost systemd[1]: var-lib-containers-storage-overlay-f85fc9d759a4c1c69629c73a27514be5577ab0598fee5b9a7fba520b058de24e-merged.mount: Deactivated successfully.
Nov 28 03:19:06 localhost exciting_mclaren[67431]: [
Nov 28 03:19:06 localhost exciting_mclaren[67431]: {
Nov 28 03:19:06 localhost exciting_mclaren[67431]: "available": false,
Nov 28 03:19:06 localhost exciting_mclaren[67431]: "ceph_device": false,
Nov 28 03:19:06 localhost exciting_mclaren[67431]: "device_id": "QEMU_DVD-ROM_QM00001",
Nov 28 03:19:06 localhost exciting_mclaren[67431]: "lsm_data": {},
Nov 28 03:19:06 localhost exciting_mclaren[67431]: "lvs": [],
Nov 28 03:19:06 localhost exciting_mclaren[67431]: "path": "/dev/sr0",
Nov 28 03:19:06 localhost exciting_mclaren[67431]: "rejected_reasons": [
Nov 28 03:19:06 localhost exciting_mclaren[67431]: "Has a FileSystem",
Nov 28 03:19:06 localhost exciting_mclaren[67431]: "Insufficient space (<5GB)"
Nov 28 03:19:06 localhost exciting_mclaren[67431]: ],
Nov 28 03:19:06 localhost exciting_mclaren[67431]: "sys_api": {
Nov 28 03:19:06 localhost exciting_mclaren[67431]: "actuators": null,
Nov 28 03:19:06 localhost exciting_mclaren[67431]: "device_nodes": "sr0",
Nov 28 03:19:06 localhost exciting_mclaren[67431]: "human_readable_size": "482.00 KB",
Nov 28 03:19:06 localhost exciting_mclaren[67431]: "id_bus": "ata",
Nov 28 03:19:06 localhost exciting_mclaren[67431]: "model": "QEMU DVD-ROM",
Nov 28 03:19:06 localhost exciting_mclaren[67431]: "nr_requests": "2",
Nov 28 03:19:06 localhost exciting_mclaren[67431]: "partitions": {},
Nov 28 03:19:06 localhost exciting_mclaren[67431]: "path": "/dev/sr0",
Nov 28 03:19:06 localhost exciting_mclaren[67431]: "removable": "1",
Nov 28 03:19:06 localhost exciting_mclaren[67431]: "rev": "2.5+",
Nov 28 03:19:06 localhost exciting_mclaren[67431]: "ro": "0",
Nov 28 03:19:06 localhost exciting_mclaren[67431]: "rotational": "1",
Nov 28 03:19:06 localhost exciting_mclaren[67431]: "sas_address": "",
Nov 28 03:19:06 localhost exciting_mclaren[67431]: "sas_device_handle": "",
Nov 28 03:19:06 localhost exciting_mclaren[67431]: "scheduler_mode": "mq-deadline",
Nov 28 03:19:06 localhost exciting_mclaren[67431]: "sectors": 0,
Nov 28 03:19:06 localhost exciting_mclaren[67431]: "sectorsize": "2048",
Nov 28 03:19:06 localhost exciting_mclaren[67431]: "size": 493568.0,
Nov 28 03:19:06 localhost exciting_mclaren[67431]: "support_discard": "0",
Nov 28 03:19:06 localhost exciting_mclaren[67431]: "type": "disk",
Nov 28 03:19:06 localhost exciting_mclaren[67431]: "vendor": "QEMU"
Nov 28 03:19:06 localhost exciting_mclaren[67431]: }
Nov 28 03:19:06 localhost exciting_mclaren[67431]: }
Nov 28 03:19:06 localhost exciting_mclaren[67431]: ]
Nov 28 03:19:06 localhost python3[68766]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 03:19:06 localhost systemd[1]: libpod-a0cab37cd83a6c205cdd1bb3e282524e6395b285e88bbff75b00001bf72f6313.scope: Deactivated successfully.
Nov 28 03:19:06 localhost systemd[1]: libpod-a0cab37cd83a6c205cdd1bb3e282524e6395b285e88bbff75b00001bf72f6313.scope: Consumed 1.017s CPU time.
Nov 28 03:19:06 localhost podman[67416]: 2025-11-28 08:19:06.383748389 +0000 UTC m=+1.166626899 container died a0cab37cd83a6c205cdd1bb3e282524e6395b285e88bbff75b00001bf72f6313 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_mclaren, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, release=553, version=7, io.openshift.expose-services=, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_CLEAN=True, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, ceph=True, vcs-type=git, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 03:19:06 localhost systemd[1]: var-lib-containers-storage-overlay-11cf5150f1f463467335df2b9dbb597480c9cf3d5415c339e42bc32ca62bee13-merged.mount: Deactivated successfully.
Nov 28 03:19:06 localhost podman[69067]: 2025-11-28 08:19:06.48723778 +0000 UTC m=+0.089599580 container remove a0cab37cd83a6c205cdd1bb3e282524e6395b285e88bbff75b00001bf72f6313 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_mclaren, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, maintainer=Guillaume Abrioux , io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553)
Nov 28 03:19:06 localhost systemd[1]: libpod-conmon-a0cab37cd83a6c205cdd1bb3e282524e6395b285e88bbff75b00001bf72f6313.scope: Deactivated successfully.
Nov 28 03:19:06 localhost python3[69099]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 03:19:07 localhost python3[69161]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 03:19:07 localhost python3[69194]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 03:19:08 localhost python3[69256]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 03:19:08 localhost python3[69274]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 03:19:08 localhost python3[69336]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 03:19:09 localhost python3[69354]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 03:19:09 localhost python3[69384]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 03:19:09 localhost systemd[1]: Reloading.
Nov 28 03:19:09 localhost systemd-sysv-generator[69409]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 03:19:09 localhost systemd-rc-local-generator[69405]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 03:19:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 03:19:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 03:19:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 03:19:10 localhost systemd[1]: tmp-crun.z4ehy0.mount: Deactivated successfully.
Nov 28 03:19:10 localhost podman[69469]: 2025-11-28 08:19:10.60714836 +0000 UTC m=+0.114402122 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, container_name=collectd, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 03:19:10 localhost python3[69470]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 03:19:10 localhost podman[69469]: 2025-11-28 08:19:10.642147558 +0000 UTC m=+0.149401320 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, release=1761123044, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, tcib_managed=true, name=rhosp17/openstack-collectd, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team)
Nov 28 03:19:10 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 03:19:10 localhost podman[69471]: 2025-11-28 08:19:10.689219344 +0000 UTC m=+0.196229819 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, tcib_managed=true, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro',
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com) Nov 28 03:19:10 localhost podman[69471]: 2025-11-28 08:19:10.885489472 +0000 UTC m=+0.392499947 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, config_id=tripleo_step1, version=17.1.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com) Nov 28 03:19:10 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. 
Nov 28 03:19:10 localhost python3[69537]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:19:11 localhost python3[69599]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:19:11 localhost systemd[1]: tmp-crun.Ep8ky5.mount: Deactivated successfully. Nov 28 03:19:11 localhost python3[69617]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:19:12 localhost python3[69647]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 03:19:12 localhost systemd[1]: Reloading. Nov 28 03:19:12 localhost systemd-rc-local-generator[69673]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:19:12 localhost systemd-sysv-generator[69677]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:19:12 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:19:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:19:12 localhost systemd[1]: Starting Create netns directory... Nov 28 03:19:12 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Nov 28 03:19:12 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Nov 28 03:19:12 localhost systemd[1]: Finished Create netns directory. Nov 28 03:19:12 localhost systemd[1]: tmp-crun.WwZGJa.mount: Deactivated successfully. Nov 28 03:19:12 localhost podman[69685]: 2025-11-28 08:19:12.61572202 +0000 UTC m=+0.091086156 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, version=17.1.12, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-iscsid, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Nov 28 03:19:12 localhost podman[69685]: 2025-11-28 08:19:12.627208078 +0000 UTC m=+0.102572214 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, name=rhosp17/openstack-iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, version=17.1.12, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1) Nov 28 03:19:12 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:19:13 localhost python3[69722]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Nov 28 03:19:15 localhost python3[69781]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step4 config_dir=/var/lib/tripleo-config/container-startup-config/step_4 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Nov 28 03:19:15 localhost podman[69952]: 2025-11-28 08:19:15.472570981 +0000 UTC m=+0.061953839 container create b3acc362da558e764f3cb1c197ade9eb852868cb570ef88f532cf93f61b44add (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, build-date=2025-11-18T23:34:05Z, release=1761123044, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=configure_cms_options, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-type=git, konflux.additional-tags=17.1.12 
17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step4, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:19:15 localhost podman[69953]: 2025-11-28 08:19:15.499826569 +0000 UTC m=+0.083719416 container create f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, version=17.1.12, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vcs-type=git, io.openshift.expose-services=) Nov 28 03:19:15 localhost 
podman[69972]: 2025-11-28 08:19:15.510363667 +0000 UTC m=+0.079212086 container create 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 28 03:19:15 localhost systemd[1]: Started libpod-conmon-b3acc362da558e764f3cb1c197ade9eb852868cb570ef88f532cf93f61b44add.scope. Nov 28 03:19:15 localhost podman[69973]: 2025-11-28 08:19:15.534309402 +0000 UTC m=+0.099989553 container create bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T22:49:32Z, version=17.1.12, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com) Nov 28 03:19:15 localhost podman[69952]: 2025-11-28 08:19:15.436910051 +0000 UTC m=+0.026292919 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Nov 28 03:19:15 localhost systemd[1]: Started libpod-conmon-4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.scope. Nov 28 03:19:15 localhost systemd[1]: Started libpod-conmon-f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.scope. Nov 28 03:19:15 localhost systemd[1]: Started libcrun container. Nov 28 03:19:15 localhost systemd[1]: Started libcrun container. 
Nov 28 03:19:15 localhost podman[69953]: 2025-11-28 08:19:15.448990137 +0000 UTC m=+0.032882994 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 Nov 28 03:19:15 localhost systemd[1]: Started libcrun container. Nov 28 03:19:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b9124600c137d24812fa12ae9a3723386ce2301b24a72f8d26d11f6206571e1d/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff) Nov 28 03:19:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f3450f8dbd8f52854977f74bd961373e3aeac1471ae57db291ae89b64fa40dd/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff) Nov 28 03:19:15 localhost podman[69972]: 2025-11-28 08:19:15.463636823 +0000 UTC m=+0.032485242 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 Nov 28 03:19:15 localhost podman[69989]: 2025-11-28 08:19:15.564182942 +0000 UTC m=+0.115461945 container create 4ef9a876eb51c0fb7805cce2278a0020c7131c4e4156ae0a51343d63dc9eb2b1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, build-date=2025-11-19T00:35:22Z, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, container_name=nova_libvirt_init_secret, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, release=1761123044, name=rhosp17/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:19:15 localhost podman[69989]: 2025-11-28 08:19:15.475182302 +0000 UTC m=+0.026461325 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Nov 28 03:19:15 localhost systemd[1]: Started libpod-conmon-bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.scope. 
Nov 28 03:19:15 localhost systemd[1]: Started libpod-conmon-4ef9a876eb51c0fb7805cce2278a0020c7131c4e4156ae0a51343d63dc9eb2b1.scope. Nov 28 03:19:15 localhost podman[69973]: 2025-11-28 08:19:15.486639189 +0000 UTC m=+0.052319350 image pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Nov 28 03:19:15 localhost systemd[1]: Started libcrun container. Nov 28 03:19:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/926487bff9cf92c9a8f31e53ba63d52e32adb5e89e0ae8702a4e60968ca6f3f1/merged/var/log/containers supports timestamps until 2038 (0x7fffffff) Nov 28 03:19:15 localhost systemd[1]: Started libcrun container. Nov 28 03:19:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:19:15 localhost podman[69953]: 2025-11-28 08:19:15.594248307 +0000 UTC m=+0.178141154 container init f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, release=1761123044, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, version=17.1.12, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team) Nov 28 03:19:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a02f32b7b5e21aed25175ffba2f815ea97f4b11687dc87cafc732563545474b2/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 03:19:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a02f32b7b5e21aed25175ffba2f815ea97f4b11687dc87cafc732563545474b2/merged/etc/nova supports timestamps until 2038 (0x7fffffff) Nov 28 03:19:15 localhost kernel: xfs filesystem being remounted at 
/var/lib/containers/storage/overlay/a02f32b7b5e21aed25175ffba2f815ea97f4b11687dc87cafc732563545474b2/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 03:19:15 localhost podman[69989]: 2025-11-28 08:19:15.605861469 +0000 UTC m=+0.157140482 container init 4ef9a876eb51c0fb7805cce2278a0020c7131c4e4156ae0a51343d63dc9eb2b1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, vcs-type=git, release=1761123044, version=17.1.12, config_id=tripleo_step4, container_name=nova_libvirt_init_secret, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, batch=17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp17/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, build-date=2025-11-19T00:35:22Z) Nov 28 03:19:15 localhost podman[69952]: 2025-11-28 08:19:15.609566055 +0000 UTC m=+0.198948933 container init b3acc362da558e764f3cb1c197ade9eb852868cb570ef88f532cf93f61b44add (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=configure_cms_options, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, 
managed_by=tripleo_ansible, distribution-scope=public, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1) Nov 28 03:19:15 localhost podman[69989]: 2025-11-28 08:19:15.613312901 +0000 UTC m=+0.164591904 container start 4ef9a876eb51c0fb7805cce2278a0020c7131c4e4156ae0a51343d63dc9eb2b1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, container_name=nova_libvirt_init_secret, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, version=17.1.12, build-date=2025-11-19T00:35:22Z, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, 
architecture=x86_64, config_id=tripleo_step4, vcs-type=git, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true) Nov 28 03:19:15 localhost podman[69989]: 2025-11-28 08:19:15.613584819 +0000 UTC m=+0.164863832 container attach 4ef9a876eb51c0fb7805cce2278a0020c7131c4e4156ae0a51343d63dc9eb2b1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, config_id=tripleo_step4, container_name=nova_libvirt_init_secret, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', 
'/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, batch=17.1_20251118.1) Nov 28 03:19:15 localhost podman[69952]: 2025-11-28 08:19:15.619745351 +0000 UTC m=+0.209128219 container start b3acc362da558e764f3cb1c197ade9eb852868cb570ef88f532cf93f61b44add (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, container_name=configure_cms_options, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, vcs-type=git, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Nov 28 03:19:15 localhost podman[69952]: 2025-11-28 08:19:15.621048562 +0000 UTC m=+0.210431490 container attach b3acc362da558e764f3cb1c197ade9eb852868cb570ef88f532cf93f61b44add 
(image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=configure_cms_options, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, io.openshift.expose-services=, version=17.1.12, com.redhat.component=openstack-ovn-controller-container) Nov 28 03:19:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. 
Nov 28 03:19:15 localhost podman[69953]: 2025-11-28 08:19:15.633097997 +0000 UTC m=+0.216990844 container start f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:19:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:19:15 localhost podman[69973]: 2025-11-28 08:19:15.636445091 +0000 UTC m=+0.202125282 container init bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, build-date=2025-11-18T22:49:32Z, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, com.redhat.component=openstack-cron-container) Nov 28 03:19:15 localhost python3[69781]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_ipmi --conmon-pidfile /run/ceilometer_agent_ipmi.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=684be86bd5476b8c779d4769a9adf982 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_ipmi --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_ipmi.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 Nov 28 03:19:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:19:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. 
Nov 28 03:19:15 localhost podman[69972]: 2025-11-28 08:19:15.696411768 +0000 UTC m=+0.265260177 container init 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 
17.1 ceilometer-compute, architecture=x86_64, tcib_managed=true, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4) Nov 28 03:19:15 localhost ovs-vsctl[70101]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . external_ids ovn-cms-options Nov 28 03:19:15 localhost podman[69973]: 2025-11-28 08:19:15.724315956 +0000 UTC m=+0.289996107 container start bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, container_name=logrotate_crond, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, release=1761123044, build-date=2025-11-18T22:49:32Z, vcs-type=git, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:19:15 localhost python3[69781]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name logrotate_crond --conmon-pidfile /run/logrotate_crond.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=53ed83bb0cae779ff95edb2002262c6f --healthcheck-command /usr/share/openstack-tripleo-common/healthcheck/cron --label config_id=tripleo_step4 --label container_name=logrotate_crond --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/logrotate_crond.log --network none --pid host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:z registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Nov 28 03:19:15 localhost systemd[1]: libpod-b3acc362da558e764f3cb1c197ade9eb852868cb570ef88f532cf93f61b44add.scope: Deactivated successfully. 
Nov 28 03:19:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:19:15 localhost podman[69972]: 2025-11-28 08:19:15.76269704 +0000 UTC m=+0.331545429 container start 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, version=17.1.12, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1) Nov 28 03:19:15 localhost python3[69781]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=684be86bd5476b8c779d4769a9adf982 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_compute.log --network host --privileged=False --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 Nov 28 03:19:15 localhost podman[70055]: 2025-11-28 08:19:15.772079342 +0000 UTC m=+0.138357857 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, tcib_managed=true, release=1761123044, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_id=tripleo_step4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container) Nov 28 03:19:15 localhost systemd[1]: libpod-4ef9a876eb51c0fb7805cce2278a0020c7131c4e4156ae0a51343d63dc9eb2b1.scope: Deactivated successfully. Nov 28 03:19:15 localhost podman[69952]: 2025-11-28 08:19:15.800255019 +0000 UTC m=+0.389637897 container died b3acc362da558e764f3cb1c197ade9eb852868cb570ef88f532cf93f61b44add (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, version=17.1.12, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=configure_cms_options, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64) Nov 28 03:19:15 localhost podman[70129]: 2025-11-28 08:19:15.832790361 +0000 UTC m=+0.061651639 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, distribution-scope=public, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, version=17.1.12, architecture=x86_64, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1) Nov 28 03:19:15 localhost podman[69989]: 2025-11-28 08:19:15.853203227 
+0000 UTC m=+0.404482240 container died 4ef9a876eb51c0fb7805cce2278a0020c7131c4e4156ae0a51343d63dc9eb2b1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_libvirt_init_secret, distribution-scope=public, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.k8s.description=Red Hat 
OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt) Nov 28 03:19:15 localhost podman[70129]: 2025-11-28 08:19:15.868249295 +0000 UTC m=+0.097110553 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, tcib_managed=true, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container) Nov 28 03:19:15 localhost podman[70129]: unhealthy Nov 28 03:19:15 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:19:15 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Failed with result 'exit-code'. 
Nov 28 03:19:15 localhost podman[70077]: 2025-11-28 08:19:15.93909018 +0000 UTC m=+0.264465752 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=starting, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, vendor=Red Hat, 
Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, managed_by=tripleo_ansible, container_name=logrotate_crond, release=1761123044, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.12) Nov 28 03:19:15 localhost podman[70077]: 2025-11-28 08:19:15.948300007 +0000 UTC m=+0.273675799 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, release=1761123044) Nov 28 03:19:15 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:19:15 localhost podman[70055]: 2025-11-28 08:19:15.966380709 +0000 UTC m=+0.332659224 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, vcs-type=git, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, url=https://www.redhat.com, version=17.1.12) Nov 28 03:19:15 localhost podman[70055]: unhealthy Nov 28 03:19:15 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:19:15 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Failed with result 'exit-code'. Nov 28 03:19:16 localhost podman[70126]: 2025-11-28 08:19:16.03712193 +0000 UTC m=+0.274229915 container cleanup b3acc362da558e764f3cb1c197ade9eb852868cb570ef88f532cf93f61b44add (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, container_name=configure_cms_options, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, name=rhosp17/openstack-ovn-controller, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4) Nov 28 03:19:16 localhost systemd[1]: libpod-conmon-b3acc362da558e764f3cb1c197ade9eb852868cb570ef88f532cf93f61b44add.scope: Deactivated successfully. 
Nov 28 03:19:16 localhost python3[69781]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name configure_cms_options --conmon-pidfile /run/configure_cms_options.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1764316155 --label config_id=tripleo_step4 --label container_name=configure_cms_options --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/configure_cms_options.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 /bin/bash -c CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi Nov 28 03:19:16 localhost podman[70168]: 2025-11-28 08:19:16.133410688 +0000 UTC m=+0.319308829 container cleanup 4ef9a876eb51c0fb7805cce2278a0020c7131c4e4156ae0a51343d63dc9eb2b1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_libvirt_init_secret, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', 
'/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 03:19:16 localhost systemd[1]: libpod-conmon-4ef9a876eb51c0fb7805cce2278a0020c7131c4e4156ae0a51343d63dc9eb2b1.scope: Deactivated successfully. 
Nov 28 03:19:16 localhost python3[69781]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_libvirt_init_secret --cgroupns=host --conmon-pidfile /run/nova_libvirt_init_secret.pid --detach=False --env LIBVIRT_DEFAULT_URI=qemu:///system --env TRIPLEO_CONFIG_HASH=0f0904943dda1bf1d123bdf96d71020f --label config_id=tripleo_step4 --label container_name=nova_libvirt_init_secret --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_libvirt_init_secret.log --network host --privileged=False --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume 
/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova --volume /etc/libvirt:/etc/libvirt --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro --volume /var/lib/tripleo-config/ceph:/etc/ceph:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /nova_libvirt_init_secret.sh ceph:openstack Nov 28 03:19:16 localhost podman[70261]: 2025-11-28 08:19:16.211184758 +0000 UTC m=+0.144300292 container create 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:19:16 localhost systemd[1]: Started libpod-conmon-9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.scope. Nov 28 03:19:16 localhost systemd[1]: Started libcrun container. 
Nov 28 03:19:16 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f1ef42e70cf0e1a0a8a92baa1944c6e077a6987018321471d4c334fae1280ec/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 28 03:19:16 localhost podman[70261]: 2025-11-28 08:19:16.175272651 +0000 UTC m=+0.108388245 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Nov 28 03:19:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:19:16 localhost podman[70261]: 2025-11-28 08:19:16.282932581 +0000 UTC m=+0.216048155 container init 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.buildah.version=1.41.4, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1) Nov 28 03:19:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. 
Nov 28 03:19:16 localhost podman[70261]: 2025-11-28 08:19:16.311409697 +0000 UTC m=+0.244525241 container start 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4) Nov 28 03:19:16 localhost podman[70325]: 2025-11-28 08:19:16.319447397 +0000 UTC m=+0.122478593 container create b558d4971537e9d22bbec20d85b84342b92fe1e88ef9195cca460c7543cefb42 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, container_name=setup_ovs_manager, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
batch=17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.openshift.expose-services=) Nov 28 03:19:16 localhost python3[69781]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_migration_target --conmon-pidfile /run/nova_migration_target.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=0f0904943dda1bf1d123bdf96d71020f --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=nova_migration_target --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_migration_target.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /etc/ssh:/host-ssh:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Nov 28 03:19:16 localhost podman[70325]: 
2025-11-28 08:19:16.237217468 +0000 UTC m=+0.040248664 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Nov 28 03:19:16 localhost systemd[1]: Started libpod-conmon-b558d4971537e9d22bbec20d85b84342b92fe1e88ef9195cca460c7543cefb42.scope. Nov 28 03:19:16 localhost podman[70361]: 2025-11-28 08:19:16.406224548 +0000 UTC m=+0.084792650 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=starting, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20251118.1) Nov 28 03:19:16 localhost systemd[1]: Started libcrun container. 
Nov 28 03:19:16 localhost podman[70325]: 2025-11-28 08:19:16.419970836 +0000 UTC m=+0.223002072 container init b558d4971537e9d22bbec20d85b84342b92fe1e88ef9195cca460c7543cefb42 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=setup_ovs_manager, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com) Nov 28 03:19:16 localhost systemd[1]: var-lib-containers-storage-overlay-6bc7f6cf2f2d6fa477b99c9c15d2f85320ca24368fa72b515260201c2b251c67-merged.mount: Deactivated successfully. Nov 28 03:19:16 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b3acc362da558e764f3cb1c197ade9eb852868cb570ef88f532cf93f61b44add-userdata-shm.mount: Deactivated successfully. 
Nov 28 03:19:16 localhost podman[70325]: 2025-11-28 08:19:16.483960088 +0000 UTC m=+0.286991304 container start b558d4971537e9d22bbec20d85b84342b92fe1e88ef9195cca460c7543cefb42 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=setup_ovs_manager, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, architecture=x86_64, distribution-scope=public, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Nov 28 03:19:16 localhost podman[70325]: 2025-11-28 08:19:16.484459393 +0000 UTC m=+0.287490629 container attach b558d4971537e9d22bbec20d85b84342b92fe1e88ef9195cca460c7543cefb42 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=setup_ovs_manager, architecture=x86_64, version=17.1.12, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:19:16 localhost podman[70361]: 2025-11-28 08:19:16.756142658 +0000 UTC m=+0.434710770 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_id=tripleo_step4, 
com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, 
managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 28 03:19:16 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:19:17 localhost kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure Nov 28 03:19:19 localhost ovs-vsctl[70546]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager Nov 28 03:19:19 localhost systemd[1]: libpod-b558d4971537e9d22bbec20d85b84342b92fe1e88ef9195cca460c7543cefb42.scope: Deactivated successfully. Nov 28 03:19:19 localhost systemd[1]: libpod-b558d4971537e9d22bbec20d85b84342b92fe1e88ef9195cca460c7543cefb42.scope: Consumed 2.917s CPU time. Nov 28 03:19:19 localhost podman[70325]: 2025-11-28 08:19:19.364358711 +0000 UTC m=+3.167389937 container died b558d4971537e9d22bbec20d85b84342b92fe1e88ef9195cca460c7543cefb42 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, config_id=tripleo_step4, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, architecture=x86_64, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=setup_ovs_manager, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1761123044, url=https://www.redhat.com, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team) Nov 28 03:19:19 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b558d4971537e9d22bbec20d85b84342b92fe1e88ef9195cca460c7543cefb42-userdata-shm.mount: Deactivated successfully. Nov 28 03:19:19 localhost systemd[1]: var-lib-containers-storage-overlay-be016c2b2baa2373c75faceeaeaba806753f973ba55e627044f9b5780da52a4c-merged.mount: Deactivated successfully. 
Nov 28 03:19:19 localhost podman[70547]: 2025-11-28 08:19:19.47099875 +0000 UTC m=+0.092477459 container cleanup b558d4971537e9d22bbec20d85b84342b92fe1e88ef9195cca460c7543cefb42 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=setup_ovs_manager, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:19:19 localhost systemd[1]: libpod-conmon-b558d4971537e9d22bbec20d85b84342b92fe1e88ef9195cca460c7543cefb42.scope: Deactivated successfully. Nov 28 03:19:19 localhost python3[69781]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name setup_ovs_manager --conmon-pidfile /run/setup_ovs_manager.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1764316155 --label config_id=tripleo_step4 --label container_name=setup_ovs_manager --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1764316155'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/setup_ovs_manager.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 exec include tripleo::profile::base::neutron::ovn_metadata Nov 28 03:19:19 localhost podman[70654]: 2025-11-28 08:19:19.905175512 +0000 UTC m=+0.072774546 container create 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, container_name=ovn_controller, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1761123044) Nov 28 03:19:19 localhost podman[70667]: 2025-11-28 08:19:19.947161689 +0000 UTC m=+0.069726111 container create 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:19:19 localhost podman[70654]: 2025-11-28 08:19:19.865859869 +0000 UTC m=+0.033458983 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Nov 28 03:19:19 localhost systemd[1]: Started libpod-conmon-1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.scope. Nov 28 03:19:19 localhost systemd[1]: Started libpod-conmon-3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.scope. Nov 28 03:19:19 localhost systemd[1]: Started libcrun container. Nov 28 03:19:20 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/278e8886c08486a895bd57117a67612048d79e44c13305c2caa42c992931b27d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 03:19:20 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/278e8886c08486a895bd57117a67612048d79e44c13305c2caa42c992931b27d/merged/var/log/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 03:19:20 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/278e8886c08486a895bd57117a67612048d79e44c13305c2caa42c992931b27d/merged/etc/neutron/kill_scripts supports timestamps until 2038 (0x7fffffff) Nov 28 03:19:20 localhost systemd[1]: Started libcrun container. 
Nov 28 03:19:20 localhost podman[70667]: 2025-11-28 08:19:19.911983334 +0000 UTC m=+0.034547826 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Nov 28 03:19:20 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/069878c0f25bf999d8bcfd5288562c93f75c544f63f9cb3c25f6c968781689e7/merged/run/ovn supports timestamps until 2038 (0x7fffffff) Nov 28 03:19:20 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/069878c0f25bf999d8bcfd5288562c93f75c544f63f9cb3c25f6c968781689e7/merged/var/log/openvswitch supports timestamps until 2038 (0x7fffffff) Nov 28 03:19:20 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/069878c0f25bf999d8bcfd5288562c93f75c544f63f9cb3c25f6c968781689e7/merged/var/log/ovn supports timestamps until 2038 (0x7fffffff) Nov 28 03:19:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. 
Nov 28 03:19:20 localhost podman[70667]: 2025-11-28 08:19:20.039154672 +0000 UTC m=+0.161719194 container init 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent) Nov 28 03:19:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:19:20 localhost podman[70654]: 2025-11-28 08:19:20.056535583 +0000 UTC m=+0.224134687 container init 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, tcib_managed=true, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team) Nov 28 03:19:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:19:20 localhost podman[70667]: 2025-11-28 08:19:20.090719486 +0000 UTC m=+0.213283958 container start 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, version=17.1.12, config_id=tripleo_step4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Nov 28 03:19:20 localhost systemd-logind[764]: Existing logind session ID 28 used by new audit session, ignoring. 
Nov 28 03:19:20 localhost python3[69781]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=dfc67f7a8d1f67548a53836c6db3b704 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ovn_metadata_agent --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_metadata_agent.log --network host --pid host --privileged=True --volume 
/etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/neutron:/var/log/neutron:z --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /run/netns:/run/netns:shared --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Nov 28 03:19:20 localhost systemd[1]: Created slice User Slice of UID 0. Nov 28 03:19:20 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Nov 28 03:19:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:19:20 localhost podman[70654]: 2025-11-28 08:19:20.142838829 +0000 UTC m=+0.310437893 container start 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, vcs-type=git, 
architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4) Nov 28 03:19:20 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Nov 28 03:19:20 localhost python3[69781]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck 6642 --label config_id=tripleo_step4 --label container_name=ovn_controller --label managed_by=tripleo_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_controller.log --network host --privileged=True --user root --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/log/containers/openvswitch:/var/log/openvswitch:z --volume /var/log/containers/openvswitch:/var/log/ovn:z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Nov 28 03:19:20 localhost systemd[1]: Starting User Manager for UID 0... 
Nov 28 03:19:20 localhost podman[70699]: 2025-11-28 08:19:20.214686234 +0000 UTC m=+0.114588126 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible) Nov 28 03:19:20 localhost podman[70699]: 2025-11-28 08:19:20.255593468 +0000 UTC m=+0.155495380 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, version=17.1.12, build-date=2025-11-19T00:14:25Z, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, 
managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 28 03:19:20 localhost podman[70699]: unhealthy Nov 28 03:19:20 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:19:20 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 03:19:20 localhost systemd[70718]: Queued start job for default target Main User Target. Nov 28 03:19:20 localhost systemd[70718]: Created slice User Application Slice. Nov 28 03:19:20 localhost systemd[70718]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Nov 28 03:19:20 localhost systemd[70718]: Started Daily Cleanup of User's Temporary Directories. Nov 28 03:19:20 localhost systemd[70718]: Reached target Paths. Nov 28 03:19:20 localhost systemd[70718]: Reached target Timers. Nov 28 03:19:20 localhost systemd[70718]: Starting D-Bus User Message Bus Socket... Nov 28 03:19:20 localhost systemd[70718]: Starting Create User's Volatile Files and Directories... Nov 28 03:19:20 localhost systemd[70718]: Listening on D-Bus User Message Bus Socket. Nov 28 03:19:20 localhost systemd[70718]: Reached target Sockets. Nov 28 03:19:20 localhost systemd[70718]: Finished Create User's Volatile Files and Directories. Nov 28 03:19:20 localhost systemd[70718]: Reached target Basic System. Nov 28 03:19:20 localhost systemd[70718]: Reached target Main User Target. Nov 28 03:19:20 localhost systemd[70718]: Startup finished in 121ms. Nov 28 03:19:20 localhost systemd[1]: Started User Manager for UID 0. Nov 28 03:19:20 localhost systemd[1]: Started Session c9 of User root. 
Nov 28 03:19:20 localhost podman[70714]: 2025-11-28 08:19:20.356104546 +0000 UTC m=+0.196161806 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, tcib_managed=true) Nov 28 03:19:20 localhost podman[70714]: 2025-11-28 08:19:20.417186167 +0000 UTC m=+0.257243427 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.4, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:19:20 localhost podman[70714]: unhealthy Nov 28 03:19:20 localhost systemd[1]: session-c9.scope: Deactivated successfully. Nov 28 03:19:20 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:19:20 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. Nov 28 03:19:20 localhost kernel: device br-int entered promiscuous mode Nov 28 03:19:20 localhost NetworkManager[5967]: [1764317960.4865] manager: (br-int): new Generic device (/org/freedesktop/NetworkManager/Devices/11) Nov 28 03:19:20 localhost systemd-udevd[70804]: Network interface NamePolicy= disabled on kernel command line. 
Nov 28 03:19:20 localhost python3[70824]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:19:21 localhost python3[70840]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:19:21 localhost python3[70856]: ansible-file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:19:21 localhost python3[70872]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:19:21 localhost kernel: device genev_sys_6081 entered promiscuous mode Nov 28 03:19:21 localhost systemd-udevd[70806]: Network interface NamePolicy= 
disabled on kernel command line. Nov 28 03:19:21 localhost NetworkManager[5967]: [1764317961.5516] device (genev_sys_6081): carrier: link connected Nov 28 03:19:21 localhost NetworkManager[5967]: [1764317961.5525] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/12) Nov 28 03:19:21 localhost python3[70891]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:19:22 localhost python3[70910]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:19:22 localhost python3[70927]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 03:19:22 localhost python3[70944]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 03:19:22 localhost python3[70961]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_logrotate_crond_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 03:19:23 localhost python3[70979]: 
ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_migration_target_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 03:19:23 localhost python3[70995]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_controller_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 03:19:23 localhost python3[71011]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 03:19:24 localhost python3[71072]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317963.5978873-110014-4299752726396/source dest=/etc/systemd/system/tripleo_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:19:24 localhost python3[71101]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317963.5978873-110014-4299752726396/source dest=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:19:25 localhost python3[71130]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317963.5978873-110014-4299752726396/source dest=/etc/systemd/system/tripleo_logrotate_crond.service mode=0644 owner=root group=root backup=False force=True 
follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:19:25 localhost python3[71159]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317963.5978873-110014-4299752726396/source dest=/etc/systemd/system/tripleo_nova_migration_target.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:19:26 localhost python3[71188]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317963.5978873-110014-4299752726396/source dest=/etc/systemd/system/tripleo_ovn_controller.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:19:26 localhost python3[71217]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764317963.5978873-110014-4299752726396/source dest=/etc/systemd/system/tripleo_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:19:26 localhost python3[71233]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 28 03:19:27 localhost 
systemd[1]: Reloading. Nov 28 03:19:27 localhost systemd-rc-local-generator[71258]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:19:27 localhost systemd-sysv-generator[71263]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:19:27 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:19:27 localhost python3[71284]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 03:19:27 localhost systemd[1]: Reloading. Nov 28 03:19:28 localhost systemd-sysv-generator[71311]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:19:28 localhost systemd-rc-local-generator[71308]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:19:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:19:28 localhost systemd[1]: Starting ceilometer_agent_compute container... Nov 28 03:19:28 localhost tripleo-start-podman-container[71323]: Creating additional drop-in dependency for "ceilometer_agent_compute" (4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5) Nov 28 03:19:28 localhost systemd[1]: Reloading. Nov 28 03:19:28 localhost systemd-sysv-generator[71387]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:19:28 localhost systemd-rc-local-generator[71384]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:19:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:19:28 localhost systemd[1]: Started ceilometer_agent_compute container. Nov 28 03:19:29 localhost python3[71409]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_ipmi.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 03:19:29 localhost systemd[1]: Reloading. Nov 28 03:19:29 localhost systemd-rc-local-generator[71433]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:19:29 localhost systemd-sysv-generator[71437]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:19:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:19:29 localhost systemd[1]: Starting ceilometer_agent_ipmi container... Nov 28 03:19:29 localhost systemd[1]: Started ceilometer_agent_ipmi container. Nov 28 03:19:30 localhost python3[71477]: ansible-systemd Invoked with state=restarted name=tripleo_logrotate_crond.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 03:19:30 localhost systemd[1]: Reloading. Nov 28 03:19:30 localhost systemd-rc-local-generator[71501]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 28 03:19:30 localhost systemd-sysv-generator[71505]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:19:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:19:30 localhost systemd[1]: Stopping User Manager for UID 0... Nov 28 03:19:30 localhost systemd[70718]: Activating special unit Exit the Session... Nov 28 03:19:30 localhost systemd[70718]: Stopped target Main User Target. Nov 28 03:19:30 localhost systemd[70718]: Stopped target Basic System. Nov 28 03:19:30 localhost systemd[70718]: Stopped target Paths. Nov 28 03:19:30 localhost systemd[70718]: Stopped target Sockets. Nov 28 03:19:30 localhost systemd[70718]: Stopped target Timers. Nov 28 03:19:30 localhost systemd[70718]: Stopped Daily Cleanup of User's Temporary Directories. Nov 28 03:19:30 localhost systemd[70718]: Closed D-Bus User Message Bus Socket. Nov 28 03:19:30 localhost systemd[70718]: Stopped Create User's Volatile Files and Directories. Nov 28 03:19:30 localhost systemd[70718]: Removed slice User Application Slice. Nov 28 03:19:30 localhost systemd[70718]: Reached target Shutdown. Nov 28 03:19:30 localhost systemd[70718]: Finished Exit the Session. Nov 28 03:19:30 localhost systemd[70718]: Reached target Exit the Session. Nov 28 03:19:30 localhost systemd[1]: Starting logrotate_crond container... Nov 28 03:19:30 localhost systemd[1]: user@0.service: Deactivated successfully. Nov 28 03:19:30 localhost systemd[1]: Stopped User Manager for UID 0. Nov 28 03:19:30 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Nov 28 03:19:30 localhost systemd[1]: run-user-0.mount: Deactivated successfully. 
Nov 28 03:19:30 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Nov 28 03:19:30 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Nov 28 03:19:30 localhost systemd[1]: Removed slice User Slice of UID 0. Nov 28 03:19:30 localhost systemd[1]: Started logrotate_crond container. Nov 28 03:19:31 localhost python3[71545]: ansible-systemd Invoked with state=restarted name=tripleo_nova_migration_target.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 03:19:31 localhost systemd[1]: Reloading. Nov 28 03:19:31 localhost systemd-rc-local-generator[71573]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:19:31 localhost systemd-sysv-generator[71578]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:19:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:19:32 localhost systemd[1]: Starting nova_migration_target container... Nov 28 03:19:32 localhost systemd[1]: Started nova_migration_target container. Nov 28 03:19:32 localhost python3[71613]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 03:19:32 localhost systemd[1]: Reloading. Nov 28 03:19:32 localhost systemd-rc-local-generator[71637]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:19:32 localhost systemd-sysv-generator[71643]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:19:33 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:19:33 localhost systemd[1]: Starting ovn_controller container... Nov 28 03:19:33 localhost tripleo-start-podman-container[71653]: Creating additional drop-in dependency for "ovn_controller" (3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e) Nov 28 03:19:33 localhost systemd[1]: Reloading. Nov 28 03:19:33 localhost systemd-rc-local-generator[71711]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:19:33 localhost systemd-sysv-generator[71714]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:19:33 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:19:33 localhost systemd[1]: Started ovn_controller container. Nov 28 03:19:34 localhost sshd[71740]: main: sshd: ssh-rsa algorithm is disabled Nov 28 03:19:34 localhost python3[71739]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 03:19:34 localhost systemd[1]: Reloading. Nov 28 03:19:34 localhost systemd-sysv-generator[71772]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 28 03:19:34 localhost systemd-rc-local-generator[71769]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:19:34 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:19:34 localhost systemd[1]: Starting ovn_metadata_agent container... Nov 28 03:19:34 localhost systemd[1]: Started ovn_metadata_agent container. Nov 28 03:19:35 localhost python3[71823]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks4.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:19:36 localhost python3[71944]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks4.json short_hostname=np0005538513 step=4 update_config_hash_only=False Nov 28 03:19:37 localhost python3[71960]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:19:37 localhost python3[71976]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_4 config_pattern=container-puppet-*.json config_overrides={} debug=True Nov 28 03:19:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:19:40 localhost podman[71978]: 2025-11-28 08:19:40.830528867 +0000 UTC m=+0.068982630 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, batch=17.1_20251118.1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, release=1761123044, vcs-type=git, name=rhosp17/openstack-collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, io.openshift.expose-services=) Nov 28 03:19:40 localhost podman[71978]: 2025-11-28 08:19:40.841969017 +0000 UTC m=+0.080422780 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, vcs-type=git, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:19:40 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:19:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:19:41 localhost podman[71999]: 2025-11-28 08:19:41.871169908 +0000 UTC m=+0.075870166 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Nov 28 03:19:42 localhost podman[71999]: 2025-11-28 08:19:42.042625999 +0000 UTC m=+0.247326297 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 
'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true) Nov 28 03:19:42 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:19:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. 
Nov 28 03:19:42 localhost podman[72029]: 2025-11-28 08:19:42.847726696 +0000 UTC m=+0.083125236 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
com.redhat.component=openstack-iscsid-container, release=1761123044, batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid) Nov 28 03:19:42 localhost podman[72029]: 2025-11-28 08:19:42.855940974 +0000 UTC m=+0.091339474 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, version=17.1.12) Nov 28 03:19:42 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:19:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:19:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:19:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:19:46 localhost systemd[1]: tmp-crun.MHizjA.mount: Deactivated successfully. 
Nov 28 03:19:46 localhost podman[72052]: 2025-11-28 08:19:46.87439841 +0000 UTC m=+0.110823246 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, version=17.1.12, 
build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi) Nov 28 03:19:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:19:46 localhost podman[72050]: 2025-11-28 08:19:46.925771715 +0000 UTC m=+0.164822523 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step4, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, release=1761123044, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Nov 28 03:19:46 localhost podman[72052]: 2025-11-28 08:19:46.931628489 +0000 UTC m=+0.168053335 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, tcib_managed=true, 
name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step4) Nov 28 03:19:46 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:19:46 localhost podman[72050]: 2025-11-28 08:19:46.980488646 +0000 UTC m=+0.219539504 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z) Nov 28 03:19:46 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 03:19:47 localhost podman[72088]: 2025-11-28 08:19:47.065165419 +0000 UTC m=+0.171363800 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, config_id=tripleo_step4) Nov 28 03:19:47 localhost podman[72051]: 2025-11-28 08:19:46.988395185 +0000 UTC m=+0.223144208 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public) Nov 28 03:19:47 localhost podman[72051]: 2025-11-28 08:19:47.121581322 +0000 UTC m=+0.356330365 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.expose-services=, release=1761123044, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:19:47 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:19:47 localhost podman[72088]: 2025-11-28 08:19:47.460555141 +0000 UTC m=+0.566753512 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, batch=17.1_20251118.1, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute) Nov 28 03:19:47 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:19:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:19:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:19:50 localhost podman[72145]: 2025-11-28 08:19:50.847697397 +0000 UTC m=+0.084520939 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, version=17.1.12, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:19:50 localhost podman[72145]: 2025-11-28 08:19:50.88722753 +0000 UTC m=+0.124051062 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, container_name=ovn_metadata_agent, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1) Nov 28 03:19:50 localhost podman[72146]: 2025-11-28 08:19:50.902113268 +0000 UTC m=+0.133765297 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, url=https://www.redhat.com, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:19:50 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:19:50 localhost podman[72146]: 2025-11-28 08:19:50.94732737 +0000 UTC m=+0.178979389 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=ovn_controller, version=17.1.12, url=https://www.redhat.com, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 
6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 28 03:19:50 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:19:58 localhost snmpd[66832]: empty variable list in _query Nov 28 03:19:58 localhost snmpd[66832]: empty variable list in _query Nov 28 03:20:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. 
Nov 28 03:20:11 localhost podman[72272]: 2025-11-28 08:20:11.852964427 +0000 UTC m=+0.087294547 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack 
Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step3, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container) Nov 28 03:20:11 localhost podman[72272]: 2025-11-28 08:20:11.890360022 +0000 UTC m=+0.124690092 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 
'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:20:11 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:20:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:20:12 localhost systemd[1]: tmp-crun.DJAMnU.mount: Deactivated successfully. 
Nov 28 03:20:12 localhost podman[72290]: 2025-11-28 08:20:12.844952688 +0000 UTC m=+0.079589004 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, version=17.1.12, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:20:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:20:13 localhost podman[72290]: 2025-11-28 08:20:13.055510719 +0000 UTC m=+0.290146985 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.12, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 28 03:20:13 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:20:13 localhost systemd[1]: tmp-crun.ITc6ed.mount: Deactivated successfully. 
Nov 28 03:20:13 localhost podman[72319]: 2025-11-28 08:20:13.134598516 +0000 UTC m=+0.077232300 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, container_name=iscsid, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z) Nov 28 03:20:13 localhost podman[72319]: 2025-11-28 08:20:13.146345195 +0000 UTC m=+0.088978959 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, version=17.1.12, com.redhat.component=openstack-iscsid-container, container_name=iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 
iscsid, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com) Nov 28 03:20:13 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:20:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:20:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:20:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:20:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. 
Nov 28 03:20:17 localhost podman[72339]: 2025-11-28 08:20:17.856094217 +0000 UTC m=+0.092631843 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, batch=17.1_20251118.1, 
config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, distribution-scope=public, architecture=x86_64, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:20:17 localhost podman[72339]: 2025-11-28 08:20:17.911389636 +0000 UTC m=+0.147927302 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, io.openshift.expose-services=, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, 
config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z) Nov 28 03:20:17 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 03:20:17 localhost podman[72341]: 2025-11-28 08:20:17.913512383 +0000 UTC m=+0.142978547 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, vcs-type=git) Nov 28 03:20:17 localhost podman[72341]: 2025-11-28 08:20:17.996306446 +0000 UTC m=+0.225772610 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-type=git, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron) Nov 28 03:20:18 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:20:18 localhost podman[72347]: 2025-11-28 08:20:18.009286615 +0000 UTC m=+0.232887614 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 28 03:20:18 localhost podman[72340]: 2025-11-28 08:20:17.966760598 +0000 UTC m=+0.198214304 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:20:18 localhost podman[72347]: 2025-11-28 08:20:18.052276696 +0000 UTC m=+0.275877715 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 28 03:20:18 localhost systemd[1]: 
f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:20:18 localhost podman[72340]: 2025-11-28 08:20:18.369363967 +0000 UTC m=+0.600817643 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
distribution-scope=public, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git) Nov 28 03:20:18 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:20:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:20:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:20:21 localhost systemd[1]: tmp-crun.zBS8H1.mount: Deactivated successfully. 
Nov 28 03:20:21 localhost podman[72432]: 2025-11-28 08:20:21.907131777 +0000 UTC m=+0.148664755 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent) Nov 28 03:20:21 localhost podman[72433]: 2025-11-28 08:20:21.872765117 +0000 UTC m=+0.111179586 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, release=1761123044, config_id=tripleo_step4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 28 03:20:21 localhost podman[72432]: 2025-11-28 08:20:21.953400713 +0000 UTC m=+0.194933731 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:20:21 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:20:21 localhost podman[72433]: 2025-11-28 08:20:21.963238333 +0000 UTC m=+0.201652862 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, container_name=ovn_controller) Nov 28 03:20:22 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:20:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:20:42 localhost podman[72477]: 2025-11-28 08:20:42.847215863 +0000 UTC m=+0.087079059 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 
'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, container_name=collectd, config_id=tripleo_step3) Nov 28 03:20:42 localhost podman[72477]: 2025-11-28 08:20:42.882228185 +0000 UTC m=+0.122091391 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, 
summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=collectd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044) Nov 28 03:20:42 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:20:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:20:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:20:43 localhost podman[72497]: 2025-11-28 08:20:43.82506024 +0000 UTC m=+0.058927064 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, tcib_managed=true, container_name=iscsid, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Nov 28 03:20:43 localhost podman[72497]: 2025-11-28 08:20:43.863404646 +0000 UTC m=+0.097271510 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, container_name=iscsid, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:20:43 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:20:43 localhost podman[72498]: 2025-11-28 08:20:43.920482461 +0000 UTC m=+0.147466518 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, 
distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, version=17.1.12, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, tcib_managed=true, batch=17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git) Nov 28 03:20:44 localhost podman[72498]: 2025-11-28 08:20:44.109728822 +0000 UTC m=+0.336712859 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:20:44 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:20:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:20:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. 
Nov 28 03:20:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:20:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:20:48 localhost podman[72550]: 2025-11-28 08:20:48.841070674 +0000 UTC m=+0.074132363 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, architecture=x86_64, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team) Nov 28 03:20:48 localhost podman[72550]: 2025-11-28 08:20:48.865541503 +0000 UTC m=+0.098603192 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 28 03:20:48 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 03:20:48 localhost podman[72547]: 2025-11-28 08:20:48.95639543 +0000 UTC m=+0.194048463 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, 
io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, release=1761123044) Nov 28 03:20:48 localhost podman[72547]: 2025-11-28 08:20:48.986842167 +0000 UTC m=+0.224495230 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container) Nov 28 03:20:48 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 03:20:49 localhost podman[72549]: 2025-11-28 08:20:49.055536818 +0000 UTC m=+0.288793643 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-cron-container, release=1761123044, batch=17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, vcs-type=git, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc.) Nov 28 03:20:49 localhost podman[72548]: 2025-11-28 08:20:49.060621847 +0000 UTC m=+0.295138421 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1761123044, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, config_id=tripleo_step4, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, architecture=x86_64) Nov 28 03:20:49 localhost podman[72549]: 2025-11-28 08:20:49.069443984 +0000 UTC m=+0.302700809 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 
'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, name=rhosp17/openstack-cron, vcs-type=git, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_id=tripleo_step4, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64) Nov 28 03:20:49 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:20:49 localhost podman[72548]: 2025-11-28 08:20:49.420481213 +0000 UTC m=+0.654997797 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack 
Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4) Nov 28 03:20:49 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:20:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:20:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:20:52 localhost podman[72643]: 2025-11-28 08:20:52.836535547 +0000 UTC m=+0.070332643 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, release=1761123044, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Nov 28 03:20:52 localhost podman[72643]: 2025-11-28 08:20:52.857454325 +0000 UTC m=+0.091251471 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:20:52 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:20:52 localhost podman[72642]: 2025-11-28 08:20:52.949714346 +0000 UTC m=+0.186222507 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Nov 28 03:20:53 localhost podman[72642]: 2025-11-28 08:20:53.01856391 +0000 UTC m=+0.255072081 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, config_id=tripleo_step4) Nov 28 03:20:53 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:21:09 localhost podman[72795]: 2025-11-28 08:21:09.782650558 +0000 UTC m=+0.085330824 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.openshift.expose-services=, release=553, architecture=x86_64, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, GIT_BRANCH=main, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, RELEASE=main, maintainer=Guillaume 
Abrioux , vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git) Nov 28 03:21:09 localhost podman[72795]: 2025-11-28 08:21:09.912540513 +0000 UTC m=+0.215220769 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, name=rhceph, RELEASE=main, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., release=553, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph) Nov 28 03:21:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. 
Nov 28 03:21:13 localhost podman[72941]: 2025-11-28 08:21:13.848763343 +0000 UTC m=+0.085637224 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, tcib_managed=true, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:21:13 localhost podman[72941]: 2025-11-28 08:21:13.858622953 +0000 UTC m=+0.095496824 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, distribution-scope=public, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, maintainer=OpenStack TripleO Team) Nov 28 03:21:13 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:21:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. 
Nov 28 03:21:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:21:14 localhost podman[72964]: 2025-11-28 08:21:14.831906607 +0000 UTC m=+0.069309620 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Nov 28 03:21:14 localhost podman[72963]: 2025-11-28 08:21:14.89211277 +0000 UTC m=+0.130126202 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, architecture=x86_64, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, com.redhat.component=openstack-iscsid-container, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z) Nov 28 03:21:14 localhost podman[72963]: 2025-11-28 08:21:14.92134945 +0000 UTC m=+0.159362912 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=iscsid, io.openshift.expose-services=, config_id=tripleo_step3, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:21:14 localhost 
systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:21:15 localhost podman[72964]: 2025-11-28 08:21:15.008358236 +0000 UTC m=+0.245761269 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.expose-services=, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, batch=17.1_20251118.1) Nov 28 03:21:15 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:21:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:21:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:21:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:21:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. 
Nov 28 03:21:19 localhost podman[73012]: 2025-11-28 08:21:19.820083904 +0000 UTC m=+0.059325506 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Nov 28 03:21:19 localhost systemd[1]: tmp-crun.jRgCjR.mount: Deactivated successfully. Nov 28 03:21:19 localhost podman[73013]: 2025-11-28 08:21:19.87020253 +0000 UTC m=+0.100567333 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, build-date=2025-11-18T22:49:32Z) Nov 28 03:21:19 localhost podman[73013]: 2025-11-28 08:21:19.905317915 +0000 UTC m=+0.135682688 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, 
io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, batch=17.1_20251118.1) Nov 28 03:21:19 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:21:19 localhost podman[73014]: 2025-11-28 08:21:19.91881356 +0000 UTC m=+0.146028664 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, 
config_id=tripleo_step4, release=1761123044, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 28 03:21:19 localhost podman[73011]: 2025-11-28 08:21:19.965138326 +0000 UTC m=+0.203003564 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, 
container_name=ceilometer_agent_compute, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, build-date=2025-11-19T00:11:48Z, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git) Nov 28 03:21:19 localhost podman[73014]: 2025-11-28 08:21:19.970479554 +0000 UTC m=+0.197694678 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 28 03:21:19 localhost systemd[1]: 
f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:21:19 localhost podman[73011]: 2025-11-28 08:21:19.995405358 +0000 UTC m=+0.233270616 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, release=1761123044, tcib_managed=true, 
description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:21:20 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:21:20 localhost podman[73012]: 2025-11-28 08:21:20.176350928 +0000 UTC m=+0.415592550 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:21:20 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:21:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:21:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:21:23 localhost podman[73108]: 2025-11-28 08:21:23.84013082 +0000 UTC m=+0.080166361 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, config_id=tripleo_step4, version=17.1.12, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, io.openshift.expose-services=, container_name=ovn_metadata_agent) Nov 28 03:21:23 localhost podman[73108]: 2025-11-28 08:21:23.88401165 +0000 UTC m=+0.124047151 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, config_data={'cgroupns': 'host', 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, distribution-scope=public, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 28 03:21:23 localhost systemd[1]: tmp-crun.ip9a0P.mount: Deactivated successfully. Nov 28 03:21:23 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:21:23 localhost podman[73109]: 2025-11-28 08:21:23.910140582 +0000 UTC m=+0.146372234 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T23:34:05Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:21:23 localhost podman[73109]: 2025-11-28 08:21:23.931873195 +0000 UTC m=+0.168104857 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, tcib_managed=true, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team) Nov 28 03:21:23 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:21:36 localhost sshd[73153]: main: sshd: ssh-rsa algorithm is disabled Nov 28 03:21:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. 
Nov 28 03:21:44 localhost podman[73155]: 2025-11-28 08:21:44.854192353 +0000 UTC m=+0.089520366 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, config_id=tripleo_step3, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, container_name=collectd, version=17.1.12, vcs-type=git, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:21:44 localhost podman[73155]: 2025-11-28 08:21:44.890538906 +0000 UTC m=+0.125866979 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-type=git, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Nov 28 03:21:44 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:21:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. 
Nov 28 03:21:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:21:45 localhost systemd[1]: tmp-crun.zFaGIv.mount: Deactivated successfully. Nov 28 03:21:45 localhost podman[73176]: 2025-11-28 08:21:45.843849522 +0000 UTC m=+0.084112106 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, architecture=x86_64, 
build-date=2025-11-18T23:44:13Z, version=17.1.12, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=iscsid) Nov 28 03:21:45 localhost podman[73177]: 2025-11-28 08:21:45.868819527 +0000 UTC m=+0.099743458 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': 
False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public) Nov 28 03:21:45 localhost podman[73176]: 2025-11-28 08:21:45.877707006 +0000 UTC m=+0.117969560 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=iscsid, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 
iscsid) Nov 28 03:21:45 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:21:46 localhost podman[73177]: 2025-11-28 08:21:46.048479586 +0000 UTC m=+0.279403477 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, build-date=2025-11-18T22:49:46Z, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.) Nov 28 03:21:46 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:21:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:21:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:21:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:21:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. 
Nov 28 03:21:50 localhost podman[73223]: 2025-11-28 08:21:50.849101367 +0000 UTC m=+0.085671045 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1761123044, version=17.1.12, io.openshift.expose-services=, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:21:50 localhost podman[73222]: 2025-11-28 08:21:50.894439743 +0000 UTC m=+0.133861831 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 28 03:21:50 localhost podman[73224]: 2025-11-28 08:21:50.956799273 +0000 UTC m=+0.188626502 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, 
com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
vcs-type=git, io.openshift.expose-services=) Nov 28 03:21:50 localhost podman[73228]: 2025-11-28 08:21:50.926710617 +0000 UTC m=+0.155354226 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container) Nov 28 03:21:50 localhost podman[73224]: 2025-11-28 08:21:50.988653904 +0000 UTC m=+0.220481163 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, tcib_managed=true, version=17.1.12) Nov 28 03:21:50 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:21:51 localhost podman[73228]: 2025-11-28 08:21:51.009512481 +0000 UTC m=+0.238156120 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, version=17.1.12, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=) Nov 28 03:21:51 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:21:51 localhost podman[73222]: 2025-11-28 08:21:51.030662395 +0000 UTC m=+0.270084513 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, config_id=tripleo_step4, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:21:51 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 03:21:51 localhost podman[73223]: 2025-11-28 08:21:51.203826541 +0000 UTC m=+0.440396169 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, release=1761123044, io.buildah.version=1.41.4, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git) Nov 28 03:21:51 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:21:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:21:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:21:54 localhost podman[73317]: 2025-11-28 08:21:54.849090733 +0000 UTC m=+0.086313275 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 
17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 28 03:21:54 localhost podman[73317]: 2025-11-28 08:21:54.898351052 +0000 UTC m=+0.135573574 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step4) Nov 28 03:21:54 localhost systemd[1]: tmp-crun.XhO9qt.mount: Deactivated successfully. Nov 28 03:21:54 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:21:54 localhost podman[73316]: 2025-11-28 08:21:54.914448078 +0000 UTC m=+0.153002512 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 28 03:21:54 localhost podman[73316]: 2025-11-28 08:21:54.946338441 +0000 UTC m=+0.184892865 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044) Nov 28 03:21:54 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:22:05 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 28 03:22:05 localhost recover_tripleo_nova_virtqemud[73363]: 61397 Nov 28 03:22:05 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 28 03:22:05 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 28 03:22:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. 
Nov 28 03:22:15 localhost podman[73442]: 2025-11-28 08:22:15.836090597 +0000 UTC m=+0.068046280 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20251118.1, architecture=x86_64, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, tcib_managed=true, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Nov 28 03:22:15 localhost podman[73442]: 2025-11-28 08:22:15.84599985 +0000 UTC m=+0.077955533 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 
'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, release=1761123044, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.) Nov 28 03:22:15 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:22:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. 
Nov 28 03:22:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:22:16 localhost systemd[1]: tmp-crun.bZPjZB.mount: Deactivated successfully. Nov 28 03:22:16 localhost podman[73462]: 2025-11-28 08:22:16.856677789 +0000 UTC m=+0.092243112 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-iscsid, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible) Nov 28 03:22:16 localhost podman[73462]: 2025-11-28 08:22:16.867688225 +0000 UTC m=+0.103253538 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, 
distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step3, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-type=git) Nov 28 03:22:16 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 03:22:16 localhost podman[73463]: 2025-11-28 08:22:16.831480347 +0000 UTC m=+0.067893556 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, release=1761123044, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:22:17 localhost podman[73463]: 2025-11-28 08:22:17.04624372 +0000 UTC m=+0.282656859 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:22:17 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:22:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:22:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:22:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:22:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. 
Nov 28 03:22:21 localhost systemd[1]: tmp-crun.MvLlFc.mount: Deactivated successfully. Nov 28 03:22:21 localhost podman[73510]: 2025-11-28 08:22:21.851850578 +0000 UTC m=+0.083756065 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, 
config_id=tripleo_step4, distribution-scope=public, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:22:21 localhost podman[73509]: 2025-11-28 08:22:21.903419649 +0000 UTC m=+0.134273963 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:22:21 localhost podman[73508]: 2025-11-28 08:22:21.963473187 +0000 UTC m=+0.196970454 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_id=tripleo_step4, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.buildah.version=1.41.4, batch=17.1_20251118.1) Nov 28 03:22:21 localhost podman[73511]: 2025-11-28 08:22:21.877244806 +0000 UTC m=+0.100507891 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 28 03:22:21 localhost podman[73510]: 2025-11-28 08:22:21.983523358 +0000 UTC m=+0.215428815 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, 
version=17.1.12, build-date=2025-11-18T22:49:32Z, vcs-type=git, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1) Nov 28 03:22:21 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:22:22 localhost podman[73511]: 2025-11-28 08:22:22.00998449 +0000 UTC m=+0.233247585 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=) Nov 28 03:22:22 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:22:22 localhost podman[73508]: 2025-11-28 08:22:22.024845317 +0000 UTC m=+0.258342514 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:22:22 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 03:22:22 localhost podman[73509]: 2025-11-28 08:22:22.270708208 +0000 UTC m=+0.501562602 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4) Nov 28 03:22:22 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:22:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:22:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:22:25 localhost podman[73604]: 2025-11-28 08:22:25.857810241 +0000 UTC m=+0.086489021 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, version=17.1.12, container_name=ovn_controller, io.openshift.expose-services=, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:22:25 localhost systemd[1]: tmp-crun.lFYal7.mount: Deactivated successfully. Nov 28 03:22:25 localhost podman[73603]: 2025-11-28 08:22:25.915985279 +0000 UTC m=+0.145785995 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, url=https://www.redhat.com, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4) Nov 28 03:22:25 localhost podman[73604]: 2025-11-28 08:22:25.936543726 +0000 UTC m=+0.165222546 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, architecture=x86_64, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=) Nov 28 03:22:25 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. 
Nov 28 03:22:25 localhost podman[73603]: 2025-11-28 08:22:25.991463212 +0000 UTC m=+0.221263938 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, architecture=x86_64, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git) Nov 28 03:22:26 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. 
Nov 28 03:22:26 localhost python3[73698]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:22:27 localhost python3[73743]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764318146.518492-114295-47244957008075/source _original_basename=tmpr3j16rg5 follow=False checksum=039e0b234f00fbd1242930f0d5dc67e8b4c067fe backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:22:28 localhost python3[73773]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_5 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 03:22:29 localhost ansible-async_wrapper.py[73945]: Invoked with 298073833886 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1764318149.3570664-114686-51827400948075/AnsiballZ_command.py _ Nov 28 03:22:29 localhost ansible-async_wrapper.py[73948]: Starting module and watcher Nov 28 03:22:29 localhost ansible-async_wrapper.py[73948]: Start watching 73949 (3600) Nov 28 03:22:29 localhost ansible-async_wrapper.py[73949]: Start module (73949) Nov 28 03:22:29 localhost ansible-async_wrapper.py[73945]: Return async_wrapper task started. Nov 28 03:22:30 localhost python3[73969]: ansible-ansible.legacy.async_status Invoked with jid=298073833886.73945 mode=status _async_dir=/tmp/.ansible_async Nov 28 03:22:33 localhost puppet-user[73964]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. 
It should be converted to version 5 Nov 28 03:22:33 localhost puppet-user[73964]: (file: /etc/puppet/hiera.yaml) Nov 28 03:22:33 localhost puppet-user[73964]: Warning: Undefined variable '::deploy_config_name'; Nov 28 03:22:33 localhost puppet-user[73964]: (file & line not available) Nov 28 03:22:33 localhost puppet-user[73964]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Nov 28 03:22:33 localhost puppet-user[73964]: (file & line not available) Nov 28 03:22:33 localhost puppet-user[73964]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Nov 28 03:22:34 localhost puppet-user[73964]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 28 03:22:34 localhost puppet-user[73964]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 28 03:22:34 localhost puppet-user[73964]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Nov 28 03:22:34 localhost puppet-user[73964]: with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 28 03:22:34 localhost puppet-user[73964]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 28 03:22:34 localhost puppet-user[73964]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Nov 28 03:22:34 localhost puppet-user[73964]: with Stdlib::Compat::Array. 
There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 28 03:22:34 localhost puppet-user[73964]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 28 03:22:34 localhost puppet-user[73964]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Nov 28 03:22:34 localhost puppet-user[73964]: with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 28 03:22:34 localhost puppet-user[73964]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 28 03:22:34 localhost puppet-user[73964]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Nov 28 03:22:34 localhost puppet-user[73964]: with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 28 03:22:34 localhost puppet-user[73964]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 28 03:22:34 localhost puppet-user[73964]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Nov 28 03:22:34 localhost puppet-user[73964]: with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Nov 28 03:22:34 localhost puppet-user[73964]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Nov 28 03:22:34 localhost puppet-user[73964]: Warning: Unknown variable: '::deployment_type'. 
(file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Nov 28 03:22:34 localhost puppet-user[73964]: Notice: Compiled catalog for np0005538513.localdomain in environment production in 0.22 seconds Nov 28 03:22:34 localhost puppet-user[73964]: Notice: Applied catalog in 0.43 seconds Nov 28 03:22:34 localhost puppet-user[73964]: Application: Nov 28 03:22:34 localhost puppet-user[73964]: Initial environment: production Nov 28 03:22:34 localhost puppet-user[73964]: Converged environment: production Nov 28 03:22:34 localhost puppet-user[73964]: Run mode: user Nov 28 03:22:34 localhost puppet-user[73964]: Changes: Nov 28 03:22:34 localhost puppet-user[73964]: Events: Nov 28 03:22:34 localhost puppet-user[73964]: Resources: Nov 28 03:22:34 localhost puppet-user[73964]: Total: 19 Nov 28 03:22:34 localhost puppet-user[73964]: Time: Nov 28 03:22:34 localhost puppet-user[73964]: Schedule: 0.00 Nov 28 03:22:34 localhost puppet-user[73964]: Package: 0.00 Nov 28 03:22:34 localhost puppet-user[73964]: Exec: 0.01 Nov 28 03:22:34 localhost puppet-user[73964]: Augeas: 0.01 Nov 28 03:22:34 localhost puppet-user[73964]: File: 0.02 Nov 28 03:22:34 localhost puppet-user[73964]: Service: 0.06 Nov 28 03:22:34 localhost puppet-user[73964]: Config retrieval: 0.28 Nov 28 03:22:34 localhost puppet-user[73964]: Transaction evaluation: 0.28 Nov 28 03:22:34 localhost puppet-user[73964]: Catalog application: 0.43 Nov 28 03:22:34 localhost puppet-user[73964]: Last run: 1764318154 Nov 28 03:22:34 localhost puppet-user[73964]: Filebucket: 0.00 Nov 28 03:22:34 localhost puppet-user[73964]: Total: 0.43 Nov 28 03:22:34 localhost puppet-user[73964]: Version: Nov 28 03:22:34 localhost puppet-user[73964]: Config: 1764318153 Nov 28 03:22:34 localhost puppet-user[73964]: Puppet: 7.10.0 Nov 28 03:22:34 localhost ansible-async_wrapper.py[73949]: Module complete (73949) Nov 28 03:22:34 localhost ansible-async_wrapper.py[73948]: Done in kid B. 
Nov 28 03:22:40 localhost python3[74107]: ansible-ansible.legacy.async_status Invoked with jid=298073833886.73945 mode=status _async_dir=/tmp/.ansible_async Nov 28 03:22:41 localhost python3[74123]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Nov 28 03:22:41 localhost python3[74139]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 03:22:42 localhost python3[74189]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:22:42 localhost python3[74207]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpy6ueq7k2 recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Nov 28 03:22:42 localhost python3[74237]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None 
selevel=None setype=None attributes=None Nov 28 03:22:43 localhost python3[74342]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None Nov 28 03:22:44 localhost python3[74361]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:22:45 localhost python3[74393]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Nov 28 03:22:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. 
Nov 28 03:22:45 localhost podman[74444]: 2025-11-28 08:22:45.988062484 +0000 UTC m=+0.086119268 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, batch=17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd) Nov 28 03:22:45 localhost podman[74444]: 2025-11-28 08:22:45.999361939 +0000 UTC m=+0.097418733 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, architecture=x86_64, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=collectd, 
release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:22:46 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. 
Nov 28 03:22:46 localhost python3[74443]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:22:46 localhost python3[74481]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:22:46 localhost python3[74543]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:22:46 localhost python3[74561]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:22:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:22:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:22:47 localhost podman[74624]: 2025-11-28 08:22:47.47912343 +0000 UTC m=+0.072594114 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, 
maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=) Nov 28 03:22:47 localhost podman[74624]: 2025-11-28 08:22:47.487265415 +0000 UTC m=+0.080736129 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., release=1761123044, name=rhosp17/openstack-iscsid, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, tcib_managed=true, config_id=tripleo_step3, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:22:47 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 03:22:47 localhost podman[74625]: 2025-11-28 08:22:47.548776379 +0000 UTC m=+0.138499255 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 28 03:22:47 localhost python3[74623]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:22:47 localhost podman[74625]: 2025-11-28 08:22:47.784328386 +0000 UTC m=+0.374051202 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, vendor=Red Hat, Inc., release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, version=17.1.12, vcs-type=git, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com) Nov 28 03:22:47 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. 
Nov 28 03:22:47 localhost python3[74687]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:22:48 localhost python3[74749]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:22:48 localhost python3[74767]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:22:49 localhost python3[74797]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 03:22:49 localhost systemd[1]: Reloading. Nov 28 03:22:49 localhost systemd-rc-local-generator[74820]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:22:49 localhost systemd-sysv-generator[74825]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:22:49 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:22:49 localhost python3[74883]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:22:50 localhost python3[74901]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:22:50 localhost python3[74963]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Nov 28 03:22:50 localhost python3[74981]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:22:51 localhost python3[75011]: ansible-systemd Invoked with name=netns-placeholder 
state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 03:22:51 localhost systemd[1]: Reloading. Nov 28 03:22:51 localhost systemd-rc-local-generator[75032]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:22:51 localhost systemd-sysv-generator[75036]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:22:51 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:22:51 localhost systemd[1]: Starting Create netns directory... Nov 28 03:22:51 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Nov 28 03:22:51 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Nov 28 03:22:51 localhost systemd[1]: Finished Create netns directory. Nov 28 03:22:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:22:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:22:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. 
Nov 28 03:22:52 localhost podman[75067]: 2025-11-28 08:22:52.250207202 +0000 UTC m=+0.084157357 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, 
io.openshift.expose-services=, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:22:52 localhost podman[75067]: 2025-11-28 08:22:52.303446346 +0000 UTC m=+0.137396501 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.buildah.version=1.41.4, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute) Nov 28 03:22:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. 
Nov 28 03:22:52 localhost podman[75070]: 2025-11-28 08:22:52.315004509 +0000 UTC m=+0.148811330 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, batch=17.1_20251118.1) Nov 28 03:22:52 localhost python3[75068]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Nov 28 03:22:52 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 03:22:52 localhost podman[75069]: 2025-11-28 08:22:52.3532198 +0000 UTC m=+0.187225547 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4) Nov 28 03:22:52 localhost podman[75118]: 2025-11-28 08:22:52.404938957 +0000 UTC m=+0.079441839 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, release=1761123044, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12) Nov 28 03:22:52 localhost podman[75069]: 2025-11-28 08:22:52.418325708 +0000 UTC m=+0.252331455 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, tcib_managed=true, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-cron, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Nov 28 03:22:52 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:22:52 localhost podman[75070]: 2025-11-28 08:22:52.475130424 +0000 UTC m=+0.308937285 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vcs-type=git, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi) Nov 28 03:22:52 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:22:52 localhost podman[75118]: 2025-11-28 08:22:52.743275165 +0000 UTC m=+0.417778047 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, container_name=nova_migration_target, io.buildah.version=1.41.4, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:22:52 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. 
Nov 28 03:22:53 localhost python3[75215]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step5 config_dir=/var/lib/tripleo-config/container-startup-config/step_5 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Nov 28 03:22:54 localhost podman[75253]: 2025-11-28 08:22:54.095537955 +0000 UTC m=+0.072706877 container create c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step5) Nov 28 03:22:54 localhost systemd[1]: Started libpod-conmon-c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.scope. Nov 28 03:22:54 localhost systemd[1]: Started libcrun container. 
Nov 28 03:22:54 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d078e66fd31ac4112ed049a2e764bba9cb8e12df988d1bfceb2a6cc73592ff91/merged/var/log/nova supports timestamps until 2038 (0x7fffffff) Nov 28 03:22:54 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d078e66fd31ac4112ed049a2e764bba9cb8e12df988d1bfceb2a6cc73592ff91/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 28 03:22:54 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d078e66fd31ac4112ed049a2e764bba9cb8e12df988d1bfceb2a6cc73592ff91/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Nov 28 03:22:54 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d078e66fd31ac4112ed049a2e764bba9cb8e12df988d1bfceb2a6cc73592ff91/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 03:22:54 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d078e66fd31ac4112ed049a2e764bba9cb8e12df988d1bfceb2a6cc73592ff91/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Nov 28 03:22:54 localhost podman[75253]: 2025-11-28 08:22:54.055784266 +0000 UTC m=+0.032953248 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Nov 28 03:22:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. 
Nov 28 03:22:54 localhost podman[75253]: 2025-11-28 08:22:54.180937961 +0000 UTC m=+0.158106993 container init c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, batch=17.1_20251118.1, container_name=nova_compute, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:22:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. 
Nov 28 03:22:54 localhost podman[75253]: 2025-11-28 08:22:54.214190927 +0000 UTC m=+0.191359939 container start c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step5, version=17.1.12, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, container_name=nova_compute) Nov 28 03:22:54 localhost systemd-logind[764]: Existing logind session ID 28 used by new audit session, ignoring. 
Nov 28 03:22:54 localhost python3[75215]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute --conmon-pidfile /run/nova_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env LIBGUESTFS_BACKEND=direct --env TRIPLEO_CONFIG_HASH=c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f --healthcheck-command /openstack/healthcheck 5672 --ipc host --label config_id=tripleo_step5 --label container_name=nova_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute.log --network host --privileged=True --ulimit nofile=131072 --ulimit memlock=67108864 --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/nova:/var/log/nova --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /dev:/dev --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /run/nova:/run/nova:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /sys/class/net:/sys/class/net --volume /sys/bus/pci:/sys/bus/pci --volume /boot:/boot:ro --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Nov 28 03:22:54 localhost systemd[1]: Created slice User Slice of UID 0. Nov 28 03:22:54 localhost systemd[1]: Starting User Runtime Directory /run/user/0... 
Nov 28 03:22:54 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Nov 28 03:22:54 localhost systemd[1]: Starting User Manager for UID 0... Nov 28 03:22:54 localhost systemd[75288]: Queued start job for default target Main User Target. Nov 28 03:22:54 localhost systemd[75288]: Created slice User Application Slice. Nov 28 03:22:54 localhost systemd[75288]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Nov 28 03:22:54 localhost systemd[75288]: Started Daily Cleanup of User's Temporary Directories. Nov 28 03:22:54 localhost systemd[75288]: Reached target Paths. Nov 28 03:22:54 localhost systemd[75288]: Reached target Timers. Nov 28 03:22:54 localhost systemd[75288]: Starting D-Bus User Message Bus Socket... Nov 28 03:22:54 localhost podman[75275]: 2025-11-28 08:22:54.448573836 +0000 UTC m=+0.223567860 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, architecture=x86_64, release=1761123044, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-type=git, container_name=nova_compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 03:22:54 localhost systemd[75288]: Starting Create User's Volatile Files and Directories... Nov 28 03:22:54 localhost systemd[75288]: Finished Create User's Volatile Files and Directories. Nov 28 03:22:54 localhost systemd[75288]: Listening on D-Bus User Message Bus Socket. Nov 28 03:22:54 localhost systemd[75288]: Reached target Sockets. Nov 28 03:22:54 localhost systemd[75288]: Reached target Basic System. Nov 28 03:22:54 localhost systemd[75288]: Reached target Main User Target. Nov 28 03:22:54 localhost systemd[75288]: Startup finished in 131ms. Nov 28 03:22:54 localhost systemd[1]: Started User Manager for UID 0. Nov 28 03:22:54 localhost systemd[1]: Started Session c10 of User root. Nov 28 03:22:54 localhost podman[75275]: 2025-11-28 08:22:54.521508459 +0000 UTC m=+0.296502513 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, tcib_managed=true, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, vendor=Red Hat, Inc.) Nov 28 03:22:54 localhost podman[75275]: unhealthy Nov 28 03:22:54 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:22:54 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed with result 'exit-code'. Nov 28 03:22:54 localhost systemd[1]: session-c10.scope: Deactivated successfully. Nov 28 03:22:54 localhost podman[75375]: 2025-11-28 08:22:54.676853925 +0000 UTC m=+0.099086627 container create b23471aa557660c5dac2cb5b1a3e4f53c2898babc5e8dcd2f3287d589ad38589 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, container_name=nova_wait_for_compute_service, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step5) Nov 28 03:22:54 localhost podman[75375]: 2025-11-28 08:22:54.620633787 +0000 UTC m=+0.042866509 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Nov 28 03:22:54 localhost systemd[1]: Started libpod-conmon-b23471aa557660c5dac2cb5b1a3e4f53c2898babc5e8dcd2f3287d589ad38589.scope. Nov 28 03:22:54 localhost systemd[1]: Started libcrun container. 
Nov 28 03:22:54 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/722caa9ca8bac97f5e9f74c37c16703e7dbaf1a59ddfd48c9720d9c7ac6caca3/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff) Nov 28 03:22:54 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/722caa9ca8bac97f5e9f74c37c16703e7dbaf1a59ddfd48c9720d9c7ac6caca3/merged/var/log/nova supports timestamps until 2038 (0x7fffffff) Nov 28 03:22:54 localhost podman[75375]: 2025-11-28 08:22:54.7511433 +0000 UTC m=+0.173376002 container init b23471aa557660c5dac2cb5b1a3e4f53c2898babc5e8dcd2f3287d589ad38589 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, container_name=nova_wait_for_compute_service, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git) Nov 28 03:22:54 localhost podman[75375]: 2025-11-28 08:22:54.760909608 +0000 UTC m=+0.183142310 container start b23471aa557660c5dac2cb5b1a3e4f53c2898babc5e8dcd2f3287d589ad38589 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_wait_for_compute_service, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, architecture=x86_64, io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, config_id=tripleo_step5) Nov 28 03:22:54 localhost podman[75375]: 2025-11-28 08:22:54.761152265 +0000 UTC m=+0.183384977 container attach b23471aa557660c5dac2cb5b1a3e4f53c2898babc5e8dcd2f3287d589ad38589 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, 
name=nova_wait_for_compute_service, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, container_name=nova_wait_for_compute_service, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vcs-type=git, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 03:22:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:22:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:22:56 localhost systemd[1]: tmp-crun.0ulbkM.mount: Deactivated successfully. Nov 28 03:22:56 localhost podman[75399]: 2025-11-28 08:22:56.869455379 +0000 UTC m=+0.099636964 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, 
Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64) Nov 28 03:22:56 localhost podman[75398]: 2025-11-28 08:22:56.907271798 +0000 UTC m=+0.136374970 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:22:56 localhost podman[75399]: 2025-11-28 08:22:56.924375996 +0000 UTC m=+0.154557551 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, config_id=tripleo_step4, url=https://www.redhat.com, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:22:56 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:22:56 localhost podman[75398]: 2025-11-28 08:22:56.979583362 +0000 UTC m=+0.208686474 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, version=17.1.12, vendor=Red Hat, Inc., release=1761123044, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:22:56 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:23:04 localhost systemd[1]: Stopping User Manager for UID 0... Nov 28 03:23:04 localhost systemd[75288]: Activating special unit Exit the Session... Nov 28 03:23:04 localhost systemd[75288]: Stopped target Main User Target. Nov 28 03:23:04 localhost systemd[75288]: Stopped target Basic System. Nov 28 03:23:04 localhost systemd[75288]: Stopped target Paths. 
Nov 28 03:23:04 localhost systemd[75288]: Stopped target Sockets. Nov 28 03:23:04 localhost systemd[75288]: Stopped target Timers. Nov 28 03:23:04 localhost systemd[75288]: Stopped Daily Cleanup of User's Temporary Directories. Nov 28 03:23:04 localhost systemd[75288]: Closed D-Bus User Message Bus Socket. Nov 28 03:23:04 localhost systemd[75288]: Stopped Create User's Volatile Files and Directories. Nov 28 03:23:04 localhost systemd[75288]: Removed slice User Application Slice. Nov 28 03:23:04 localhost systemd[75288]: Reached target Shutdown. Nov 28 03:23:04 localhost systemd[75288]: Finished Exit the Session. Nov 28 03:23:04 localhost systemd[75288]: Reached target Exit the Session. Nov 28 03:23:04 localhost systemd[1]: user@0.service: Deactivated successfully. Nov 28 03:23:04 localhost systemd[1]: Stopped User Manager for UID 0. Nov 28 03:23:04 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Nov 28 03:23:04 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Nov 28 03:23:04 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Nov 28 03:23:04 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Nov 28 03:23:04 localhost systemd[1]: Removed slice User Slice of UID 0. Nov 28 03:23:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. 
Nov 28 03:23:16 localhost podman[75524]: 2025-11-28 08:23:16.847336741 +0000 UTC m=+0.082003429 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, distribution-scope=public, config_id=tripleo_step3, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=) Nov 28 03:23:16 localhost podman[75524]: 2025-11-28 08:23:16.886510543 +0000 UTC m=+0.121177301 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, config_id=tripleo_step3, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-collectd-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:23:16 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:23:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:23:17 localhost systemd[1]: tmp-crun.0yiD7S.mount: Deactivated successfully. 
Nov 28 03:23:17 localhost podman[75545]: 2025-11-28 08:23:17.844010631 +0000 UTC m=+0.078701976 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, container_name=iscsid, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3) Nov 28 03:23:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:23:17 localhost podman[75545]: 2025-11-28 08:23:17.859445316 +0000 UTC m=+0.094136651 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, release=1761123044, architecture=x86_64, io.openshift.expose-services=, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true) Nov 28 03:23:17 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:23:17 localhost systemd[1]: tmp-crun.JLrCmr.mount: Deactivated successfully. 
Nov 28 03:23:17 localhost podman[75564]: 2025-11-28 08:23:17.948247998 +0000 UTC m=+0.086949905 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, container_name=metrics_qdr, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_id=tripleo_step1, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=) Nov 28 03:23:18 localhost podman[75564]: 2025-11-28 08:23:18.133692439 +0000 UTC m=+0.272394346 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.4, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1) Nov 28 03:23:18 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:23:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:23:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:23:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:23:22 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... 
Nov 28 03:23:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:23:22 localhost recover_tripleo_nova_virtqemud[75613]: 61397 Nov 28 03:23:22 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 28 03:23:22 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 28 03:23:22 localhost podman[75593]: 2025-11-28 08:23:22.841169703 +0000 UTC m=+0.076842448 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, 
build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 28 03:23:22 localhost podman[75593]: 2025-11-28 08:23:22.86432691 +0000 UTC m=+0.099999675 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-19T00:11:48Z, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, url=https://www.redhat.com, io.buildah.version=1.41.4, version=17.1.12, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 28 03:23:22 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 03:23:22 localhost podman[75595]: 2025-11-28 08:23:22.906409344 +0000 UTC m=+0.135726320 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible) Nov 28 03:23:22 localhost podman[75594]: 2025-11-28 08:23:22.946570366 +0000 UTC m=+0.177463411 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.component=openstack-cron-container, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team) Nov 28 03:23:22 localhost podman[75595]: 2025-11-28 08:23:22.959374869 +0000 UTC m=+0.188691815 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, tcib_managed=true, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container) Nov 28 03:23:22 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 03:23:22 localhost podman[75594]: 2025-11-28 08:23:22.979368578 +0000 UTC m=+0.210261713 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, io.openshift.expose-services=, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:23:22 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:23:23 localhost podman[75612]: 2025-11-28 08:23:23.0615092 +0000 UTC m=+0.278042873 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, container_name=nova_migration_target, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
release=1761123044, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:23:23 localhost podman[75612]: 2025-11-28 08:23:23.460695713 +0000 UTC m=+0.677229376 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, 
name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64) Nov 28 03:23:23 localhost systemd[1]: 
9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:23:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:23:24 localhost podman[75687]: 2025-11-28 08:23:24.844453584 +0000 UTC m=+0.077808868 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, io.openshift.expose-services=, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:23:24 localhost podman[75687]: 2025-11-28 08:23:24.928406294 +0000 UTC m=+0.161761548 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, 
description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=nova_compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, vcs-type=git, batch=17.1_20251118.1, architecture=x86_64, release=1761123044, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 03:23:24 localhost podman[75687]: unhealthy Nov 28 03:23:24 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:23:24 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed with result 'exit-code'. Nov 28 03:23:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:23:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:23:27 localhost podman[75710]: 2025-11-28 08:23:27.837099834 +0000 UTC m=+0.072673426 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, release=1761123044) Nov 28 03:23:27 localhost podman[75710]: 2025-11-28 08:23:27.884825115 +0000 UTC m=+0.120398677 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, vcs-type=git, container_name=ovn_controller, batch=17.1_20251118.1) Nov 28 03:23:27 localhost systemd[1]: tmp-crun.A0f2h0.mount: Deactivated successfully. Nov 28 03:23:27 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:23:27 localhost podman[75709]: 2025-11-28 08:23:27.909821411 +0000 UTC m=+0.147415787 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=) Nov 28 03:23:27 localhost podman[75709]: 2025-11-28 08:23:27.955007951 +0000 UTC m=+0.192602327 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, 
io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, tcib_managed=true, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 28 03:23:27 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:23:37 localhost sshd[75759]: main: sshd: ssh-rsa algorithm is disabled Nov 28 03:23:46 localhost systemd[1]: session-27.scope: Deactivated successfully. Nov 28 03:23:46 localhost systemd[1]: session-27.scope: Consumed 2.979s CPU time. Nov 28 03:23:46 localhost systemd-logind[764]: Session 27 logged out. Waiting for processes to exit. Nov 28 03:23:46 localhost systemd-logind[764]: Removed session 27. Nov 28 03:23:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:23:47 localhost systemd[1]: tmp-crun.wdczUu.mount: Deactivated successfully. 
Nov 28 03:23:47 localhost podman[75761]: 2025-11-28 08:23:47.860050277 +0000 UTC m=+0.093271209 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step3, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, release=1761123044, name=rhosp17/openstack-collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Nov 28 03:23:47 localhost podman[75761]: 2025-11-28 08:23:47.877074658 +0000 UTC m=+0.110295580 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.expose-services=, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step3, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container) Nov 28 03:23:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:23:47 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. 
Nov 28 03:23:47 localhost podman[75781]: 2025-11-28 08:23:47.960731689 +0000 UTC m=+0.058683137 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, version=17.1.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team) Nov 28 03:23:47 localhost podman[75781]: 2025-11-28 08:23:47.999053449 +0000 UTC m=+0.097004867 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, container_name=iscsid, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, vcs-type=git, version=17.1.12) Nov 28 03:23:48 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:23:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:23:48 localhost podman[75800]: 2025-11-28 08:23:48.8419099 +0000 UTC m=+0.078250460 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:23:49 localhost podman[75800]: 2025-11-28 08:23:49.058249312 +0000 UTC m=+0.294589792 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, architecture=x86_64, url=https://www.redhat.com, container_name=metrics_qdr, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:23:49 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:23:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:23:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:23:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:23:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. 
Nov 28 03:23:53 localhost systemd[1]: tmp-crun.CxWm2h.mount: Deactivated successfully. Nov 28 03:23:53 localhost podman[75831]: 2025-11-28 08:23:53.872713587 +0000 UTC m=+0.109751042 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, batch=17.1_20251118.1, vcs-type=git) Nov 28 03:23:53 localhost podman[75830]: 2025-11-28 08:23:53.830486864 +0000 UTC m=+0.073153978 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 28 03:23:53 localhost podman[75837]: 2025-11-28 08:23:53.858599438 +0000 UTC m=+0.089964192 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, tcib_managed=true, 
com.redhat.component=openstack-cron-container) Nov 28 03:23:53 localhost podman[75830]: 2025-11-28 08:23:53.913559696 +0000 UTC m=+0.156226810 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container) Nov 28 03:23:53 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:23:53 localhost podman[75837]: 2025-11-28 08:23:53.938103477 +0000 UTC m=+0.169468251 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=logrotate_crond, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron) Nov 28 03:23:53 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:23:53 localhost podman[75838]: 2025-11-28 08:23:53.946666999 +0000 UTC m=+0.173736497 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, config_id=tripleo_step4) Nov 28 03:23:53 localhost podman[75838]: 2025-11-28 08:23:53.9642978 +0000 UTC m=+0.191367288 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, container_name=ceilometer_agent_ipmi, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc.) Nov 28 03:23:53 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 03:23:54 localhost podman[75831]: 2025-11-28 08:23:54.216646527 +0000 UTC m=+0.453683992 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_migration_target, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:23:54 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:23:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:23:55 localhost podman[75923]: 2025-11-28 08:23:55.836709951 +0000 UTC m=+0.072588530 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, vcs-type=git, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:23:55 localhost podman[75923]: 2025-11-28 08:23:55.895698818 +0000 UTC m=+0.131577397 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, container_name=nova_compute, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:23:55 localhost podman[75923]: unhealthy Nov 28 03:23:55 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:23:55 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed with result 'exit-code'. Nov 28 03:23:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. 
Nov 28 03:23:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:23:58 localhost systemd[1]: tmp-crun.eIJfbb.mount: Deactivated successfully. Nov 28 03:23:58 localhost podman[75945]: 2025-11-28 08:23:58.839386135 +0000 UTC m=+0.076079171 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 
'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, batch=17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:23:58 localhost podman[75946]: 2025-11-28 08:23:58.893640941 +0000 UTC m=+0.126717162 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, release=1761123044, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller) Nov 28 03:23:58 localhost podman[75945]: 2025-11-28 08:23:58.911563041 +0000 UTC m=+0.148256067 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team) Nov 28 03:23:58 localhost podman[75946]: 2025-11-28 08:23:58.92032278 +0000 UTC m=+0.153398971 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 
'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team) Nov 28 03:23:58 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:23:58 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:24:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:24:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:24:18 localhost systemd[1]: tmp-crun.mkd0p4.mount: Deactivated successfully. 
Nov 28 03:24:18 localhost podman[76068]: 2025-11-28 08:24:18.864765736 +0000 UTC m=+0.097397970 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, version=17.1.12, architecture=x86_64, config_id=tripleo_step3, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container) Nov 28 03:24:18 localhost podman[76068]: 2025-11-28 08:24:18.880321311 +0000 UTC m=+0.112953555 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid) Nov 28 03:24:18 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 03:24:18 localhost podman[76069]: 2025-11-28 08:24:18.959868781 +0000 UTC m=+0.192929408 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., summary=Red Hat 
OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, vcs-type=git) Nov 28 03:24:18 localhost podman[76069]: 2025-11-28 08:24:18.973368411 +0000 UTC m=+0.206429018 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, build-date=2025-11-18T22:51:28Z) Nov 28 03:24:18 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:24:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:24:19 localhost podman[76106]: 2025-11-28 08:24:19.843078585 +0000 UTC m=+0.086331587 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, architecture=x86_64, 
build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, release=1761123044, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd) Nov 28 03:24:20 localhost podman[76106]: 2025-11-28 08:24:20.020246681 +0000 UTC m=+0.263499673 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64) Nov 28 03:24:20 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:24:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:24:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:24:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:24:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:24:24 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... 
Nov 28 03:24:24 localhost recover_tripleo_nova_virtqemud[76156]: 61397 Nov 28 03:24:24 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 28 03:24:24 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 28 03:24:24 localhost podman[76135]: 2025-11-28 08:24:24.842696581 +0000 UTC m=+0.075258955 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, version=17.1.12, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 28 03:24:24 localhost podman[76135]: 2025-11-28 08:24:24.874317087 +0000 UTC m=+0.106879501 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.buildah.version=1.41.4) Nov 28 03:24:24 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 03:24:25 localhost podman[76136]: 2025-11-28 08:24:24.999538 +0000 UTC m=+0.229902794 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, tcib_managed=true) Nov 28 03:24:25 localhost podman[76138]: 2025-11-28 08:24:25.030599818 +0000 UTC m=+0.253257796 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, version=17.1.12, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, release=1761123044, batch=17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 28 03:24:25 localhost podman[76137]: 2025-11-28 08:24:25.069595589 +0000 UTC m=+0.296649678 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat 
OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=logrotate_crond, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-cron, vcs-type=git) Nov 28 03:24:25 localhost podman[76137]: 2025-11-28 08:24:25.099972745 +0000 UTC m=+0.327026864 container exec_died 
bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, container_name=logrotate_crond, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, tcib_managed=true, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack 
Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Nov 28 03:24:25 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:24:25 localhost podman[76138]: 2025-11-28 08:24:25.108421274 +0000 UTC m=+0.331079252 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, release=1761123044, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, version=17.1.12, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.4, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 28 03:24:25 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 03:24:25 localhost podman[76136]: 2025-11-28 08:24:25.362603189 +0000 UTC m=+0.592967983 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, maintainer=OpenStack TripleO Team) Nov 28 03:24:25 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:24:25 localhost systemd[1]: tmp-crun.IRGTvo.mount: Deactivated successfully. Nov 28 03:24:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:24:26 localhost podman[76232]: 2025-11-28 08:24:26.840973785 +0000 UTC m=+0.079004314 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, url=https://www.redhat.com, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
distribution-scope=public, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:24:26 localhost podman[76232]: 2025-11-28 08:24:26.900429736 +0000 UTC m=+0.138460255 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, release=1761123044, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step5, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:24:26 localhost podman[76232]: unhealthy Nov 28 03:24:26 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:24:26 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed with result 'exit-code'. 
Nov 28 03:24:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:24:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:24:29 localhost podman[76254]: 2025-11-28 08:24:29.83140032 +0000 UTC m=+0.072948731 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 28 03:24:29 localhost podman[76254]: 2025-11-28 08:24:29.875476521 +0000 UTC m=+0.117024912 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12) Nov 28 03:24:29 localhost systemd[1]: tmp-crun.0ycmAw.mount: Deactivated successfully. Nov 28 03:24:29 localhost podman[76255]: 2025-11-28 08:24:29.893587808 +0000 UTC m=+0.130020097 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=ovn_controller, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, config_id=tripleo_step4, version=17.1.12, build-date=2025-11-18T23:34:05Z, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., batch=17.1_20251118.1) Nov 28 03:24:29 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:24:29 localhost podman[76255]: 2025-11-28 08:24:29.914589766 +0000 UTC m=+0.151022035 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, version=17.1.12, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller) Nov 28 03:24:29 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:24:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:24:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. 
Nov 28 03:24:49 localhost podman[76302]: 2025-11-28 08:24:49.839043793 +0000 UTC m=+0.079407327 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, release=1761123044, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Nov 28 03:24:49 localhost podman[76303]: 2025-11-28 08:24:49.888892999 +0000 UTC m=+0.127039872 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, config_id=tripleo_step3, vcs-type=git, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd) Nov 28 03:24:49 localhost podman[76302]: 2025-11-28 08:24:49.924233283 +0000 UTC m=+0.164596847 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, vcs-type=git, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public) Nov 28 03:24:49 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:24:49 localhost podman[76303]: 2025-11-28 08:24:49.976269479 +0000 UTC m=+0.214416312 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.buildah.version=1.41.4, url=https://www.redhat.com, managed_by=tripleo_ansible, distribution-scope=public, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:24:49 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:24:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:24:50 localhost podman[76339]: 2025-11-28 08:24:50.848723311 +0000 UTC m=+0.087418842 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, container_name=metrics_qdr, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Nov 28 03:24:51 localhost podman[76339]: 2025-11-28 08:24:51.028107967 +0000 UTC m=+0.266803568 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_id=tripleo_step1, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:24:51 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:24:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:24:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:24:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:24:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. 
Nov 28 03:24:55 localhost podman[76368]: 2025-11-28 08:24:55.815190832 +0000 UTC m=+0.061556080 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git) Nov 28 03:24:55 localhost podman[76368]: 2025-11-28 08:24:55.836301804 +0000 UTC m=+0.082667012 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, batch=17.1_20251118.1, version=17.1.12, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute) Nov 28 03:24:55 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 03:24:55 localhost podman[76375]: 2025-11-28 08:24:55.875123618 +0000 UTC m=+0.111891241 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, release=1761123044, architecture=x86_64, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.12, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, io.openshift.expose-services=, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.buildah.version=1.41.4) Nov 28 03:24:55 localhost podman[76375]: 2025-11-28 08:24:55.88651221 +0000 UTC m=+0.123279813 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, batch=17.1_20251118.1) Nov 28 03:24:55 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:24:55 localhost podman[76369]: 2025-11-28 08:24:55.926164322 +0000 UTC m=+0.167402106 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step4, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1) Nov 28 03:24:55 localhost podman[76379]: 2025-11-28 08:24:55.989148845 +0000 UTC m=+0.223640585 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:24:56 localhost podman[76379]: 2025-11-28 08:24:56.043445452 +0000 UTC m=+0.277937122 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, 
Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, distribution-scope=public, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, version=17.1.12, io.buildah.version=1.41.4, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 28 03:24:56 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 03:24:56 localhost podman[76369]: 2025-11-28 08:24:56.26123694 +0000 UTC m=+0.502474694 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, 
konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container) Nov 28 03:24:56 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:24:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:24:57 localhost systemd[1]: tmp-crun.kQbXOK.mount: Deactivated successfully. 
Nov 28 03:24:57 localhost podman[76462]: 2025-11-28 08:24:57.849672527 +0000 UTC m=+0.087832025 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute) Nov 28 03:24:57 localhost podman[76462]: 2025-11-28 08:24:57.910351757 +0000 UTC m=+0.148511245 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, io.openshift.expose-services=, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, release=1761123044, vcs-type=git, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:24:57 localhost podman[76462]: unhealthy Nov 28 03:24:57 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:24:57 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed with result 'exit-code'. Nov 28 03:25:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:25:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:25:00 localhost podman[76485]: 2025-11-28 08:25:00.843098936 +0000 UTC m=+0.077124224 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Nov 28 03:25:00 localhost podman[76484]: 2025-11-28 08:25:00.894442369 +0000 UTC m=+0.131592927 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, config_id=tripleo_step4) Nov 28 03:25:00 localhost podman[76485]: 2025-11-28 08:25:00.918375781 +0000 UTC m=+0.152401079 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044) Nov 28 03:25:00 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. 
Nov 28 03:25:00 localhost podman[76484]: 2025-11-28 08:25:00.97400098 +0000 UTC m=+0.211151568 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, vcs-type=git, batch=17.1_20251118.1, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=) Nov 28 03:25:00 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:25:16 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 28 03:25:16 localhost recover_tripleo_nova_virtqemud[76549]: 61397 Nov 28 03:25:16 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 28 03:25:16 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 28 03:25:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:25:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:25:20 localhost systemd[1]: tmp-crun.1s5MRS.mount: Deactivated successfully. 
Nov 28 03:25:20 localhost podman[76614]: 2025-11-28 08:25:20.917556778 +0000 UTC m=+0.142609967 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible) Nov 28 03:25:20 localhost podman[76613]: 2025-11-28 08:25:20.887733519 +0000 UTC m=+0.112680545 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, config_id=tripleo_step3, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, architecture=x86_64, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:25:20 localhost podman[76614]: 2025-11-28 08:25:20.953300255 +0000 UTC m=+0.178353434 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public) Nov 28 03:25:20 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:25:20 localhost podman[76613]: 2025-11-28 08:25:20.972305279 +0000 UTC m=+0.197252225 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Nov 28 03:25:20 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:25:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:25:21 localhost podman[76650]: 2025-11-28 08:25:21.849451431 +0000 UTC m=+0.086579985 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, managed_by=tripleo_ansible) Nov 28 03:25:21 localhost systemd[1]: tmp-crun.DqiLNp.mount: Deactivated successfully. Nov 28 03:25:22 localhost podman[76650]: 2025-11-28 08:25:22.046930763 +0000 UTC m=+0.284059267 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, batch=17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat 
OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64) Nov 28 03:25:22 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:25:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:25:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:25:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. 
Nov 28 03:25:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:25:26 localhost systemd[1]: tmp-crun.MU2j7h.mount: Deactivated successfully. Nov 28 03:25:26 localhost podman[76678]: 2025-11-28 08:25:26.864067424 +0000 UTC m=+0.099887478 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, container_name=ceilometer_agent_compute, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, distribution-scope=public) Nov 28 03:25:26 localhost podman[76679]: 2025-11-28 08:25:26.895172383 +0000 UTC m=+0.126252307 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 03:25:26 localhost podman[76678]: 2025-11-28 08:25:26.900384649 +0000 UTC m=+0.136204683 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, release=1761123044, architecture=x86_64, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, url=https://www.redhat.com, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc.) Nov 28 03:25:26 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:25:26 localhost podman[76680]: 2025-11-28 08:25:26.914427006 +0000 UTC m=+0.143511116 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, 
container_name=logrotate_crond, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, maintainer=OpenStack TripleO Team) Nov 28 03:25:26 localhost podman[76686]: 2025-11-28 08:25:26.962578289 +0000 UTC m=+0.186316239 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 28 03:25:26 localhost podman[76680]: 2025-11-28 08:25:26.977895225 +0000 UTC m=+0.206979365 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.buildah.version=1.41.4, 
version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, release=1761123044, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, name=rhosp17/openstack-cron, 
vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:25:26 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:25:26 localhost podman[76686]: 2025-11-28 08:25:26.996823908 +0000 UTC m=+0.220561908 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, release=1761123044, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.expose-services=, 
com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, architecture=x86_64) Nov 28 03:25:27 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:25:27 localhost podman[76679]: 2025-11-28 08:25:27.264589485 +0000 UTC m=+0.495669459 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, container_name=nova_migration_target, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git) Nov 28 03:25:27 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:25:27 localhost systemd[1]: tmp-crun.CVdkAQ.mount: Deactivated successfully. Nov 28 03:25:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. 
Nov 28 03:25:28 localhost podman[76768]: 2025-11-28 08:25:28.855416758 +0000 UTC m=+0.089081645 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=nova_compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, release=1761123044, version=17.1.12, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:25:28 localhost podman[76768]: 2025-11-28 08:25:28.926450158 +0000 UTC m=+0.160115035 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red 
Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_id=tripleo_step5, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, managed_by=tripleo_ansible) Nov 28 03:25:28 localhost podman[76768]: unhealthy Nov 28 03:25:28 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:25:28 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed with result 'exit-code'. Nov 28 03:25:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:25:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:25:31 localhost podman[76791]: 2025-11-28 08:25:31.830062481 +0000 UTC m=+0.064558884 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, container_name=ovn_controller, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 28 03:25:31 localhost podman[76791]: 2025-11-28 08:25:31.877247652 +0000 UTC m=+0.111744105 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, container_name=ovn_controller, vendor=Red Hat, Inc., 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public) Nov 28 03:25:31 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:25:31 localhost podman[76790]: 2025-11-28 08:25:31.897591749 +0000 UTC m=+0.133413324 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 
'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent) Nov 28 03:25:31 localhost podman[76790]: 2025-11-28 08:25:31.937100026 +0000 UTC m=+0.172921651 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, container_name=ovn_metadata_agent, 
io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-19T00:14:25Z, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vendor=Red Hat, Inc.) Nov 28 03:25:31 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:25:45 localhost sshd[76838]: main: sshd: ssh-rsa algorithm is disabled Nov 28 03:25:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:25:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. 
Nov 28 03:25:51 localhost podman[76840]: 2025-11-28 08:25:51.851190594 +0000 UTC m=+0.087141022 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.buildah.version=1.41.4, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container) Nov 28 03:25:51 localhost podman[76840]: 2025-11-28 08:25:51.88531191 +0000 UTC m=+0.121262308 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-18T23:44:13Z) Nov 28 03:25:51 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 03:25:51 localhost podman[76841]: 2025-11-28 08:25:51.90512924 +0000 UTC m=+0.138994213 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, 
container_name=collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, vcs-type=git, version=17.1.12, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container) Nov 28 03:25:51 localhost podman[76841]: 2025-11-28 08:25:51.918426383 +0000 UTC m=+0.152291356 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, release=1761123044) Nov 28 03:25:51 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:25:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:25:52 localhost podman[76881]: 2025-11-28 08:25:52.828920535 +0000 UTC m=+0.073215449 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, version=17.1.12, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
config_id=tripleo_step1, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:25:53 localhost podman[76881]: 2025-11-28 08:25:53.020257152 +0000 UTC m=+0.264551996 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, batch=17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1) Nov 28 03:25:53 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:25:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:25:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:25:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:25:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. 
Nov 28 03:25:57 localhost systemd[1]: tmp-crun.Pz9Wsq.mount: Deactivated successfully. Nov 28 03:25:57 localhost podman[76937]: 2025-11-28 08:25:57.362462736 +0000 UTC m=+0.087872586 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1) Nov 28 03:25:57 localhost podman[76936]: 2025-11-28 08:25:57.336218011 +0000 UTC m=+0.069644346 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, distribution-scope=public, name=rhosp17/openstack-cron, vcs-type=git, container_name=logrotate_crond, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, release=1761123044) Nov 28 03:25:57 localhost podman[76937]: 2025-11-28 08:25:57.443462152 +0000 UTC m=+0.168872012 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc.) Nov 28 03:25:57 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 03:25:57 localhost podman[76972]: 2025-11-28 08:25:57.446100606 +0000 UTC m=+0.138791636 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, distribution-scope=public, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.41.4) Nov 28 03:25:57 localhost podman[76935]: 2025-11-28 08:25:57.5015481 +0000 UTC m=+0.233230129 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible) Nov 28 03:25:57 localhost podman[76936]: 2025-11-28 08:25:57.523309043 +0000 UTC m=+0.256735328 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, tcib_managed=true, container_name=logrotate_crond, release=1761123044, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron) Nov 28 03:25:57 localhost systemd[1]: 
bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:25:57 localhost podman[76935]: 2025-11-28 08:25:57.556325572 +0000 UTC m=+0.288007551 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, version=17.1.12, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Nov 28 03:25:57 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 03:25:57 localhost podman[76972]: 2025-11-28 08:25:57.840399989 +0000 UTC m=+0.533091009 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, container_name=nova_migration_target, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.expose-services=, 
vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute) Nov 28 03:25:57 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:25:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:25:59 localhost podman[77095]: 2025-11-28 08:25:59.841330077 +0000 UTC m=+0.074922124 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, architecture=x86_64, container_name=nova_compute, io.buildah.version=1.41.4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-nova-compute, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:25:59 localhost podman[77095]: 2025-11-28 08:25:59.873532792 +0000 UTC m=+0.107124799 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, release=1761123044, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 03:25:59 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:26:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:26:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:26:02 localhost podman[77124]: 2025-11-28 08:26:02.836324877 +0000 UTC m=+0.072816937 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, 
container_name=ovn_metadata_agent, managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Nov 28 03:26:02 localhost podman[77124]: 2025-11-28 08:26:02.891893565 +0000 UTC m=+0.128385625 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, 
build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team) Nov 28 03:26:02 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:26:02 localhost podman[77125]: 2025-11-28 08:26:02.893262868 +0000 UTC m=+0.127688882 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20251118.1, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 
17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 28 03:26:02 localhost podman[77125]: 2025-11-28 08:26:02.976505476 +0000 UTC m=+0.210931530 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 
17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-type=git, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller) Nov 28 03:26:02 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:26:07 localhost systemd[1]: libpod-b23471aa557660c5dac2cb5b1a3e4f53c2898babc5e8dcd2f3287d589ad38589.scope: Deactivated successfully. Nov 28 03:26:07 localhost podman[75375]: 2025-11-28 08:26:07.133983394 +0000 UTC m=+192.556216126 container died b23471aa557660c5dac2cb5b1a3e4f53c2898babc5e8dcd2f3287d589ad38589 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 
'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, version=17.1.12, container_name=nova_wait_for_compute_service, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=) Nov 28 03:26:07 localhost systemd[1]: tmp-crun.PhvEsT.mount: Deactivated successfully. Nov 28 03:26:07 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b23471aa557660c5dac2cb5b1a3e4f53c2898babc5e8dcd2f3287d589ad38589-userdata-shm.mount: Deactivated successfully. Nov 28 03:26:07 localhost systemd[1]: var-lib-containers-storage-overlay-722caa9ca8bac97f5e9f74c37c16703e7dbaf1a59ddfd48c9720d9c7ac6caca3-merged.mount: Deactivated successfully. 
Nov 28 03:26:07 localhost podman[77172]: 2025-11-28 08:26:07.227750637 +0000 UTC m=+0.082610899 container cleanup b23471aa557660c5dac2cb5b1a3e4f53c2898babc5e8dcd2f3287d589ad38589 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, version=17.1.12, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_wait_for_compute_service, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4) Nov 28 03:26:07 localhost systemd[1]: libpod-conmon-b23471aa557660c5dac2cb5b1a3e4f53c2898babc5e8dcd2f3287d589ad38589.scope: Deactivated successfully. Nov 28 03:26:07 localhost python3[75215]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_wait_for_compute_service --conmon-pidfile /run/nova_wait_for_compute_service.pid --detach=False --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env __OS_DEBUG=true --env TRIPLEO_CONFIG_HASH=0f0904943dda1bf1d123bdf96d71020f --label config_id=tripleo_step5 --label container_name=nova_wait_for_compute_service --label managed_by=tripleo_ansible --label config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_wait_for_compute_service.log --network host --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/nova:/var/log/nova --volume /var/lib/container-config-scripts:/container-config-scripts registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Nov 28 03:26:07 localhost python3[77228]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:26:08 localhost python3[77244]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True 
get_attributes=True checksum_algorithm=sha1 Nov 28 03:26:08 localhost python3[77305]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1764318368.165464-119283-166483390457805/source dest=/etc/systemd/system/tripleo_nova_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:26:09 localhost python3[77321]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 28 03:26:09 localhost systemd[1]: Reloading. Nov 28 03:26:09 localhost systemd-rc-local-generator[77342]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:26:09 localhost systemd-sysv-generator[77346]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:26:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:26:10 localhost python3[77373]: ansible-systemd Invoked with state=restarted name=tripleo_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 03:26:10 localhost systemd[1]: Reloading. Nov 28 03:26:10 localhost systemd-rc-local-generator[77400]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:26:10 localhost systemd-sysv-generator[77403]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:26:10 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:26:10 localhost systemd[1]: Starting nova_compute container... Nov 28 03:26:10 localhost tripleo-start-podman-container[77413]: Creating additional drop-in dependency for "nova_compute" (c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6) Nov 28 03:26:10 localhost systemd[1]: Reloading. Nov 28 03:26:10 localhost systemd-rc-local-generator[77473]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 03:26:10 localhost systemd-sysv-generator[77478]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 03:26:10 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 03:26:11 localhost systemd[1]: Started nova_compute container. 
Nov 28 03:26:11 localhost python3[77512]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks5.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:26:12 localhost python3[77633]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks5.json short_hostname=np0005538513 step=5 update_config_hash_only=False Nov 28 03:26:13 localhost python3[77649]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 03:26:13 localhost python3[77665]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_5 config_pattern=container-puppet-*.json config_overrides={} debug=True Nov 28 03:26:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:26:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. 
Nov 28 03:26:22 localhost podman[77745]: 2025-11-28 08:26:22.859149265 +0000 UTC m=+0.090881233 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, 
release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:26:22 localhost podman[77745]: 2025-11-28 08:26:22.873267065 +0000 UTC m=+0.104999022 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, release=1761123044, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:26:22 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:26:22 localhost systemd[1]: tmp-crun.o4X1h3.mount: Deactivated successfully. 
Nov 28 03:26:22 localhost podman[77744]: 2025-11-28 08:26:22.965821239 +0000 UTC m=+0.198180006 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, release=1761123044, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step3, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 28 03:26:22 localhost podman[77744]: 2025-11-28 08:26:22.978601175 +0000 UTC m=+0.210959902 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.component=openstack-iscsid-container, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, version=17.1.12, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team) Nov 28 03:26:22 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:26:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:26:23 localhost podman[77782]: 2025-11-28 08:26:23.843043543 +0000 UTC m=+0.077766835 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, release=1761123044, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd) Nov 28 03:26:24 localhost podman[77782]: 2025-11-28 08:26:24.068446022 +0000 UTC m=+0.303169244 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, config_id=tripleo_step1, version=17.1.12, io.openshift.expose-services=, container_name=metrics_qdr, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:26:24 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:26:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:26:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:26:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:26:27 localhost systemd[1]: tmp-crun.Xej9LQ.mount: Deactivated successfully. 
Nov 28 03:26:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:26:27 localhost systemd[1]: tmp-crun.K0Sh2l.mount: Deactivated successfully. Nov 28 03:26:27 localhost podman[77815]: 2025-11-28 08:26:27.905950542 +0000 UTC m=+0.139718575 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, config_id=tripleo_step4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Nov 28 03:26:27 localhost podman[77815]: 2025-11-28 08:26:27.94047328 +0000 UTC m=+0.174241273 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, release=1761123044, batch=17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 28 03:26:27 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 03:26:27 localhost podman[77813]: 2025-11-28 08:26:27.958325899 +0000 UTC m=+0.193497156 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.openshift.expose-services=, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, 
url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 28 03:26:27 localhost podman[77814]: 2025-11-28 08:26:27.877342172 +0000 UTC m=+0.111918581 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step4, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team) Nov 28 03:26:28 localhost podman[77814]: 2025-11-28 08:26:28.011645925 +0000 UTC m=+0.246222334 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
managed_by=tripleo_ansible, architecture=x86_64, release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron) Nov 28 03:26:28 localhost podman[77813]: 2025-11-28 08:26:28.018443881 +0000 UTC m=+0.253615148 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_id=tripleo_step4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git) Nov 28 03:26:28 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:26:28 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:26:28 localhost podman[77854]: 2025-11-28 08:26:28.023347977 +0000 UTC m=+0.135786080 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1761123044, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, container_name=nova_migration_target, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:26:28 localhost podman[77854]: 2025-11-28 08:26:28.397435897 +0000 UTC m=+0.509874020 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, container_name=nova_migration_target, architecture=x86_64, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.buildah.version=1.41.4, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:26:28 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:26:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:26:30 localhost systemd[1]: tmp-crun.Ez8SSj.mount: Deactivated successfully. Nov 28 03:26:30 localhost podman[77904]: 2025-11-28 08:26:30.853630157 +0000 UTC m=+0.091458231 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:26:30 localhost podman[77904]: 2025-11-28 08:26:30.880219103 +0000 UTC m=+0.118047197 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, release=1761123044, batch=17.1_20251118.1) Nov 28 03:26:30 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:26:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:26:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:26:33 localhost podman[77929]: 2025-11-28 08:26:33.844931509 +0000 UTC m=+0.082953439 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, container_name=ovn_controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, release=1761123044, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Nov 28 03:26:33 localhost podman[77929]: 2025-11-28 08:26:33.893807074 +0000 UTC m=+0.131829044 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible, container_name=ovn_controller) Nov 28 03:26:33 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:26:33 localhost podman[77928]: 2025-11-28 08:26:33.894642771 +0000 UTC m=+0.135414979 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, vcs-type=git, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Nov 28 03:26:33 localhost podman[77928]: 2025-11-28 08:26:33.979536131 +0000 UTC m=+0.220308369 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1761123044, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 28 03:26:33 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:26:41 localhost sshd[77976]: main: sshd: ssh-rsa algorithm is disabled Nov 28 03:26:41 localhost systemd-logind[764]: New session 33 of user zuul. Nov 28 03:26:41 localhost systemd[1]: Started Session 33 of User zuul. Nov 28 03:26:42 localhost python3[78085]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 28 03:26:49 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 28 03:26:49 localhost recover_tripleo_nova_virtqemud[78350]: 61397 Nov 28 03:26:49 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 28 03:26:49 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Nov 28 03:26:49 localhost python3[78348]: ansible-ansible.legacy.dnf Invoked with name=['iptables'] allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None state=None Nov 28 03:26:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:26:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:26:53 localhost podman[78368]: 2025-11-28 08:26:53.858312386 +0000 UTC m=+0.094098783 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z) Nov 28 03:26:53 localhost podman[78368]: 2025-11-28 08:26:53.895476648 +0000 UTC m=+0.131263105 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 
17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, container_name=collectd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step3) Nov 28 03:26:53 localhost systemd[1]: tmp-crun.5gbyYP.mount: Deactivated successfully. Nov 28 03:26:53 localhost podman[78367]: 2025-11-28 08:26:53.919304637 +0000 UTC m=+0.157166801 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vcs-type=git, release=1761123044, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team) Nov 28 03:26:53 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. 
Nov 28 03:26:53 localhost podman[78367]: 2025-11-28 08:26:53.958360529 +0000 UTC m=+0.196222713 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1) Nov 28 03:26:53 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:26:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:26:54 localhost podman[78407]: 2025-11-28 08:26:54.840002974 +0000 UTC m=+0.079126179 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc.) 
Nov 28 03:26:55 localhost podman[78407]: 2025-11-28 08:26:55.05648047 +0000 UTC m=+0.295603645 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, batch=17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, container_name=metrics_qdr, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z) Nov 28 03:26:55 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:26:56 localhost python3[78512]: ansible-ansible.builtin.iptables Invoked with action=insert chain=INPUT comment=allow ssh access for zuul executor in_interface=eth0 jump=ACCEPT protocol=tcp source=38.102.83.114 table=filter state=present ip_version=ipv4 match=[] destination_ports=[] ctstate=[] syn=ignore flush=False chain_management=False numeric=False rule_num=None wait=None to_source=None destination=None to_destination=None tcp_flags=None gateway=None log_prefix=None log_level=None goto=None out_interface=None fragment=None set_counters=None source_port=None destination_port=None to_ports=None set_dscp_mark=None set_dscp_mark_class=None src_range=None dst_range=None match_set=None match_set_flags=None limit=None limit_burst=None uid_owner=None gid_owner=None reject_with=None icmp_type=None policy=None Nov 28 03:26:56 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled Nov 28 03:26:56 localhost systemd-journald[47227]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 81.1 (270 of 333 items), suggesting rotation. 
Nov 28 03:26:56 localhost systemd-journald[47227]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating. Nov 28 03:26:56 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 28 03:26:56 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 28 03:26:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:26:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:26:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:26:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. 
Nov 28 03:26:58 localhost podman[78583]: 2025-11-28 08:26:58.858661595 +0000 UTC m=+0.085492960 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, url=https://www.redhat.com, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, io.openshift.expose-services=, description=Red Hat OpenStack 
Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:26:58 localhost podman[78583]: 2025-11-28 08:26:58.898515333 +0000 UTC m=+0.125346698 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, version=17.1.12) Nov 28 03:26:58 localhost systemd[1]: tmp-crun.CjOJH2.mount: Deactivated successfully. Nov 28 03:26:58 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:26:58 localhost podman[78581]: 2025-11-28 08:26:58.927478675 +0000 UTC m=+0.156615823 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 
ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, vcs-type=git, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, url=https://www.redhat.com) Nov 28 03:26:58 localhost podman[78581]: 2025-11-28 08:26:58.961394013 +0000 UTC m=+0.190531171 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, release=1761123044, config_id=tripleo_step4, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team) Nov 28 03:26:58 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 03:26:58 localhost podman[78584]: 2025-11-28 08:26:58.977654761 +0000 UTC m=+0.199071964 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, tcib_managed=true, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 28 03:26:59 localhost podman[78582]: 2025-11-28 08:26:59.062900422 +0000 UTC m=+0.291699400 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, version=17.1.12, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, vcs-type=git) Nov 28 03:26:59 localhost podman[78584]: 2025-11-28 08:26:59.087385011 +0000 UTC m=+0.308802214 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, config_id=tripleo_step4, version=17.1.12, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git) Nov 28 03:26:59 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: 
Deactivated successfully. Nov 28 03:26:59 localhost podman[78582]: 2025-11-28 08:26:59.467540483 +0000 UTC m=+0.696339521 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, container_name=nova_migration_target, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:26:59 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:27:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:27:01 localhost systemd[1]: tmp-crun.OB6UTm.mount: Deactivated successfully. 
Nov 28 03:27:01 localhost podman[78678]: 2025-11-28 08:27:01.862176705 +0000 UTC m=+0.093908079 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, tcib_managed=true, config_id=tripleo_step5, io.openshift.expose-services=, container_name=nova_compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:27:01 localhost podman[78678]: 2025-11-28 08:27:01.889476044 +0000 UTC m=+0.121207478 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, batch=17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, config_id=tripleo_step5, architecture=x86_64, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:27:01 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:27:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:27:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:27:04 localhost systemd[1]: tmp-crun.9LEEbZ.mount: Deactivated successfully. 
Nov 28 03:27:04 localhost podman[78704]: 2025-11-28 08:27:04.845594736 +0000 UTC m=+0.086501624 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Nov 28 03:27:04 localhost podman[78704]: 2025-11-28 08:27:04.891975632 +0000 UTC m=+0.132882590 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, config_id=tripleo_step4, version=17.1.12, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, vcs-type=git, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:27:04 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:27:04 localhost podman[78705]: 2025-11-28 08:27:04.89975031 +0000 UTC m=+0.137550898 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, vendor=Red Hat, Inc., release=1761123044, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container) Nov 28 03:27:04 localhost podman[78705]: 2025-11-28 08:27:04.982363637 +0000 UTC m=+0.220164215 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step4, container_name=ovn_controller, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, version=17.1.12, url=https://www.redhat.com, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vendor=Red Hat, 
Inc., build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:27:04 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:27:05 localhost systemd[1]: tmp-crun.geDpMV.mount: Deactivated successfully. Nov 28 03:27:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:27:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:27:24 localhost systemd[1]: tmp-crun.ALBu8f.mount: Deactivated successfully. 
Nov 28 03:27:24 localhost podman[78829]: 2025-11-28 08:27:24.8937444 +0000 UTC m=+0.128890001 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=) Nov 28 03:27:24 localhost podman[78829]: 2025-11-28 08:27:24.906883027 +0000 UTC m=+0.142028598 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, release=1761123044, architecture=x86_64, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Nov 28 03:27:24 localhost podman[78830]: 2025-11-28 08:27:24.863165207 +0000 UTC m=+0.096902994 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, distribution-scope=public, config_id=tripleo_step3, container_name=collectd, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 
'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, architecture=x86_64, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 
collectd, managed_by=tripleo_ansible) Nov 28 03:27:24 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:27:24 localhost podman[78830]: 2025-11-28 08:27:24.948405638 +0000 UTC m=+0.182143435 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., release=1761123044, managed_by=tripleo_ansible, 
description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, config_id=tripleo_step3, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-collectd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, tcib_managed=true, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:27:24 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:27:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:27:25 localhost podman[78870]: 2025-11-28 08:27:25.825158737 +0000 UTC m=+0.067579901 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat 
OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, version=17.1.12, release=1761123044, distribution-scope=public, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 28 03:27:26 localhost podman[78870]: 2025-11-28 08:27:26.034750564 +0000 UTC m=+0.277171698 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-type=git, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:27:26 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:27:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:27:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:27:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:27:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. 
Nov 28 03:27:29 localhost systemd[1]: tmp-crun.G7ee6b.mount: Deactivated successfully. Nov 28 03:27:29 localhost podman[78899]: 2025-11-28 08:27:29.863408423 +0000 UTC m=+0.096972376 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, config_id=tripleo_step4, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z) Nov 28 03:27:29 localhost podman[78900]: 2025-11-28 08:27:29.91240115 +0000 UTC m=+0.140146208 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_step4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64) Nov 28 03:27:29 localhost podman[78899]: 2025-11-28 08:27:29.921605844 +0000 UTC m=+0.155169777 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, 
config_id=tripleo_step4, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 28 03:27:29 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:27:30 localhost podman[78901]: 2025-11-28 08:27:30.014396856 +0000 UTC m=+0.237954541 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, vendor=Red Hat, Inc., batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron) Nov 28 03:27:30 localhost podman[78901]: 2025-11-28 08:27:30.025345784 +0000 UTC m=+0.248903469 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, release=1761123044, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron) Nov 28 03:27:30 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:27:30 localhost podman[78907]: 2025-11-28 08:27:30.109141489 +0000 UTC m=+0.328948605 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible) Nov 28 03:27:30 localhost podman[78907]: 2025-11-28 08:27:30.161328659 +0000 UTC m=+0.381135755 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:27:30 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 03:27:30 localhost podman[78900]: 2025-11-28 08:27:30.31224257 +0000 UTC m=+0.539987608 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20251118.1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:27:30 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:27:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:27:32 localhost podman[78995]: 2025-11-28 08:27:32.846312197 +0000 UTC m=+0.079982756 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step5, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack 
TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container) Nov 28 03:27:32 localhost podman[78995]: 2025-11-28 08:27:32.87281448 +0000 UTC m=+0.106484979 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=) Nov 28 03:27:32 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:27:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:27:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:27:35 localhost systemd[1]: tmp-crun.8uCkYX.mount: Deactivated successfully. 
Nov 28 03:27:35 localhost systemd[1]: tmp-crun.Onfx5b.mount: Deactivated successfully. Nov 28 03:27:35 localhost podman[79023]: 2025-11-28 08:27:35.863868425 +0000 UTC m=+0.091233224 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, architecture=x86_64) Nov 28 03:27:35 localhost podman[79022]: 2025-11-28 08:27:35.830315538 +0000 UTC m=+0.067096726 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=) Nov 28 03:27:35 localhost podman[79023]: 2025-11-28 08:27:35.910478197 +0000 UTC m=+0.137843006 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, version=17.1.12, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, url=https://www.redhat.com) Nov 28 03:27:35 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. 
Nov 28 03:27:35 localhost podman[79022]: 2025-11-28 08:27:35.966132357 +0000 UTC m=+0.202913535 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, version=17.1.12, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, batch=17.1_20251118.1, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, release=1761123044) Nov 28 03:27:35 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:27:50 localhost sshd[79069]: main: sshd: ssh-rsa algorithm is disabled Nov 28 03:27:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:27:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:27:55 localhost systemd[1]: tmp-crun.l3Zpt7.mount: Deactivated successfully. 
Nov 28 03:27:55 localhost podman[79071]: 2025-11-28 08:27:55.865652023 +0000 UTC m=+0.101571313 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3) Nov 28 03:27:55 localhost podman[79071]: 2025-11-28 08:27:55.902622679 +0000 UTC m=+0.138542019 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, io.buildah.version=1.41.4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., 
io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, architecture=x86_64) Nov 28 03:27:55 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 03:27:55 localhost podman[79072]: 2025-11-28 08:27:55.956332307 +0000 UTC m=+0.190363226 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, architecture=x86_64, container_name=collectd, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, distribution-scope=public) Nov 28 03:27:55 localhost podman[79072]: 2025-11-28 08:27:55.994386098 +0000 UTC m=+0.228417027 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 
'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:27:56 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:27:56 localhost systemd[1]: session-33.scope: Deactivated successfully. Nov 28 03:27:56 localhost systemd[1]: session-33.scope: Consumed 5.529s CPU time. Nov 28 03:27:56 localhost systemd-logind[764]: Session 33 logged out. 
Waiting for processes to exit. Nov 28 03:27:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:27:56 localhost systemd-logind[764]: Removed session 33. Nov 28 03:27:56 localhost podman[79111]: 2025-11-28 08:27:56.435461478 +0000 UTC m=+0.077990483 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1761123044, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., batch=17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, managed_by=tripleo_ansible) Nov 28 03:27:56 localhost podman[79111]: 2025-11-28 08:27:56.622409395 +0000 UTC m=+0.264938410 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, tcib_managed=true, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, container_name=metrics_qdr, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 28 03:27:56 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:28:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:28:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. 
Nov 28 03:28:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:28:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:28:00 localhost podman[79187]: 2025-11-28 08:28:00.837783764 +0000 UTC m=+0.066049513 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, url=https://www.redhat.com, release=1761123044, version=17.1.12, config_id=tripleo_step4, batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, build-date=2025-11-18T22:49:32Z, distribution-scope=public, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team) Nov 28 03:28:00 localhost podman[79193]: 2025-11-28 08:28:00.857162849 +0000 UTC m=+0.076572726 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container) Nov 28 03:28:00 localhost podman[79186]: 2025-11-28 08:28:00.90241553 +0000 UTC m=+0.129722609 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step4, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, url=https://www.redhat.com, batch=17.1_20251118.1, 
vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute) Nov 28 03:28:00 localhost podman[79193]: 2025-11-28 08:28:00.912443708 +0000 UTC m=+0.131853585 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi) Nov 28 03:28:00 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:28:00 localhost podman[79185]: 2025-11-28 08:28:00.964856705 +0000 UTC m=+0.196472660 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 28 03:28:00 localhost podman[79187]: 2025-11-28 08:28:00.978866431 +0000 UTC m=+0.207132210 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 cron, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git) Nov 28 03:28:00 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:28:00 localhost podman[79185]: 2025-11-28 08:28:00.996374888 +0000 UTC m=+0.227990843 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, url=https://www.redhat.com, release=1761123044, build-date=2025-11-19T00:11:48Z, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 28 03:28:01 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 03:28:01 localhost podman[79186]: 2025-11-28 08:28:01.317517394 +0000 UTC m=+0.544824503 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git) Nov 28 03:28:01 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:28:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:28:03 localhost podman[79277]: 2025-11-28 08:28:03.84112123 +0000 UTC m=+0.078278508 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 
17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute) Nov 28 03:28:03 localhost podman[79277]: 2025-11-28 08:28:03.871247408 +0000 UTC m=+0.108404616 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, tcib_managed=true, version=17.1.12, container_name=nova_compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:28:03 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:28:05 localhost sshd[79303]: main: sshd: ssh-rsa algorithm is disabled Nov 28 03:28:05 localhost systemd-logind[764]: New session 34 of user zuul. Nov 28 03:28:05 localhost systemd[1]: Started Session 34 of User zuul. 
Nov 28 03:28:05 localhost python3[79322]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Nov 28 03:28:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:28:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:28:06 localhost podman[79325]: 2025-11-28 08:28:06.841735906 +0000 UTC m=+0.082819697 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, release=1761123044, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, container_name=ovn_controller, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:28:06 localhost podman[79325]: 2025-11-28 08:28:06.867392178 +0000 UTC m=+0.108475969 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, release=1761123044, 
managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z) Nov 28 03:28:06 localhost podman[79324]: 2025-11-28 08:28:06.890388149 +0000 UTC m=+0.132105619 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, vcs-type=git, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 28 03:28:06 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:28:06 localhost podman[79324]: 2025-11-28 08:28:06.941377884 +0000 UTC m=+0.183095354 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git) Nov 28 03:28:06 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:28:24 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 28 03:28:24 localhost recover_tripleo_nova_virtqemud[79448]: 61397 Nov 28 03:28:24 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. 
Nov 28 03:28:24 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 28 03:28:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:28:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:28:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:28:26 localhost podman[79450]: 2025-11-28 08:28:26.879077362 +0000 UTC m=+0.075097630 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, container_name=collectd, com.redhat.component=openstack-collectd-container, tcib_managed=true, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd) Nov 28 03:28:26 localhost systemd[1]: tmp-crun.bubrRw.mount: Deactivated successfully. 
Nov 28 03:28:26 localhost podman[79451]: 2025-11-28 08:28:26.947876331 +0000 UTC m=+0.142122516 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container) Nov 28 03:28:26 localhost podman[79449]: 2025-11-28 08:28:26.911341086 +0000 UTC m=+0.112835571 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, architecture=x86_64, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, container_name=iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3) Nov 28 03:28:26 localhost podman[79450]: 2025-11-28 08:28:26.964688433 +0000 UTC m=+0.160708701 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 
'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step3, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 
17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team) Nov 28 03:28:26 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:28:26 localhost podman[79449]: 2025-11-28 08:28:26.99539004 +0000 UTC m=+0.196884455 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, release=1761123044, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Nov 28 03:28:27 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:28:27 localhost podman[79451]: 2025-11-28 08:28:27.16527811 +0000 UTC m=+0.359524355 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_id=tripleo_step1, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.4) Nov 28 03:28:27 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. 
Nov 28 03:28:30 localhost python3[79531]: ansible-ansible.legacy.dnf Invoked with name=['sos'] state=latest allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Nov 28 03:28:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:28:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:28:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:28:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:28:31 localhost systemd[1]: tmp-crun.x95Rv2.mount: Deactivated successfully. 
Nov 28 03:28:31 localhost podman[79543]: 2025-11-28 08:28:31.888222849 +0000 UTC m=+0.108715346 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4) Nov 28 03:28:31 localhost podman[79543]: 2025-11-28 08:28:31.934263723 +0000 UTC m=+0.154756240 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container) Nov 28 03:28:31 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 03:28:31 localhost podman[79534]: 2025-11-28 08:28:31.935662235 +0000 UTC m=+0.163032742 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4) Nov 28 03:28:32 localhost podman[79535]: 2025-11-28 08:28:32.035874802 +0000 UTC m=+0.258206355 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, name=rhosp17/openstack-cron, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:28:32 localhost podman[79533]: 2025-11-28 08:28:31.990484128 +0000 UTC m=+0.218862546 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, 
container_name=ceilometer_agent_compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, distribution-scope=public) Nov 28 03:28:32 localhost podman[79533]: 2025-11-28 
08:28:32.069373683 +0000 UTC m=+0.297752101 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, release=1761123044, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:28:32 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:28:32 localhost podman[79535]: 2025-11-28 08:28:32.094745147 +0000 UTC m=+0.317076680 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, release=1761123044, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:28:32 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:28:32 localhost podman[79534]: 2025-11-28 08:28:32.277355686 +0000 UTC m=+0.504726283 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step4, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z) Nov 28 03:28:32 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:28:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:28:34 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Nov 28 03:28:34 localhost systemd[1]: Starting man-db-cache-update.service... Nov 28 03:28:34 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Nov 28 03:28:34 localhost systemd[1]: tmp-crun.clpOO2.mount: Deactivated successfully. 
Nov 28 03:28:34 localhost podman[79637]: 2025-11-28 08:28:34.155206932 +0000 UTC m=+0.118773954 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, vcs-type=git, container_name=nova_compute, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, release=1761123044, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4) Nov 28 03:28:34 localhost podman[79637]: 2025-11-28 08:28:34.181596926 +0000 UTC m=+0.145163898 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, version=17.1.12, container_name=nova_compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 
'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, distribution-scope=public, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat 
OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_id=tripleo_step5, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 03:28:34 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 03:28:34 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 03:28:34 localhost systemd[1]: Finished man-db-cache-update.service.
Nov 28 03:28:34 localhost systemd[1]: run-r232545fe677e499cba963d77f85a7852.service: Deactivated successfully.
Nov 28 03:28:34 localhost systemd[1]: run-r8502e199661141c28b05d34d678c42bb.service: Deactivated successfully.
Nov 28 03:28:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 03:28:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 2400.1 total, 600.0 interval
Cumulative writes: 4541 writes, 20K keys, 4541 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
Cumulative WAL: 4541 writes, 459 syncs, 9.89 writes per sync, written: 0.02 GB, 0.01 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 03:28:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 03:28:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:28:37 localhost podman[79807]: 2025-11-28 08:28:37.846925902 +0000 UTC m=+0.083037943 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, version=17.1.12, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Nov 28 03:28:37 localhost podman[79808]: 2025-11-28 08:28:37.901053563 +0000 UTC m=+0.135972768 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, tcib_managed=true, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-type=git, architecture=x86_64) Nov 28 03:28:37 localhost podman[79807]: 2025-11-28 08:28:37.920439434 +0000 UTC m=+0.156551495 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, build-date=2025-11-19T00:14:25Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:28:37 localhost podman[79808]: 2025-11-28 08:28:37.929330845 +0000 UTC m=+0.164250090 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 03:28:37 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully.
Nov 28 03:28:37 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully.
Nov 28 03:28:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 03:28:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 2400.3 total, 600.0 interval
Cumulative writes: 5030 writes, 22K keys, 5030 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
Cumulative WAL: 5030 writes, 563 syncs, 8.93 writes per sync, written: 0.02 GB, 0.01 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 03:28:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 03:28:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:28:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:28:57 localhost podman[79860]: 2025-11-28 08:28:57.846258922 +0000 UTC m=+0.074788273 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, container_name=metrics_qdr, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, version=17.1.12, vendor=Red Hat, Inc., release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 28 03:28:57 localhost systemd[1]: tmp-crun.A7H2oI.mount: Deactivated successfully. Nov 28 03:28:57 localhost podman[79858]: 2025-11-28 08:28:57.945588091 +0000 UTC m=+0.180167056 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, distribution-scope=public, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, architecture=x86_64, 
batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, container_name=iscsid, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid) Nov 28 03:28:57 localhost podman[79859]: 2025-11-28 08:28:57.924684254 +0000 UTC m=+0.156877506 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=collectd, summary=Red Hat 
OpenStack Platform 17.1 collectd, distribution-scope=public, tcib_managed=true, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:28:57 localhost podman[79858]: 2025-11-28 08:28:57.981403793 +0000 UTC m=+0.215982758 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, version=17.1.12, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, container_name=iscsid) Nov 28 03:28:57 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 03:28:58 localhost podman[79859]: 2025-11-28 08:28:58.007318593 +0000 UTC m=+0.239511825 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container) Nov 28 03:28:58 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:28:58 localhost podman[79860]: 2025-11-28 08:28:58.092453729 +0000 UTC m=+0.320983070 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, architecture=x86_64, distribution-scope=public, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc.) Nov 28 03:28:58 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:28:58 localhost systemd[1]: tmp-crun.F2wSEb.mount: Deactivated successfully. Nov 28 03:29:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. 
Nov 28 03:29:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:29:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:29:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:29:02 localhost podman[79971]: 2025-11-28 08:29:02.866146765 +0000 UTC m=+0.098071432 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=nova_migration_target, io.openshift.expose-services=, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.buildah.version=1.41.4, architecture=x86_64, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 28 03:29:02 localhost podman[79972]: 2025-11-28 08:29:02.910767145 +0000 UTC m=+0.140676470 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, tcib_managed=true, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:29:02 localhost podman[79972]: 2025-11-28 08:29:02.919142501 +0000 UTC m=+0.149051826 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step4, name=rhosp17/openstack-cron, container_name=logrotate_crond, release=1761123044, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container) Nov 28 03:29:02 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:29:02 localhost podman[79970]: 2025-11-28 08:29:02.971162028 +0000 UTC m=+0.203831808 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, version=17.1.12, url=https://www.redhat.com) Nov 28 03:29:03 localhost systemd[1]: tmp-crun.9vgNBG.mount: Deactivated successfully. 
Nov 28 03:29:03 localhost podman[79973]: 2025-11-28 08:29:03.019571564 +0000 UTC m=+0.244832237 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1) Nov 28 03:29:03 localhost podman[79970]: 2025-11-28 08:29:03.033506129 +0000 UTC m=+0.266175929 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, architecture=x86_64, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 28 03:29:03 localhost podman[79973]: 2025-11-28 08:29:03.050360112 +0000 UTC m=+0.275620835 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, 
io.buildah.version=1.41.4) Nov 28 03:29:03 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:29:03 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:29:03 localhost podman[79971]: 2025-11-28 08:29:03.218333455 +0000 UTC m=+0.450258072 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., tcib_managed=true, 
maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 03:29:03 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:29:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. 
Nov 28 03:29:04 localhost podman[80063]: 2025-11-28 08:29:04.868372053 +0000 UTC m=+0.102970611 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, tcib_managed=true, distribution-scope=public, config_id=tripleo_step5, vcs-type=git, url=https://www.redhat.com, container_name=nova_compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=) Nov 28 03:29:04 localhost podman[80063]: 2025-11-28 08:29:04.900447132 +0000 UTC m=+0.135045670 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, release=1761123044, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:29:04 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:29:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:29:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:29:08 localhost systemd[1]: tmp-crun.PImKxF.mount: Deactivated successfully. 
Nov 28 03:29:08 localhost podman[80090]: 2025-11-28 08:29:08.8474857 +0000 UTC m=+0.085356205 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, architecture=x86_64) Nov 28 03:29:08 localhost podman[80091]: 2025-11-28 08:29:08.897098442 +0000 UTC m=+0.131036857 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, managed_by=tripleo_ansible, release=1761123044, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Nov 28 03:29:08 localhost podman[80090]: 2025-11-28 08:29:08.900596849 +0000 UTC m=+0.138467364 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git) Nov 28 03:29:08 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:29:08 localhost podman[80091]: 2025-11-28 08:29:08.920392463 +0000 UTC m=+0.154330898 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, release=1761123044, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, config_id=tripleo_step4) Nov 28 03:29:08 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:29:09 localhost python3[80151]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhel-9-for-x86_64-baseos-eus-rpms --disable rhel-9-for-x86_64-appstream-eus-rpms --disable rhel-9-for-x86_64-highavailability-eus-rpms --disable openstack-17.1-for-rhel-9-x86_64-rpms --disable fast-datapath-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 03:29:12 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 28 03:29:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:29:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:29:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:29:28 localhost podman[80467]: 2025-11-28 08:29:28.855666106 +0000 UTC m=+0.091662957 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, version=17.1.12, container_name=iscsid, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Nov 28 03:29:28 localhost systemd[1]: tmp-crun.9Nzz1p.mount: Deactivated successfully. Nov 28 03:29:28 localhost podman[80468]: 2025-11-28 08:29:28.912323374 +0000 UTC m=+0.147987175 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=collectd, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, name=rhosp17/openstack-collectd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, batch=17.1_20251118.1, 
com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc.) 
Nov 28 03:29:28 localhost podman[80469]: 2025-11-28 08:29:28.955574632 +0000 UTC m=+0.188355935 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, release=1761123044, managed_by=tripleo_ansible) Nov 28 03:29:28 localhost podman[80467]: 2025-11-28 08:29:28.970718224 +0000 UTC m=+0.206715045 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, config_id=tripleo_step3, release=1761123044, distribution-scope=public, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid) Nov 28 03:29:28 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 03:29:29 localhost podman[80468]: 2025-11-28 08:29:29.024321099 +0000 UTC m=+0.259984900 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd) Nov 28 03:29:29 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:29:29 localhost podman[80469]: 2025-11-28 08:29:29.14437247 +0000 UTC m=+0.377153763 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, version=17.1.12, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, batch=17.1_20251118.1) Nov 28 03:29:29 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:29:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. 
Nov 28 03:29:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:29:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:29:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:29:33 localhost systemd[1]: tmp-crun.JKFMo5.mount: Deactivated successfully. Nov 28 03:29:33 localhost podman[80539]: 2025-11-28 08:29:33.860696676 +0000 UTC m=+0.091264423 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, release=1761123044, container_name=logrotate_crond, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, io.openshift.expose-services=) Nov 28 03:29:33 localhost podman[80539]: 2025-11-28 08:29:33.903359978 +0000 UTC m=+0.133927665 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public) Nov 28 03:29:33 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:29:33 localhost podman[80537]: 2025-11-28 08:29:33.903157792 +0000 UTC m=+0.140725043 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, release=1761123044, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12) Nov 28 03:29:34 localhost podman[80538]: 2025-11-28 08:29:33.956192919 +0000 UTC m=+0.189337255 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, name=rhosp17/openstack-nova-compute, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, batch=17.1_20251118.1) Nov 28 03:29:34 localhost podman[80545]: 2025-11-28 08:29:34.106697098 +0000 UTC m=+0.333949064 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, 
vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-19T00:12:45Z, vcs-type=git, distribution-scope=public) Nov 28 03:29:34 localhost podman[80537]: 2025-11-28 08:29:34.124710538 +0000 UTC m=+0.362277799 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, architecture=x86_64, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, managed_by=tripleo_ansible) Nov 28 03:29:34 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:29:34 localhost podman[80545]: 2025-11-28 08:29:34.160423047 +0000 UTC m=+0.387674973 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.4, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, url=https://www.redhat.com, release=1761123044, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public) Nov 28 03:29:34 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 03:29:34 localhost podman[80538]: 2025-11-28 08:29:34.323468519 +0000 UTC m=+0.556612835 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, maintainer=OpenStack TripleO 
Team, release=1761123044, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4) Nov 28 03:29:34 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:29:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:29:35 localhost podman[80628]: 2025-11-28 08:29:35.841354028 +0000 UTC m=+0.082634331 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, release=1761123044, tcib_managed=true, container_name=nova_compute, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack 
TripleO Team, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:29:35 localhost podman[80628]: 2025-11-28 08:29:35.872448017 +0000 UTC m=+0.113728260 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5) Nov 28 03:29:35 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:29:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:29:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:29:39 localhost systemd[1]: tmp-crun.UjkoC3.mount: Deactivated successfully. Nov 28 03:29:39 localhost podman[80654]: 2025-11-28 08:29:39.844892499 +0000 UTC m=+0.081752955 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, vcs-type=git, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, distribution-scope=public) Nov 28 03:29:39 localhost podman[80655]: 2025-11-28 08:29:39.894071068 +0000 UTC m=+0.127944453 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 28 03:29:39 localhost podman[80654]: 2025-11-28 08:29:39.945958041 +0000 UTC m=+0.182818467 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, batch=17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:29:39 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:29:39 localhost podman[80655]: 2025-11-28 08:29:39.99777852 +0000 UTC m=+0.231651955 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z) Nov 28 03:29:40 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:29:41 localhost sshd[80704]: main: sshd: ssh-rsa algorithm is disabled Nov 28 03:29:46 localhost sshd[80706]: main: sshd: ssh-rsa algorithm is disabled Nov 28 03:29:52 localhost sshd[80708]: main: sshd: ssh-rsa algorithm is disabled Nov 28 03:29:55 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 28 03:29:55 localhost recover_tripleo_nova_virtqemud[80711]: 61397 Nov 28 03:29:55 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 28 03:29:55 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 28 03:29:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:29:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:29:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:29:59 localhost systemd[1]: tmp-crun.2cK36M.mount: Deactivated successfully. 
Nov 28 03:29:59 localhost podman[80732]: 2025-11-28 08:29:59.907576748 +0000 UTC m=+0.143603270 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-iscsid-container, 
managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, version=17.1.12, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-11-18T23:44:13Z) Nov 28 03:29:59 localhost podman[80733]: 2025-11-28 08:29:59.92405594 +0000 UTC m=+0.157285387 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, name=rhosp17/openstack-collectd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:29:59 localhost podman[80733]: 2025-11-28 08:29:59.930500797 +0000 UTC m=+0.163730264 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, distribution-scope=public, 
release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd) Nov 28 03:29:59 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:29:59 localhost podman[80732]: 2025-11-28 08:29:59.944339449 +0000 UTC m=+0.180365931 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, vcs-type=git, release=1761123044, config_id=tripleo_step3, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:29:59 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 03:29:59 localhost podman[80734]: 2025-11-28 08:29:59.909497797 +0000 UTC m=+0.139190386 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:30:00 localhost podman[80734]: 2025-11-28 08:30:00.085277337 +0000 UTC m=+0.314969956 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_id=tripleo_step1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, managed_by=tripleo_ansible) Nov 28 03:30:00 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:30:00 localhost systemd[1]: tmp-crun.FbXjve.mount: Deactivated successfully. Nov 28 03:30:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:30:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:30:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. 
Nov 28 03:30:04 localhost podman[80838]: 2025-11-28 08:30:04.428223737 +0000 UTC m=+0.087064506 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, 
name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:30:04 localhost podman[80838]: 2025-11-28 08:30:04.440184402 +0000 UTC m=+0.099025131 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, managed_by=tripleo_ansible, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:30:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:30:04 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:30:04 localhost python3[80836]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhceph-7-tools-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 03:30:04 localhost systemd[1]: tmp-crun.gMFxF5.mount: Deactivated successfully. 
Nov 28 03:30:04 localhost podman[80837]: 2025-11-28 08:30:04.53589756 +0000 UTC m=+0.197066240 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, distribution-scope=public, version=17.1.12, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:30:04 localhost podman[80877]: 2025-11-28 08:30:04.547078931 +0000 UTC m=+0.096683129 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vcs-type=git, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z) Nov 28 03:30:04 localhost podman[80839]: 2025-11-28 08:30:04.505126132 +0000 UTC m=+0.158057091 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, vcs-type=git, io.openshift.expose-services=, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, managed_by=tripleo_ansible, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 28 03:30:04 
localhost podman[80837]: 2025-11-28 08:30:04.58735108 +0000 UTC m=+0.248519790 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, release=1761123044) Nov 28 03:30:04 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:30:04 localhost podman[80839]: 2025-11-28 08:30:04.638364715 +0000 UTC m=+0.291295765 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step4, release=1761123044, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com) Nov 28 03:30:04 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 03:30:04 localhost podman[80877]: 2025-11-28 08:30:04.943388357 +0000 UTC m=+0.492992615 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64) Nov 28 03:30:04 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:30:05 localhost systemd[1]: tmp-crun.JIdA4d.mount: Deactivated successfully. Nov 28 03:30:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. 
Nov 28 03:30:06 localhost podman[80932]: 2025-11-28 08:30:06.048212669 +0000 UTC m=+0.075638207 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, config_id=tripleo_step5, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64) Nov 28 03:30:06 localhost podman[80932]: 2025-11-28 08:30:06.079408671 +0000 UTC m=+0.106834279 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 03:30:06 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:30:07 localhost sshd[80959]: main: sshd: ssh-rsa algorithm is disabled Nov 28 03:30:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:30:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:30:10 localhost podman[81080]: 2025-11-28 08:30:10.843554446 +0000 UTC m=+0.079938299 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:30:10 localhost podman[81080]: 2025-11-28 08:30:10.892494408 +0000 UTC m=+0.128878261 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:30:10 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:30:10 localhost podman[81081]: 2025-11-28 08:30:10.897412298 +0000 UTC m=+0.131052488 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4) Nov 28 03:30:10 localhost podman[81081]: 2025-11-28 08:30:10.979338046 +0000 UTC m=+0.212978196 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, version=17.1.12, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, config_id=tripleo_step4, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 28 03:30:10 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:30:11 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 28 03:30:12 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Nov 28 03:30:15 localhost sshd[81134]: main: sshd: ssh-rsa algorithm is disabled Nov 28 03:30:19 localhost sshd[81194]: main: sshd: ssh-rsa algorithm is disabled Nov 28 03:30:29 localhost sshd[81272]: main: sshd: ssh-rsa algorithm is disabled Nov 28 03:30:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:30:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:30:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:30:30 localhost podman[81274]: 2025-11-28 08:30:30.861929006 +0000 UTC m=+0.098669830 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, config_id=tripleo_step3, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com) Nov 28 03:30:30 localhost systemd[1]: tmp-crun.5ETw2l.mount: Deactivated successfully. Nov 28 03:30:30 localhost podman[81274]: 2025-11-28 08:30:30.911365164 +0000 UTC m=+0.148105968 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Nov 28 03:30:30 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 03:30:30 localhost podman[81276]: 2025-11-28 08:30:30.964381659 +0000 UTC m=+0.196068690 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_id=tripleo_step1, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack 
TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, container_name=metrics_qdr, distribution-scope=public, build-date=2025-11-18T22:49:46Z, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git) Nov 28 03:30:30 localhost podman[81275]: 2025-11-28 08:30:30.917116169 +0000 UTC m=+0.151610185 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, tcib_managed=true, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, container_name=collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd) Nov 28 03:30:30 localhost podman[81275]: 2025-11-28 08:30:30.997349725 +0000 UTC m=+0.231843741 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, vcs-type=git, url=https://www.redhat.com, batch=17.1_20251118.1, version=17.1.12, distribution-scope=public, build-date=2025-11-18T22:51:28Z, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:30:31 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:30:31 localhost podman[81276]: 2025-11-28 08:30:31.170558037 +0000 UTC m=+0.402245138 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12) Nov 28 03:30:31 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:30:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:30:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:30:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:30:34 localhost systemd[1]: tmp-crun.vA2V32.mount: Deactivated successfully. 
Nov 28 03:30:34 localhost podman[81344]: 2025-11-28 08:30:34.913697535 +0000 UTC m=+0.137944048 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, release=1761123044, version=17.1.12, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi) Nov 28 03:30:34 localhost podman[81342]: 2025-11-28 08:30:34.878664118 +0000 UTC m=+0.108560612 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, version=17.1.12, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:30:34 localhost podman[81344]: 2025-11-28 08:30:34.948861728 +0000 UTC m=+0.173108260 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, 
com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, version=17.1.12, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-type=git, io.openshift.expose-services=) Nov 28 03:30:34 localhost podman[81342]: 
2025-11-28 08:30:34.96105285 +0000 UTC m=+0.190949384 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, release=1761123044, container_name=ceilometer_agent_compute, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, config_id=tripleo_step4) Nov 28 03:30:34 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:30:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:30:34 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 03:30:35 localhost podman[81343]: 2025-11-28 08:30:35.026367061 +0000 UTC m=+0.251990815 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, container_name=logrotate_crond, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:30:35 localhost podman[81343]: 2025-11-28 08:30:35.03352006 +0000 UTC m=+0.259143784 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, release=1761123044, io.openshift.expose-services=, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron) Nov 28 03:30:35 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:30:35 localhost podman[81405]: 2025-11-28 08:30:35.089299411 +0000 UTC m=+0.092688478 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, url=https://www.redhat.com, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public) Nov 28 03:30:35 localhost podman[81405]: 2025-11-28 08:30:35.455419526 +0000 UTC m=+0.458808523 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, container_name=nova_migration_target, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:30:35 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:30:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. 
Nov 28 03:30:36 localhost podman[81436]: 2025-11-28 08:30:36.856363208 +0000 UTC m=+0.086453607 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-type=git, config_id=tripleo_step5, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible) Nov 28 03:30:36 localhost podman[81436]: 2025-11-28 08:30:36.879242726 +0000 UTC m=+0.109333115 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, tcib_managed=true, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64) Nov 28 03:30:36 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:30:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:30:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:30:41 localhost podman[81463]: 2025-11-28 08:30:41.837157519 +0000 UTC m=+0.078623108 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, container_name=ovn_metadata_agent, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, batch=17.1_20251118.1) Nov 28 03:30:41 localhost podman[81463]: 2025-11-28 08:30:41.889376971 +0000 UTC m=+0.130842520 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, release=1761123044) Nov 28 03:30:41 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:30:41 localhost podman[81464]: 2025-11-28 08:30:41.890349452 +0000 UTC m=+0.128740987 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., container_name=ovn_controller, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, 
io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, version=17.1.12) Nov 28 03:30:41 localhost podman[81464]: 2025-11-28 08:30:41.969803824 +0000 UTC m=+0.208195349 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, batch=17.1_20251118.1, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12) Nov 28 03:30:41 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:30:51 localhost sshd[81510]: main: sshd: ssh-rsa algorithm is disabled Nov 28 03:30:52 localhost python3[81525]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname Nov 28 03:31:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:31:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:31:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:31:01 localhost systemd[1]: tmp-crun.oNtBr0.mount: Deactivated successfully. 
Nov 28 03:31:01 localhost podman[81547]: 2025-11-28 08:31:01.851352081 +0000 UTC m=+0.088584623 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, architecture=x86_64, name=rhosp17/openstack-collectd, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, vcs-type=git) Nov 28 03:31:01 localhost podman[81547]: 2025-11-28 08:31:01.861749729 +0000 UTC m=+0.098982331 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:31:01 localhost podman[81546]: 2025-11-28 08:31:01.887855514 +0000 UTC m=+0.129889341 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-iscsid-container, release=1761123044, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:31:01 localhost podman[81546]: 2025-11-28 08:31:01.89624094 +0000 UTC m=+0.138274747 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, 
io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z) Nov 28 03:31:01 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:31:01 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. 
Nov 28 03:31:02 localhost podman[81548]: 2025-11-28 08:31:02.002640575 +0000 UTC m=+0.234482252 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, batch=17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://www.redhat.com, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, distribution-scope=public, build-date=2025-11-18T22:49:46Z) Nov 28 03:31:02 localhost podman[81548]: 2025-11-28 08:31:02.221367095 +0000 UTC m=+0.453208722 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:31:02 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:31:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:31:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:31:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. 
Nov 28 03:31:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:31:05 localhost systemd[1]: tmp-crun.SydMrP.mount: Deactivated successfully. Nov 28 03:31:05 localhost podman[81639]: 2025-11-28 08:31:05.836410578 +0000 UTC m=+0.064351654 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., container_name=logrotate_crond, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, version=17.1.12, config_id=tripleo_step4, name=rhosp17/openstack-cron) Nov 28 03:31:05 localhost podman[81638]: 2025-11-28 08:31:05.899103739 +0000 UTC m=+0.126946122 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, release=1761123044, config_id=tripleo_step4, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:31:05 localhost podman[81640]: 2025-11-28 08:31:05.868076343 +0000 UTC m=+0.091117210 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
build-date=2025-11-19T00:12:45Z, release=1761123044, io.buildah.version=1.41.4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public) Nov 28 03:31:05 localhost podman[81640]: 2025-11-28 08:31:05.954380025 +0000 UTC m=+0.177420862 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, architecture=x86_64, 
com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_id=tripleo_step4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, tcib_managed=true) Nov 28 03:31:05 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:31:05 localhost podman[81639]: 2025-11-28 08:31:05.969756213 +0000 UTC m=+0.197697269 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.expose-services=, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:31:05 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:31:06 localhost podman[81637]: 2025-11-28 08:31:06.05686529 +0000 UTC m=+0.287452197 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, release=1761123044, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z) Nov 28 03:31:06 localhost podman[81637]: 2025-11-28 08:31:06.109526666 +0000 UTC m=+0.340113603 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1) Nov 28 03:31:06 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 03:31:06 localhost podman[81638]: 2025-11-28 08:31:06.257281422 +0000 UTC m=+0.485123745 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, version=17.1.12, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public) Nov 28 03:31:06 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:31:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:31:07 localhost podman[81732]: 2025-11-28 08:31:07.877257083 +0000 UTC m=+0.114351179 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.12, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
release=1761123044, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:31:07 localhost podman[81732]: 2025-11-28 08:31:07.92960456 +0000 UTC m=+0.166698656 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, container_name=nova_compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vendor=Red Hat, Inc.) Nov 28 03:31:07 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:31:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:31:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:31:12 localhost systemd[1]: tmp-crun.XLpkcs.mount: Deactivated successfully. 
Nov 28 03:31:12 localhost podman[81759]: 2025-11-28 08:31:12.858872849 +0000 UTC m=+0.098041911 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, config_id=tripleo_step4, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=ovn_controller, io.openshift.expose-services=, 
com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64) Nov 28 03:31:12 localhost podman[81759]: 2025-11-28 08:31:12.908601026 +0000 UTC m=+0.147770098 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, architecture=x86_64, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4) Nov 28 03:31:12 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:31:12 localhost podman[81758]: 2025-11-28 08:31:12.995927679 +0000 UTC m=+0.236318358 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 28 03:31:13 localhost podman[81758]: 2025-11-28 08:31:13.041324503 +0000 UTC m=+0.281715192 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent) Nov 28 03:31:13 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:31:25 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 28 03:31:25 localhost recover_tripleo_nova_virtqemud[81806]: 61397 Nov 28 03:31:25 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 28 03:31:25 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Nov 28 03:31:29 localhost podman[81909]: 2025-11-28 08:31:29.913769857 +0000 UTC m=+0.093391678 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, distribution-scope=public, vendor=Red Hat, Inc., ceph=True, version=7, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, name=rhceph, release=553, RELEASE=main, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vcs-type=git) Nov 28 03:31:30 localhost podman[81909]: 2025-11-28 08:31:30.042350169 +0000 UTC m=+0.221971970 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, description=Red Hat Ceph Storage 7, release=553, distribution-scope=public, CEPH_POINT_RELEASE=, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, version=7, io.openshift.tags=rhceph ceph, vcs-type=git, ceph=True, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 28 03:31:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:31:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:31:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:31:32 localhost systemd[1]: tmp-crun.PbDM9a.mount: Deactivated successfully. 
Nov 28 03:31:32 localhost podman[82054]: 2025-11-28 08:31:32.863363577 +0000 UTC m=+0.096395661 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, release=1761123044, container_name=iscsid, io.buildah.version=1.41.4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, tcib_managed=true) Nov 28 03:31:32 localhost podman[82054]: 2025-11-28 08:31:32.875529048 +0000 UTC m=+0.108561152 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, io.openshift.expose-services=, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, distribution-scope=public, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, architecture=x86_64) Nov 28 03:31:32 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 03:31:32 localhost podman[82056]: 2025-11-28 08:31:32.923181951 +0000 UTC m=+0.151136110 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, tcib_managed=true, distribution-scope=public, version=17.1.12, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:31:32 localhost podman[82055]: 2025-11-28 08:31:32.964197471 +0000 UTC m=+0.194997196 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, version=17.1.12, tcib_managed=true, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Nov 28 03:31:32 localhost podman[82055]: 2025-11-28 08:31:32.978458746 +0000 UTC m=+0.209258501 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, version=17.1.12, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-18T22:51:28Z) Nov 28 03:31:32 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:31:33 localhost podman[82056]: 2025-11-28 08:31:33.100914371 +0000 UTC m=+0.328868460 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com) Nov 28 03:31:33 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:31:33 localhost systemd[1]: tmp-crun.TciVpf.mount: Deactivated successfully. Nov 28 03:31:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:31:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:31:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:31:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:31:36 localhost systemd[1]: tmp-crun.kLofw2.mount: Deactivated successfully. 
Nov 28 03:31:36 localhost podman[82121]: 2025-11-28 08:31:36.860676537 +0000 UTC m=+0.094096240 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, 
batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target) Nov 28 03:31:36 localhost systemd[1]: tmp-crun.V5LNCJ.mount: Deactivated successfully. Nov 28 03:31:36 localhost podman[82123]: 2025-11-28 08:31:36.902118111 +0000 UTC m=+0.130407959 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_id=tripleo_step4, architecture=x86_64, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 28 03:31:36 localhost podman[82123]: 2025-11-28 08:31:36.938772318 +0000 UTC m=+0.167062576 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi) Nov 28 03:31:36 localhost systemd[1]: 
f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:31:36 localhost podman[82120]: 2025-11-28 08:31:36.954113166 +0000 UTC m=+0.189621603 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Nov 28 03:31:37 localhost podman[82122]: 2025-11-28 08:31:37.001954926 +0000 UTC m=+0.232021427 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, batch=17.1_20251118.1, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:31:37 localhost podman[82122]: 2025-11-28 08:31:37.007878806 +0000 UTC m=+0.237945317 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, 
distribution-scope=public, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, release=1761123044, vendor=Red Hat, Inc.) 
Nov 28 03:31:37 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:31:37 localhost podman[82120]: 2025-11-28 08:31:37.053830887 +0000 UTC m=+0.289339274 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute) Nov 28 03:31:37 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:31:37 localhost podman[82121]: 2025-11-28 08:31:37.245383688 +0000 UTC m=+0.478803351 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Nov 28 03:31:37 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:31:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. 
Nov 28 03:31:38 localhost podman[82216]: 2025-11-28 08:31:38.838436689 +0000 UTC m=+0.075715840 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible) Nov 28 03:31:38 localhost podman[82216]: 2025-11-28 08:31:38.891359503 +0000 UTC m=+0.128638654 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, container_name=nova_compute, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1761123044, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, version=17.1.12, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 28 03:31:38 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:31:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:31:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:31:43 localhost podman[82243]: 2025-11-28 08:31:43.838111877 +0000 UTC m=+0.076451252 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, release=1761123044, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public) Nov 28 03:31:43 localhost podman[82242]: 2025-11-28 08:31:43.894101125 +0000 UTC m=+0.134599636 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, 
io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, config_id=tripleo_step4) Nov 28 03:31:43 localhost podman[82243]: 2025-11-28 08:31:43.919892361 +0000 UTC m=+0.158231756 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, batch=17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, 
description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, version=17.1.12, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, release=1761123044) Nov 28 03:31:43 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:31:43 localhost podman[82242]: 2025-11-28 08:31:43.935950381 +0000 UTC m=+0.176448902 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, config_id=tripleo_step4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1761123044) Nov 28 03:31:43 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:31:52 localhost systemd[1]: session-34.scope: Deactivated successfully. Nov 28 03:31:52 localhost systemd[1]: session-34.scope: Consumed 18.385s CPU time. Nov 28 03:31:52 localhost systemd-logind[764]: Session 34 logged out. Waiting for processes to exit. 
Nov 28 03:31:52 localhost systemd-logind[764]: Removed session 34. Nov 28 03:32:01 localhost sshd[82289]: main: sshd: ssh-rsa algorithm is disabled Nov 28 03:32:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:32:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:32:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:32:03 localhost systemd[1]: tmp-crun.rO5FiW.mount: Deactivated successfully. Nov 28 03:32:03 localhost podman[82336]: 2025-11-28 08:32:03.862654033 +0000 UTC m=+0.096981559 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, container_name=iscsid, io.buildah.version=1.41.4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:32:03 localhost podman[82337]: 2025-11-28 08:32:03.907443029 +0000 UTC m=+0.139459004 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, batch=17.1_20251118.1, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, managed_by=tripleo_ansible, container_name=collectd) Nov 28 03:32:03 localhost podman[82337]: 2025-11-28 08:32:03.917617689 +0000 UTC m=+0.149633704 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, container_name=collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git) Nov 28 03:32:03 localhost podman[82336]: 2025-11-28 08:32:03.925230991 +0000 UTC m=+0.159558567 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, tcib_managed=true, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 28 03:32:03 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:32:03 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 03:32:04 localhost podman[82338]: 2025-11-28 08:32:04.008009926 +0000 UTC m=+0.237787403 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, release=1761123044, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, io.openshift.expose-services=) Nov 28 03:32:04 localhost podman[82338]: 2025-11-28 08:32:04.206325293 +0000 UTC m=+0.436102710 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, url=https://www.redhat.com, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:32:04 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:32:04 localhost systemd[1]: tmp-crun.uzs0tc.mount: Deactivated successfully. Nov 28 03:32:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:32:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:32:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. 
Nov 28 03:32:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:32:07 localhost podman[82405]: 2025-11-28 08:32:07.857661803 +0000 UTC m=+0.090051797 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, 
Inc., com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, tcib_managed=true, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute) Nov 28 03:32:07 localhost podman[82406]: 2025-11-28 08:32:07.916687003 +0000 UTC m=+0.144349343 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 
'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:32:07 localhost podman[82405]: 2025-11-28 08:32:07.917424605 +0000 UTC m=+0.149814609 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, config_id=tripleo_step4, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 28 03:32:07 localhost podman[82407]: 2025-11-28 08:32:07.964927474 +0000 UTC m=+0.189315804 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, url=https://www.redhat.com, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-cron) Nov 28 03:32:07 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:32:08 localhost podman[82413]: 2025-11-28 08:32:08.025945175 +0000 UTC m=+0.245932821 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 28 03:32:08 localhost podman[82413]: 2025-11-28 08:32:08.094615989 +0000 UTC m=+0.314603625 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com) Nov 28 03:32:08 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:32:08 localhost podman[82407]: 2025-11-28 08:32:08.11041428 +0000 UTC m=+0.334802580 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, version=17.1.12, vendor=Red Hat, Inc., release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron) Nov 28 03:32:08 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:32:08 localhost podman[82406]: 2025-11-28 08:32:08.29728962 +0000 UTC m=+0.524951910 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, tcib_managed=true, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, managed_by=tripleo_ansible, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:32:08 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:32:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:32:09 localhost systemd[1]: tmp-crun.biG08W.mount: Deactivated successfully. 
Nov 28 03:32:09 localhost podman[82500]: 2025-11-28 08:32:09.846663578 +0000 UTC m=+0.085058974 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, tcib_managed=true, vcs-type=git, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, container_name=nova_compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, release=1761123044, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 03:32:09 localhost podman[82500]: 2025-11-28 08:32:09.904516983 +0000 UTC m=+0.142912349 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vendor=Red Hat, Inc., io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, tcib_managed=true, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:32:09 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:32:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:32:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:32:14 localhost podman[82525]: 2025-11-28 08:32:14.851121521 +0000 UTC m=+0.083501067 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, config_id=tripleo_step4, 
version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.) 
Nov 28 03:32:14 localhost podman[82525]: 2025-11-28 08:32:14.906402667 +0000 UTC m=+0.138782193 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.buildah.version=1.41.4, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:32:14 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:32:14 localhost podman[82524]: 2025-11-28 08:32:14.909535182 +0000 UTC m=+0.142893218 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, container_name=ovn_metadata_agent, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Nov 28 03:32:14 localhost podman[82524]: 2025-11-28 08:32:14.994441002 +0000 UTC m=+0.227799028 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.buildah.version=1.41.4, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public) Nov 28 03:32:15 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:32:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:32:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:32:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:32:34 localhost podman[82648]: 2025-11-28 08:32:34.851765001 +0000 UTC m=+0.088763539 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, name=rhosp17/openstack-iscsid, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1) Nov 28 03:32:34 localhost podman[82648]: 2025-11-28 08:32:34.888786119 +0000 UTC m=+0.125784637 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, name=rhosp17/openstack-iscsid, version=17.1.12, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:32:34 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 03:32:34 localhost podman[82649]: 2025-11-28 08:32:34.912012637 +0000 UTC m=+0.145500268 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, release=1761123044, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:32:34 localhost podman[82649]: 2025-11-28 08:32:34.920852798 +0000 UTC m=+0.154340489 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.buildah.version=1.41.4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, 
url=https://www.redhat.com, config_id=tripleo_step3, release=1761123044, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z) Nov 28 03:32:34 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. 
Nov 28 03:32:35 localhost podman[82650]: 2025-11-28 08:32:35.017310619 +0000 UTC m=+0.247332293 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd) Nov 28 03:32:35 localhost podman[82650]: 2025-11-28 08:32:35.215419861 +0000 UTC m=+0.445441555 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, container_name=metrics_qdr, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, version=17.1.12, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 28 03:32:35 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:32:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:32:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:32:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:32:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. 
Nov 28 03:32:38 localhost podman[82719]: 2025-11-28 08:32:38.860167349 +0000 UTC m=+0.088775229 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, version=17.1.12, release=1761123044, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi) Nov 28 03:32:38 localhost podman[82716]: 2025-11-28 08:32:38.906898253 +0000 UTC m=+0.140841866 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, version=17.1.12, url=https://www.redhat.com, release=1761123044, config_id=tripleo_step4, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 28 03:32:38 localhost podman[82718]: 2025-11-28 08:32:38.956571788 +0000 UTC m=+0.188869650 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, 
version=17.1.12, container_name=logrotate_crond, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step4, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container) Nov 28 03:32:38 localhost podman[82719]: 2025-11-28 08:32:38.960269112 +0000 UTC 
m=+0.188877062 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_id=tripleo_step4, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:32:38 localhost podman[82718]: 2025-11-28 08:32:38.97040527 +0000 UTC m=+0.202703152 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, container_name=logrotate_crond, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1) Nov 28 03:32:38 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 03:32:38 localhost podman[82716]: 2025-11-28 08:32:38.983121858 +0000 UTC m=+0.217065431 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute) Nov 28 03:32:38 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:32:38 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:32:39 localhost podman[82717]: 2025-11-28 08:32:39.06974023 +0000 UTC m=+0.302864268 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, batch=17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-type=git, io.buildah.version=1.41.4) Nov 28 03:32:39 localhost podman[82717]: 2025-11-28 08:32:39.443217639 +0000 UTC m=+0.676341697 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1761123044, 
konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true) Nov 28 03:32:39 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:32:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:32:40 localhost systemd[1]: tmp-crun.yBKgeH.mount: Deactivated successfully. Nov 28 03:32:40 localhost podman[82815]: 2025-11-28 08:32:40.854088014 +0000 UTC m=+0.093220774 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, version=17.1.12, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1761123044, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, 
io.buildah.version=1.41.4, config_id=tripleo_step5) Nov 28 03:32:40 localhost podman[82815]: 2025-11-28 08:32:40.884452079 +0000 UTC m=+0.123584829 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vcs-type=git) Nov 28 03:32:40 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:32:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:32:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:32:45 localhost systemd[1]: tmp-crun.9W834D.mount: Deactivated successfully. 
Nov 28 03:32:45 localhost podman[82841]: 2025-11-28 08:32:45.866998805 +0000 UTC m=+0.095464453 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, batch=17.1_20251118.1) Nov 28 03:32:45 localhost podman[82842]: 2025-11-28 08:32:45.909733458 +0000 UTC m=+0.135596856 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step4, batch=17.1_20251118.1, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, 
url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:32:45 localhost podman[82841]: 2025-11-28 08:32:45.918401032 +0000 UTC m=+0.146866690 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, 
description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=) Nov 28 03:32:45 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:32:45 localhost podman[82842]: 2025-11-28 08:32:45.963487737 +0000 UTC m=+0.189351155 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, tcib_managed=true) Nov 28 03:32:45 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:33:04 localhost systemd-logind[764]: Existing logind session ID 28 used by new audit session, ignoring. Nov 28 03:33:04 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 28 03:33:04 localhost systemd[1]: Created slice User Slice of UID 0. Nov 28 03:33:04 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Nov 28 03:33:04 localhost recover_tripleo_nova_virtqemud[83311]: 61397 Nov 28 03:33:04 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 28 03:33:04 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 28 03:33:04 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Nov 28 03:33:04 localhost systemd[1]: Starting User Manager for UID 0... Nov 28 03:33:04 localhost systemd[83313]: Queued start job for default target Main User Target. Nov 28 03:33:04 localhost systemd[83313]: Created slice User Application Slice. Nov 28 03:33:04 localhost systemd[83313]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Nov 28 03:33:04 localhost systemd[83313]: Started Daily Cleanup of User's Temporary Directories. 
Nov 28 03:33:04 localhost systemd[83313]: Reached target Paths. Nov 28 03:33:04 localhost systemd[83313]: Reached target Timers. Nov 28 03:33:04 localhost systemd[83313]: Starting D-Bus User Message Bus Socket... Nov 28 03:33:04 localhost systemd[83313]: Starting Create User's Volatile Files and Directories... Nov 28 03:33:04 localhost systemd[83313]: Finished Create User's Volatile Files and Directories. Nov 28 03:33:04 localhost systemd[83313]: Listening on D-Bus User Message Bus Socket. Nov 28 03:33:04 localhost systemd[83313]: Reached target Sockets. Nov 28 03:33:04 localhost systemd[83313]: Reached target Basic System. Nov 28 03:33:04 localhost systemd[83313]: Reached target Main User Target. Nov 28 03:33:04 localhost systemd[83313]: Startup finished in 151ms. Nov 28 03:33:04 localhost systemd[1]: Started User Manager for UID 0. Nov 28 03:33:04 localhost systemd[1]: Started Session c11 of User root. Nov 28 03:33:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:33:05 localhost systemd[1]: tmp-crun.LRZehy.mount: Deactivated successfully. Nov 28 03:33:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. 
Nov 28 03:33:05 localhost podman[83328]: 2025-11-28 08:33:05.031207357 +0000 UTC m=+0.107135617 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, vcs-type=git, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 28 03:33:05 localhost podman[83328]: 2025-11-28 08:33:05.073550459 +0000 UTC m=+0.149478769 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=iscsid, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, release=1761123044, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 28 03:33:05 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 03:33:05 localhost podman[83345]: 2025-11-28 08:33:05.131506706 +0000 UTC m=+0.091044448 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team) Nov 28 03:33:05 localhost podman[83345]: 2025-11-28 08:33:05.143301766 +0000 UTC m=+0.102839498 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, release=1761123044, architecture=x86_64, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 
'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc.) Nov 28 03:33:05 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:33:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:33:05 localhost podman[83370]: 2025-11-28 08:33:05.692623978 +0000 UTC m=+0.092027188 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., config_id=tripleo_step1, container_name=metrics_qdr, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:33:05 localhost podman[83370]: 2025-11-28 08:33:05.909386607 +0000 UTC m=+0.308789747 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd) Nov 28 03:33:05 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:33:06 localhost kernel: tun: Universal TUN/TAP device driver, 1.6 Nov 28 03:33:06 localhost kernel: device tap09612b07-51 entered promiscuous mode Nov 28 03:33:06 localhost NetworkManager[5967]: [1764318786.0180] manager: (tap09612b07-51): new Tun device (/org/freedesktop/NetworkManager/Devices/13) Nov 28 03:33:06 localhost systemd-udevd[83414]: Network interface NamePolicy= disabled on kernel command line. 
Nov 28 03:33:06 localhost NetworkManager[5967]: [1764318786.0377] device (tap09612b07-51): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Nov 28 03:33:06 localhost NetworkManager[5967]: [1764318786.0382] device (tap09612b07-51): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external') Nov 28 03:33:06 localhost systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Nov 28 03:33:06 localhost systemd[1]: Starting Virtual Machine and Container Registration Service... Nov 28 03:33:06 localhost systemd[1]: Started Virtual Machine and Container Registration Service. Nov 28 03:33:06 localhost systemd-machined[83422]: New machine qemu-1-instance-00000002. Nov 28 03:33:06 localhost systemd[1]: Started Virtual Machine qemu-1-instance-00000002. Nov 28 03:33:06 localhost NetworkManager[5967]: [1764318786.3216] manager: (tap40d5da59-60): new Veth device (/org/freedesktop/NetworkManager/Devices/14) Nov 28 03:33:06 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap40d5da59-61: link becomes ready Nov 28 03:33:06 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap40d5da59-60: link becomes ready Nov 28 03:33:06 localhost NetworkManager[5967]: [1764318786.3841] device (tap40d5da59-60): carrier: link connected Nov 28 03:33:06 localhost kernel: device tap40d5da59-60 entered promiscuous mode Nov 28 03:33:08 localhost podman[83546]: 2025-11-28 08:33:08.216968179 +0000 UTC m=+0.096365960 container create 9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Nov 28 03:33:08 localhost systemd[1]: Started libpod-conmon-9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef.scope. Nov 28 03:33:08 localhost podman[83546]: 2025-11-28 08:33:08.172621116 +0000 UTC m=+0.052018947 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Nov 28 03:33:08 localhost systemd[1]: tmp-crun.jr6fta.mount: Deactivated successfully. Nov 28 03:33:08 localhost systemd[1]: Started libcrun container. Nov 28 03:33:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b264a93705d5a28ba8f902d268499c1bea32890d992fb54a7c6890490d1eeb3f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 03:33:08 localhost systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs... 
Nov 28 03:33:08 localhost podman[83546]: 2025-11-28 08:33:08.36620294 +0000 UTC m=+0.245600741 container init 9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, distribution-scope=public, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 28 03:33:08 localhost podman[83546]: 2025-11-28 08:33:08.373167953 +0000 UTC m=+0.252565764 container start 9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc.) Nov 28 03:33:08 localhost systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs. Nov 28 03:33:08 localhost systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged. Nov 28 03:33:08 localhost systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service. Nov 28 03:33:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:33:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:33:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. 
Nov 28 03:33:09 localhost podman[83578]: 2025-11-28 08:33:09.083006789 +0000 UTC m=+0.071796840 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, batch=17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-18T22:49:32Z) Nov 28 03:33:09 localhost podman[83580]: 2025-11-28 08:33:09.154906062 +0000 UTC m=+0.137292488 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, container_name=ceilometer_agent_compute) Nov 28 03:33:09 localhost podman[83578]: 2025-11-28 08:33:09.173385876 +0000 UTC m=+0.162175957 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, config_id=tripleo_step4, container_name=logrotate_crond, io.openshift.expose-services=, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc.) Nov 28 03:33:09 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:33:09 localhost podman[83579]: 2025-11-28 08:33:09.207182587 +0000 UTC m=+0.191695158 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, tcib_managed=true, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible) Nov 28 03:33:09 localhost podman[83580]: 2025-11-28 08:33:09.228124255 +0000 UTC m=+0.210510631 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:33:09 localhost podman[83579]: 2025-11-28 08:33:09.23487334 +0000 UTC m=+0.219385971 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, architecture=x86_64, config_id=tripleo_step4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 28 03:33:09 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:33:09 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:33:09 localhost setroubleshoot[83564]: SELinux is preventing /usr/libexec/qemu-kvm from read access on the file max_map_count. For complete SELinux messages run: sealert -l 96d97920-1546-4f45-b9c9-d0d51c7a6a1d Nov 28 03:33:09 localhost setroubleshoot[83564]: SELinux is preventing /usr/libexec/qemu-kvm from read access on the file max_map_count.

***** Plugin qemu_file_image (98.8 confidence) suggests *******************

If max_map_count is a virtualization target
Then you need to change the label on max_map_count'
Do
# semanage fcontext -a -t virt_image_t 'max_map_count'
# restorecon -v 'max_map_count'

***** Plugin catchall (2.13 confidence) suggests **************************

If you believe that qemu-kvm should be allowed read access on the max_map_count file by default.
Then you should report this as a bug.
You can generate a local policy module to allow this access.
Do
allow this access for now by executing:
# ausearch -c 'qemu-kvm' --raw | audit2allow -M my-qemukvm
# semodule -X 300 -i my-qemukvm.pp

Nov 28 03:33:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:33:09 localhost systemd[1]: tmp-crun.YFp2kM.mount: Deactivated successfully.
Nov 28 03:33:09 localhost podman[83650]: 2025-11-28 08:33:09.854560809 +0000 UTC m=+0.093271246 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=nova_migration_target, release=1761123044, tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git) Nov 28 03:33:10 localhost podman[83650]: 2025-11-28 08:33:10.284730256 +0000 UTC m=+0.523440713 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.12, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:33:10 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:33:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:33:11 localhost systemd[1]: tmp-crun.8RvJjY.mount: Deactivated successfully. 
Nov 28 03:33:11 localhost podman[83673]: 2025-11-28 08:33:11.837390785 +0000 UTC m=+0.073524553 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:33:11 localhost podman[83673]: 2025-11-28 08:33:11.866341819 +0000 UTC m=+0.102475607 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, 
config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, release=1761123044, distribution-scope=public, name=rhosp17/openstack-nova-compute, vcs-type=git, 
tcib_managed=true, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 03:33:11 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:33:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:33:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:33:16 localhost systemd[1]: tmp-crun.Ayd8YV.mount: Deactivated successfully. 
Nov 28 03:33:16 localhost podman[83703]: 2025-11-28 08:33:16.860514968 +0000 UTC m=+0.094355728 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1761123044, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_id=tripleo_step4, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc.) Nov 28 03:33:16 localhost systemd[1]: tmp-crun.SWxAQw.mount: Deactivated successfully. 
Nov 28 03:33:16 localhost podman[83703]: 2025-11-28 08:33:16.909874333 +0000 UTC m=+0.143715093 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:33:16 localhost podman[83704]: 2025-11-28 08:33:16.918569518 +0000 UTC m=+0.148002815 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': 
'/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller) Nov 28 03:33:16 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. 
Nov 28 03:33:16 localhost podman[83704]: 2025-11-28 08:33:16.969393548 +0000 UTC m=+0.198826845 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Nov 28 03:33:16 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:33:19 localhost systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully. Nov 28 03:33:19 localhost systemd[1]: setroubleshootd.service: Deactivated successfully. Nov 28 03:33:25 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58086 [28/Nov/2025:08:33:24.020] listener listener/metadata 0/0/0/1251/1251 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Nov 28 03:33:25 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58092 [28/Nov/2025:08:33:25.351] listener listener/metadata 0/0/0/13/13 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1" Nov 28 03:33:25 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58102 [28/Nov/2025:08:33:25.410] listener listener/metadata 0/0/0/14/14 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Nov 28 03:33:25 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58108 [28/Nov/2025:08:33:25.465] listener listener/metadata 0/0/0/13/13 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" Nov 28 03:33:25 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58114 [28/Nov/2025:08:33:25.516] listener listener/metadata 0/0/0/12/12 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1" Nov 28 03:33:25 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58120 
[28/Nov/2025:08:33:25.569] listener listener/metadata 0/0/0/13/13 200 133 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" Nov 28 03:33:25 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58122 [28/Nov/2025:08:33:25.621] listener listener/metadata 0/0/0/11/11 200 134 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" Nov 28 03:33:25 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58138 [28/Nov/2025:08:33:25.683] listener listener/metadata 0/0/0/12/12 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1" Nov 28 03:33:25 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58148 [28/Nov/2025:08:33:25.735] listener listener/metadata 0/0/0/13/13 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" Nov 28 03:33:25 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58156 [28/Nov/2025:08:33:25.791] listener listener/metadata 0/0/0/13/13 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1" Nov 28 03:33:25 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58166 [28/Nov/2025:08:33:25.845] listener listener/metadata 0/0/0/13/13 200 139 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" Nov 28 03:33:25 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58170 [28/Nov/2025:08:33:25.887] listener listener/metadata 0/0/0/12/12 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" Nov 28 03:33:25 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58180 [28/Nov/2025:08:33:25.929] listener listener/metadata 0/0/0/13/13 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1" Nov 
28 03:33:26 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58190 [28/Nov/2025:08:33:26.012] listener listener/metadata 0/0/0/13/13 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" Nov 28 03:33:26 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58206 [28/Nov/2025:08:33:26.077] listener listener/metadata 0/0/0/12/12 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" Nov 28 03:33:26 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[83569]: 192.168.0.142:58214 [28/Nov/2025:08:33:26.129] listener listener/metadata 0/0/0/14/14 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" Nov 28 03:33:27 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0. Nov 28 03:33:31 localhost sshd[83753]: main: sshd: ssh-rsa algorithm is disabled Nov 28 03:33:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:33:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. 
Nov 28 03:33:35 localhost podman[83833]: 2025-11-28 08:33:35.641587185 +0000 UTC m=+0.089967345 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_id=tripleo_step3, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., 
com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1) Nov 28 03:33:35 localhost podman[83833]: 2025-11-28 08:33:35.653460937 +0000 UTC m=+0.101841087 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, container_name=collectd, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc.) Nov 28 03:33:35 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:33:35 localhost systemd[1]: tmp-crun.C9GXzB.mount: Deactivated successfully. 
Nov 28 03:33:35 localhost podman[83832]: 2025-11-28 08:33:35.707077751 +0000 UTC m=+0.155356268 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, 
konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, vcs-type=git, version=17.1.12, container_name=iscsid, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044) Nov 28 03:33:35 localhost podman[83832]: 2025-11-28 08:33:35.745532725 +0000 UTC m=+0.193811242 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, vcs-type=git, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1761123044, com.redhat.component=openstack-iscsid-container, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public) Nov 28 03:33:35 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:33:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:33:36 localhost systemd[1]: tmp-crun.5sk5ce.mount: Deactivated successfully. 
Nov 28 03:33:36 localhost podman[83871]: 2025-11-28 08:33:36.856674669 +0000 UTC m=+0.090767179 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, distribution-scope=public, version=17.1.12, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:33:37 localhost podman[83871]: 2025-11-28 08:33:37.072592704 +0000 UTC m=+0.306685194 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, release=1761123044) Nov 28 03:33:37 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:33:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:33:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:33:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:33:39 localhost systemd[1]: tmp-crun.qh4S22.mount: Deactivated successfully. 
Nov 28 03:33:39 localhost podman[83902]: 2025-11-28 08:33:39.857582534 +0000 UTC m=+0.092210043 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, batch=17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, url=https://www.redhat.com) Nov 28 03:33:39 localhost podman[83902]: 2025-11-28 08:33:39.890558739 +0000 UTC m=+0.125186248 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, architecture=x86_64, vcs-type=git, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, distribution-scope=public) Nov 28 03:33:39 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:33:39 localhost podman[83903]: 2025-11-28 08:33:39.908968691 +0000 UTC m=+0.140594038 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, version=17.1.12, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi) Nov 28 03:33:40 localhost podman[83901]: 2025-11-28 08:33:39.999638266 +0000 UTC m=+0.236674948 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible) Nov 28 03:33:40 localhost podman[83903]: 2025-11-28 08:33:40.017663146 +0000 UTC m=+0.249288493 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, version=17.1.12, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container) Nov 28 03:33:40 localhost systemd[1]: 
f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:33:40 localhost podman[83901]: 2025-11-28 08:33:40.062490493 +0000 UTC m=+0.299527145 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, release=1761123044, distribution-scope=public, architecture=x86_64) Nov 28 03:33:40 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:33:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. 
Nov 28 03:33:40 localhost podman[83974]: 2025-11-28 08:33:40.845104658 +0000 UTC m=+0.078868036 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, architecture=x86_64, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 03:33:41 localhost podman[83974]: 2025-11-28 08:33:41.243476097 +0000 UTC m=+0.477239485 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:33:41 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:33:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:33:42 localhost systemd[1]: tmp-crun.d7AKF0.mount: Deactivated successfully. 
Nov 28 03:33:42 localhost podman[83998]: 2025-11-28 08:33:42.845096009 +0000 UTC m=+0.085291001 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, release=1761123044, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute) Nov 28 03:33:42 localhost podman[83998]: 2025-11-28 08:33:42.895338762 +0000 UTC m=+0.135533754 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, container_name=nova_compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible) Nov 28 03:33:42 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:33:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:33:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:33:47 localhost podman[84024]: 2025-11-28 08:33:47.843595059 +0000 UTC m=+0.077582706 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc.) Nov 28 03:33:47 localhost systemd[1]: tmp-crun.yinZW2.mount: Deactivated successfully. 
Nov 28 03:33:47 localhost podman[84025]: 2025-11-28 08:33:47.917277037 +0000 UTC m=+0.146622783 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, distribution-scope=public, release=1761123044, managed_by=tripleo_ansible, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:33:47 localhost podman[84025]: 2025-11-28 08:33:47.945526238 +0000 UTC m=+0.174872004 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-type=git, build-date=2025-11-18T23:34:05Z, 
batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 28 03:33:47 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:33:47 localhost podman[84024]: 2025-11-28 08:33:47.971622014 +0000 UTC m=+0.205609631 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, container_name=ovn_metadata_agent, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, maintainer=OpenStack TripleO 
Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible) Nov 28 03:33:47 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. 
Nov 28 03:33:58 localhost snmpd[66832]: empty variable list in _query Nov 28 03:33:58 localhost snmpd[66832]: empty variable list in _query Nov 28 03:34:05 localhost sshd[84098]: main: sshd: ssh-rsa algorithm is disabled Nov 28 03:34:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:34:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:34:05 localhost systemd[1]: tmp-crun.6leuK8.mount: Deactivated successfully. Nov 28 03:34:05 localhost podman[84119]: 2025-11-28 08:34:05.868418746 +0000 UTC m=+0.100436064 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd) Nov 28 03:34:05 localhost podman[84119]: 2025-11-28 08:34:05.88396729 +0000 UTC m=+0.115984658 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, 
container_name=collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, release=1761123044, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd) Nov 28 03:34:05 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:34:05 localhost podman[84132]: 2025-11-28 08:34:05.937604735 +0000 UTC m=+0.081824586 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, 
distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Nov 28 03:34:05 localhost podman[84132]: 2025-11-28 08:34:05.97348307 +0000 UTC m=+0.117702901 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, container_name=iscsid, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 28 03:34:05 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:34:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:34:07 localhost podman[84160]: 2025-11-28 08:34:07.851645416 +0000 UTC m=+0.086654934 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, distribution-scope=public, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, version=17.1.12, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:34:08 localhost podman[84160]: 2025-11-28 08:34:08.046633802 +0000 UTC m=+0.281643280 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 28 03:34:08 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:34:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:34:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:34:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. 
Nov 28 03:34:10 localhost podman[84189]: 2025-11-28 08:34:10.855592292 +0000 UTC m=+0.090232653 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible) Nov 28 03:34:10 localhost systemd[1]: tmp-crun.eASN3L.mount: Deactivated successfully. Nov 28 03:34:10 localhost podman[84188]: 2025-11-28 08:34:10.915825429 +0000 UTC m=+0.153628985 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, release=1761123044, architecture=x86_64, container_name=ceilometer_agent_compute, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, config_id=tripleo_step4, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:34:10 localhost podman[84189]: 2025-11-28 08:34:10.921355467 +0000 UTC m=+0.155995827 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, io.buildah.version=1.41.4, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.) 
Nov 28 03:34:10 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:34:10 localhost podman[84188]: 2025-11-28 08:34:10.946214676 +0000 UTC m=+0.184018252 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, tcib_managed=true) Nov 28 03:34:10 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:34:11 localhost podman[84190]: 2025-11-28 08:34:11.009164686 +0000 UTC m=+0.241056022 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 28 03:34:11 localhost podman[84190]: 2025-11-28 08:34:11.067696361 +0000 UTC m=+0.299587647 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, version=17.1.12, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., distribution-scope=public) Nov 28 03:34:11 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:34:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:34:11 localhost podman[84259]: 2025-11-28 08:34:11.85075059 +0000 UTC m=+0.085289401 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, version=17.1.12, distribution-scope=public, tcib_managed=true, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=) Nov 28 03:34:12 localhost podman[84259]: 2025-11-28 08:34:12.242460576 +0000 UTC m=+0.476999337 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, container_name=nova_migration_target, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible) Nov 28 03:34:12 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. 
Nov 28 03:34:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:34:13 localhost systemd[1]: tmp-crun.QusPHg.mount: Deactivated successfully. Nov 28 03:34:13 localhost podman[84283]: 2025-11-28 08:34:13.845009666 +0000 UTC m=+0.082331292 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, tcib_managed=true, vcs-type=git, container_name=nova_compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:34:13 localhost podman[84283]: 2025-11-28 08:34:13.90252218 +0000 UTC m=+0.139843756 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 
'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, 
batch=17.1_20251118.1, config_id=tripleo_step5, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public) Nov 28 03:34:13 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:34:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:34:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:34:18 localhost podman[84313]: 2025-11-28 08:34:18.847989583 +0000 UTC m=+0.090124429 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, distribution-scope=public, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com) Nov 28 03:34:18 localhost systemd[1]: tmp-crun.v7tumG.mount: Deactivated successfully. Nov 28 03:34:18 localhost podman[84313]: 2025-11-28 08:34:18.896696478 +0000 UTC m=+0.138831324 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, name=rhosp17/openstack-ovn-controller, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:34:18 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:34:18 localhost podman[84312]: 2025-11-28 08:34:18.90493191 +0000 UTC m=+0.146746206 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-19T00:14:25Z, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 28 03:34:18 localhost podman[84312]: 2025-11-28 08:34:18.98855951 +0000 UTC m=+0.230373826 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, container_name=ovn_metadata_agent, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=) Nov 28 03:34:19 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:34:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:34:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. 
Nov 28 03:34:36 localhost podman[84424]: 2025-11-28 08:34:36.86695362 +0000 UTC m=+0.099447763 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, 
config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, container_name=collectd, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:34:36 localhost podman[84423]: 2025-11-28 08:34:36.909311811 +0000 UTC m=+0.141799314 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Nov 28 03:34:36 localhost podman[84423]: 2025-11-28 08:34:36.920437771 +0000 UTC m=+0.152925314 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, release=1761123044, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, distribution-scope=public, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 
iscsid) Nov 28 03:34:36 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:34:36 localhost podman[84424]: 2025-11-28 08:34:36.960856844 +0000 UTC m=+0.193350937 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:34:36 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:34:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:34:38 localhost podman[84477]: 2025-11-28 08:34:38.843466795 +0000 UTC m=+0.081311571 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12) Nov 28 03:34:39 localhost podman[84477]: 2025-11-28 08:34:39.023371391 +0000 UTC m=+0.261216097 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:34:39 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:34:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:34:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:34:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:34:41 localhost systemd[1]: tmp-crun.pwiLgB.mount: Deactivated successfully. Nov 28 03:34:41 localhost systemd[1]: tmp-crun.fje9kv.mount: Deactivated successfully. 
Nov 28 03:34:41 localhost podman[84507]: 2025-11-28 08:34:41.904471641 +0000 UTC m=+0.142160755 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, url=https://www.redhat.com, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, name=rhosp17/openstack-cron, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 
17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1) Nov 28 03:34:41 localhost podman[84507]: 2025-11-28 08:34:41.914069865 +0000 UTC m=+0.151758979 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, config_id=tripleo_step4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:34:41 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:34:41 localhost podman[84506]: 2025-11-28 08:34:41.869123214 +0000 UTC m=+0.108227262 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, tcib_managed=true, release=1761123044, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com) Nov 28 03:34:42 localhost podman[84506]: 2025-11-28 08:34:42.005625756 +0000 UTC m=+0.244729754 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, vcs-type=git, container_name=ceilometer_agent_compute, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 
ceilometer-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true) Nov 28 03:34:42 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 03:34:42 localhost podman[84508]: 2025-11-28 08:34:42.058086336 +0000 UTC m=+0.289201740 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
managed_by=tripleo_ansible, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, batch=17.1_20251118.1, io.openshift.expose-services=) Nov 28 03:34:42 localhost podman[84508]: 2025-11-28 08:34:42.113451605 +0000 UTC m=+0.344566929 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, 
managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 28 03:34:42 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:34:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. 
Nov 28 03:34:42 localhost podman[84578]: 2025-11-28 08:34:42.837841444 +0000 UTC m=+0.077950837 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, container_name=nova_migration_target, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z) Nov 28 03:34:43 localhost podman[84578]: 2025-11-28 08:34:43.193713358 +0000 UTC m=+0.433822721 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-type=git, batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_id=tripleo_step4) Nov 28 03:34:43 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:34:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:34:44 localhost systemd[1]: tmp-crun.8j3sVr.mount: Deactivated successfully. 
Nov 28 03:34:44 localhost podman[84602]: 2025-11-28 08:34:44.843438126 +0000 UTC m=+0.076675758 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, container_name=nova_compute, release=1761123044) Nov 28 03:34:44 localhost podman[84602]: 2025-11-28 08:34:44.897738903 +0000 UTC m=+0.130976545 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 
'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
nova-compute, managed_by=tripleo_ansible, release=1761123044, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.) Nov 28 03:34:44 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:34:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:34:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:34:49 localhost podman[84630]: 2025-11-28 08:34:49.839087362 +0000 UTC m=+0.076276748 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Nov 28 03:34:49 localhost podman[84629]: 2025-11-28 08:34:49.897516244 +0000 UTC m=+0.137139834 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, build-date=2025-11-19T00:14:25Z, vendor=Red 
Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Nov 28 03:34:49 localhost podman[84630]: 2025-11-28 08:34:49.915796891 +0000 UTC m=+0.152986227 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, batch=17.1_20251118.1, 
name=rhosp17/openstack-ovn-controller, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.buildah.version=1.41.4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public) Nov 28 03:34:49 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:34:49 localhost podman[84629]: 2025-11-28 08:34:49.970461958 +0000 UTC m=+0.210085578 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 28 03:34:49 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:35:05 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 28 03:35:05 localhost recover_tripleo_nova_virtqemud[84695]: 61397 Nov 28 03:35:05 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. 
Nov 28 03:35:05 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 28 03:35:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:35:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:35:07 localhost podman[84721]: 2025-11-28 08:35:07.848292329 +0000 UTC m=+0.080566903 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, managed_by=tripleo_ansible, release=1761123044, architecture=x86_64, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, vcs-type=git) Nov 28 03:35:07 localhost podman[84721]: 2025-11-28 08:35:07.862390811 +0000 UTC m=+0.094665385 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, container_name=iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid) Nov 28 03:35:07 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 03:35:07 localhost podman[84722]: 2025-11-28 08:35:07.953592357 +0000 UTC m=+0.186678697 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=collectd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Nov 28 03:35:07 localhost podman[84722]: 2025-11-28 08:35:07.988547191 +0000 UTC m=+0.221633531 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.buildah.version=1.41.4, config_id=tripleo_step3) Nov 28 03:35:08 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:35:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:35:09 localhost systemd[1]: tmp-crun.S6rCgt.mount: Deactivated successfully. 
Nov 28 03:35:09 localhost podman[84760]: 2025-11-28 08:35:09.858462314 +0000 UTC m=+0.095017147 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., release=1761123044, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 28 03:35:10 localhost podman[84760]: 2025-11-28 08:35:10.043759186 +0000 UTC m=+0.280313989 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, version=17.1.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Nov 28 03:35:10 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:35:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:35:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:35:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. 
Nov 28 03:35:12 localhost podman[84791]: 2025-11-28 08:35:12.871856082 +0000 UTC m=+0.104930918 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, release=1761123044, batch=17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, 
container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 28 03:35:12 localhost systemd[1]: tmp-crun.NgTO4n.mount: Deactivated successfully. Nov 28 03:35:12 localhost podman[84793]: 2025-11-28 08:35:12.910199631 +0000 UTC m=+0.139427377 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, architecture=x86_64, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4) Nov 28 03:35:12 localhost podman[84791]: 2025-11-28 08:35:12.928562287 +0000 UTC m=+0.161637093 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, release=1761123044, maintainer=OpenStack TripleO Team, 
build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com) Nov 28 03:35:12 localhost systemd[1]: tmp-crun.kywFKN.mount: Deactivated successfully. Nov 28 03:35:12 localhost podman[84792]: 2025-11-28 08:35:12.966146683 +0000 UTC m=+0.197076941 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, tcib_managed=true, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.buildah.version=1.41.4, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Nov 28 03:35:12 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 03:35:12 localhost podman[84793]: 2025-11-28 08:35:12.993262763 +0000 UTC m=+0.222490479 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 28 03:35:13 localhost podman[84792]: 2025-11-28 08:35:12.99988686 +0000 UTC m=+0.230817098 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, batch=17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step4, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=) Nov 28 03:35:13 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:35:13 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:35:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. 
Nov 28 03:35:13 localhost podman[84863]: 2025-11-28 08:35:13.84014064 +0000 UTC m=+0.079011025 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com) Nov 28 03:35:14 localhost podman[84863]: 2025-11-28 08:35:14.177040509 +0000 UTC m=+0.415910924 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, url=https://www.redhat.com, io.buildah.version=1.41.4, distribution-scope=public, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': 
True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 28 03:35:14 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:35:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. 
Nov 28 03:35:15 localhost podman[84887]: 2025-11-28 08:35:15.840110564 +0000 UTC m=+0.078506579 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, version=17.1.12, vendor=Red Hat, Inc., batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git) Nov 28 03:35:15 localhost podman[84887]: 2025-11-28 08:35:15.89332831 +0000 UTC m=+0.131724315 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 
17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:35:15 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:35:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:35:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:35:20 localhost systemd[1]: tmp-crun.JE1Vr0.mount: Deactivated successfully. 
Nov 28 03:35:20 localhost podman[84916]: 2025-11-28 08:35:20.843373668 +0000 UTC m=+0.079015045 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, distribution-scope=public, config_id=tripleo_step4, 
url=https://www.redhat.com, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:35:20 localhost systemd[1]: tmp-crun.bfo6QS.mount: Deactivated successfully. Nov 28 03:35:20 localhost podman[84916]: 2025-11-28 08:35:20.899075743 +0000 UTC m=+0.134717070 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1761123044, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible) Nov 28 03:35:20 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:35:20 localhost podman[84915]: 2025-11-28 08:35:20.903359147 +0000 UTC m=+0.141733899 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, container_name=ovn_metadata_agent, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public) Nov 28 03:35:20 localhost podman[84915]: 2025-11-28 08:35:20.987393439 +0000 UTC m=+0.225768201 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, url=https://www.redhat.com, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team) Nov 28 03:35:20 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:35:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:35:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:35:38 localhost systemd[1]: tmp-crun.e6oo8H.mount: Deactivated successfully. 
Nov 28 03:35:38 localhost podman[85042]: 2025-11-28 08:35:38.806614956 +0000 UTC m=+0.069898729 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, container_name=iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, config_id=tripleo_step3, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:35:38 localhost podman[85043]: 2025-11-28 08:35:38.862714163 +0000 UTC m=+0.123206399 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step3, vcs-type=git, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., release=1761123044, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team) Nov 28 03:35:38 localhost podman[85043]: 2025-11-28 08:35:38.871877041 +0000 UTC m=+0.132369327 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, config_data={'cap_add': 
['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
distribution-scope=public, io.openshift.expose-services=, container_name=collectd, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:35:38 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:35:38 localhost podman[85042]: 2025-11-28 08:35:38.888552123 +0000 UTC m=+0.151835896 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.component=openstack-iscsid-container, release=1761123044, io.openshift.expose-services=, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, architecture=x86_64, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Nov 28 03:35:38 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:35:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:35:40 localhost podman[85081]: 2025-11-28 08:35:40.846383526 +0000 UTC m=+0.083700131 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., release=1761123044, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, container_name=metrics_qdr, version=17.1.12) Nov 28 03:35:41 localhost podman[85081]: 2025-11-28 08:35:41.034283801 +0000 UTC m=+0.271600416 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, vcs-type=git, version=17.1.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, io.openshift.expose-services=, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4) Nov 28 03:35:41 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:35:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:35:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:35:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. 
Nov 28 03:35:43 localhost podman[85110]: 2025-11-28 08:35:43.853212539 +0000 UTC m=+0.084767535 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, 
konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, release=1761123044, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 28 03:35:43 localhost systemd[1]: tmp-crun.S2hPWm.mount: Deactivated successfully. Nov 28 03:35:43 localhost podman[85111]: 2025-11-28 08:35:43.916566972 +0000 UTC m=+0.145163205 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, vcs-type=git, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, 
architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Nov 28 03:35:43 localhost podman[85110]: 2025-11-28 08:35:43.931253303 +0000 UTC m=+0.162808339 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-19T00:11:48Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:35:43 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:35:43 localhost podman[85111]: 2025-11-28 08:35:43.957487594 +0000 UTC m=+0.186083797 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1761123044, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:35:43 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:35:44 localhost podman[85112]: 2025-11-28 08:35:44.006288502 +0000 UTC m=+0.232871833 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, version=17.1.12, distribution-scope=public, release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4) Nov 28 03:35:44 localhost podman[85112]: 2025-11-28 08:35:44.037405337 +0000 UTC m=+0.263988658 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, architecture=x86_64, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, release=1761123044, config_id=tripleo_step4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, 
distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, version=17.1.12) Nov 28 03:35:44 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:35:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:35:44 localhost podman[85180]: 2025-11-28 08:35:44.84773424 +0000 UTC m=+0.082456453 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, distribution-scope=public, version=17.1.12, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:35:45 localhost podman[85180]: 2025-11-28 08:35:45.250505802 +0000 UTC m=+0.485228005 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1) Nov 28 03:35:45 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. 
Nov 28 03:35:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:35:46 localhost podman[85205]: 2025-11-28 08:35:46.841950184 +0000 UTC m=+0.075884647 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, version=17.1.12, release=1761123044, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:35:46 localhost podman[85205]: 2025-11-28 08:35:46.875793693 +0000 UTC m=+0.109728106 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack 
Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., tcib_managed=true, container_name=nova_compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_id=tripleo_step5, version=17.1.12, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:35:46 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:35:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:35:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:35:51 localhost podman[85231]: 2025-11-28 08:35:51.852606161 +0000 UTC m=+0.084503107 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, container_name=ovn_metadata_agent) Nov 28 03:35:51 localhost systemd[1]: tmp-crun.uEwlXM.mount: Deactivated successfully. 
Nov 28 03:35:51 localhost podman[85232]: 2025-11-28 08:35:51.905720114 +0000 UTC m=+0.135091481 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-type=git, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=) Nov 28 03:35:51 localhost podman[85232]: 2025-11-28 08:35:51.928144886 +0000 UTC m=+0.157516293 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., 
version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=) Nov 28 03:35:51 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:35:51 localhost podman[85231]: 2025-11-28 08:35:51.958452795 +0000 UTC m=+0.190349791 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:35:51 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. 
Nov 28 03:36:05 localhost sshd[85280]: main: sshd: ssh-rsa algorithm is disabled Nov 28 03:36:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:36:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:36:09 localhost podman[85327]: 2025-11-28 08:36:09.848081676 +0000 UTC m=+0.086694156 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, container_name=iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, distribution-scope=public, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:36:09 localhost podman[85327]: 2025-11-28 08:36:09.860413262 +0000 UTC m=+0.099025802 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, batch=17.1_20251118.1, vcs-type=git, config_id=tripleo_step3, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid) Nov 28 03:36:09 localhost systemd[1]: tmp-crun.PNkeJO.mount: Deactivated successfully. 
Nov 28 03:36:09 localhost podman[85328]: 2025-11-28 08:36:09.906541667 +0000 UTC m=+0.144075952 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, 
name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:36:09 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:36:09 localhost podman[85328]: 2025-11-28 08:36:09.967689452 +0000 UTC m=+0.205223747 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step3, tcib_managed=true, com.redhat.component=openstack-collectd-container, distribution-scope=public, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Nov 28 03:36:09 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. 
Nov 28 03:36:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:36:11 localhost podman[85365]: 2025-11-28 08:36:11.84868169 +0000 UTC m=+0.084631070 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, version=17.1.12, build-date=2025-11-18T22:49:46Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd) Nov 28 03:36:12 localhost podman[85365]: 2025-11-28 08:36:12.060272596 +0000 UTC m=+0.296221946 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, vendor=Red Hat, Inc., vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 
'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team) Nov 28 03:36:12 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:36:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:36:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:36:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. 
Nov 28 03:36:14 localhost systemd[1]: tmp-crun.I1u0EE.mount: Deactivated successfully. Nov 28 03:36:14 localhost systemd[1]: tmp-crun.1NfuDn.mount: Deactivated successfully. Nov 28 03:36:14 localhost podman[85395]: 2025-11-28 08:36:14.857989249 +0000 UTC m=+0.095468381 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, release=1761123044, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64) Nov 28 03:36:14 localhost podman[85395]: 2025-11-28 08:36:14.865104732 +0000 UTC m=+0.102583894 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container) Nov 28 03:36:14 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:36:14 localhost podman[85394]: 2025-11-28 08:36:14.830387725 +0000 UTC m=+0.073670748 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, io.buildah.version=1.41.4, 
vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, container_name=ceilometer_agent_compute) Nov 28 03:36:14 localhost podman[85394]: 2025-11-28 08:36:14.910523474 +0000 UTC m=+0.153806537 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.12, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 28 03:36:14 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 03:36:14 localhost podman[85397]: 2025-11-28 08:36:14.953331994 +0000 UTC m=+0.187140410 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2025-11-19T00:12:45Z, version=17.1.12, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container) Nov 28 03:36:14 localhost podman[85397]: 2025-11-28 08:36:14.979461683 +0000 UTC m=+0.213270069 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi) Nov 28 03:36:14 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:36:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. 
Nov 28 03:36:15 localhost podman[85469]: 2025-11-28 08:36:15.843082535 +0000 UTC m=+0.079335135 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, release=1761123044, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 
nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:36:16 localhost podman[85469]: 2025-11-28 08:36:16.238963401 +0000 UTC m=+0.475216031 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, version=17.1.12, architecture=x86_64, release=1761123044, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 03:36:16 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:36:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. 
Nov 28 03:36:17 localhost podman[85492]: 2025-11-28 08:36:17.848246712 +0000 UTC m=+0.086746747 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., version=17.1.12, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, release=1761123044, architecture=x86_64, distribution-scope=public, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 
'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Nov 28 03:36:17 localhost podman[85492]: 2025-11-28 08:36:17.881432982 +0000 UTC m=+0.119932967 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, 
batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, url=https://www.redhat.com) Nov 28 03:36:17 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:36:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:36:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:36:22 localhost systemd[1]: tmp-crun.VAYe1d.mount: Deactivated successfully. 
Nov 28 03:36:22 localhost podman[85518]: 2025-11-28 08:36:22.833155093 +0000 UTC m=+0.075348770 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vcs-type=git, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 28 03:36:22 localhost systemd[1]: tmp-crun.VWUuyc.mount: Deactivated successfully. Nov 28 03:36:22 localhost podman[85517]: 2025-11-28 08:36:22.874162527 +0000 UTC m=+0.118556943 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 28 03:36:22 localhost podman[85518]: 2025-11-28 08:36:22.881337822 +0000 UTC m=+0.123531449 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, architecture=x86_64, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1761123044, vcs-type=git, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com) Nov 28 03:36:22 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. 
Nov 28 03:36:22 localhost podman[85517]: 2025-11-28 08:36:22.916381529 +0000 UTC m=+0.160775935 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, container_name=ovn_metadata_agent, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, batch=17.1_20251118.1, vcs-type=git) Nov 28 03:36:22 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:36:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:36:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:36:40 localhost systemd[1]: tmp-crun.WghwZx.mount: Deactivated successfully. 
Nov 28 03:36:40 localhost podman[85641]: 2025-11-28 08:36:40.404336041 +0000 UTC m=+0.078375525 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3) Nov 28 03:36:40 localhost podman[85639]: 2025-11-28 08:36:40.465754748 +0000 UTC m=+0.140571337 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, release=1761123044, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, architecture=x86_64, version=17.1.12, container_name=iscsid, tcib_managed=true, batch=17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid) Nov 28 03:36:40 localhost podman[85641]: 2025-11-28 08:36:40.495167291 +0000 UTC m=+0.169206835 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, summary=Red Hat 
OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true) Nov 28 03:36:40 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:36:40 localhost podman[85639]: 2025-11-28 08:36:40.55275363 +0000 UTC m=+0.227570259 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, 
config_id=tripleo_step3, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, container_name=iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 28 03:36:40 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:36:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:36:42 localhost podman[85679]: 2025-11-28 08:36:42.848587622 +0000 UTC m=+0.087512849 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-type=git, container_name=metrics_qdr, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:36:43 localhost podman[85679]: 2025-11-28 08:36:43.035112455 +0000 UTC m=+0.274037692 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, release=1761123044, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12) Nov 28 03:36:43 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:36:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:36:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:36:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. 
Nov 28 03:36:45 localhost podman[85709]: 2025-11-28 08:36:45.840617387 +0000 UTC m=+0.080332517 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, version=17.1.12, tcib_managed=true, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, distribution-scope=public, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:36:45 localhost podman[85709]: 2025-11-28 08:36:45.893318164 +0000 UTC m=+0.133033294 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Nov 28 03:36:45 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:36:45 localhost systemd[1]: tmp-crun.xBUOpH.mount: Deactivated successfully. 
Nov 28 03:36:45 localhost podman[85711]: 2025-11-28 08:36:45.919333481 +0000 UTC m=+0.152894339 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, config_id=tripleo_step4) Nov 28 03:36:45 localhost podman[85711]: 2025-11-28 08:36:45.945463884 +0000 UTC m=+0.179024752 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, architecture=x86_64, 
io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc.) Nov 28 03:36:45 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 03:36:45 localhost podman[85710]: 2025-11-28 08:36:45.962673898 +0000 UTC m=+0.198115034 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Nov 28 03:36:45 localhost podman[85710]: 2025-11-28 08:36:45.976301621 +0000 UTC m=+0.211742787 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, 
Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, version=17.1.12) Nov 28 03:36:45 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:36:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. 
Nov 28 03:36:46 localhost podman[85780]: 2025-11-28 08:36:46.842572865 +0000 UTC m=+0.080032117 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1761123044, name=rhosp17/openstack-nova-compute, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:36:47 localhost podman[85780]: 2025-11-28 08:36:47.215351932 +0000 UTC m=+0.452811184 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, version=17.1.12, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 
'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, release=1761123044, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1) Nov 28 03:36:47 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:36:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. 
Nov 28 03:36:48 localhost podman[85803]: 2025-11-28 08:36:48.83742764 +0000 UTC m=+0.074480095 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:36:48 localhost podman[85803]: 2025-11-28 08:36:48.88957533 +0000 UTC m=+0.126627755 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, 
maintainer=OpenStack TripleO Team, release=1761123044, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc.) Nov 28 03:36:48 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:36:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:36:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:36:53 localhost podman[85830]: 2025-11-28 08:36:53.853547925 +0000 UTC m=+0.091728249 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:36:53 localhost podman[85829]: 2025-11-28 08:36:53.890224745 +0000 UTC m=+0.128240264 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2025-11-19T00:14:25Z, release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1) Nov 28 03:36:53 localhost podman[85830]: 2025-11-28 08:36:53.903433324 +0000 UTC m=+0.141613648 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, io.openshift.expose-services=, vcs-type=git, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 28 03:36:53 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. 
Nov 28 03:36:53 localhost podman[85829]: 2025-11-28 08:36:53.936406899 +0000 UTC m=+0.174422438 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn) Nov 28 03:36:53 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:37:05 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 28 03:37:05 localhost recover_tripleo_nova_virtqemud[85875]: 61397 Nov 28 03:37:05 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 28 03:37:05 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 28 03:37:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:37:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:37:10 localhost systemd[1]: tmp-crun.aDubYi.mount: Deactivated successfully. 
Nov 28 03:37:10 localhost podman[85920]: 2025-11-28 08:37:10.85739055 +0000 UTC m=+0.090239124 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, 
batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, container_name=collectd, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Nov 28 03:37:10 localhost systemd[1]: tmp-crun.oAg0GM.mount: Deactivated successfully. Nov 28 03:37:10 localhost podman[85919]: 2025-11-28 08:37:10.911775379 +0000 UTC m=+0.147325707 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible) Nov 28 03:37:10 localhost podman[85920]: 2025-11-28 08:37:10.917394304 +0000 UTC m=+0.150242868 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, distribution-scope=public, url=https://www.redhat.com) Nov 28 03:37:10 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:37:10 localhost podman[85919]: 2025-11-28 08:37:10.94722627 +0000 UTC m=+0.182776608 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step3, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:37:10 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:37:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:37:13 localhost podman[85960]: 2025-11-28 08:37:13.855014198 +0000 UTC m=+0.080416388 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_id=tripleo_step1, vendor=Red Hat, Inc., version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public) Nov 28 03:37:14 localhost podman[85960]: 2025-11-28 08:37:14.044811912 +0000 UTC m=+0.270214082 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, batch=17.1_20251118.1, container_name=metrics_qdr, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, architecture=x86_64, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com) Nov 28 03:37:14 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:37:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:37:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:37:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:37:16 localhost systemd[1]: tmp-crun.Z9Mkgq.mount: Deactivated successfully. Nov 28 03:37:16 localhost systemd[1]: tmp-crun.QSqJxM.mount: Deactivated successfully. 
Nov 28 03:37:16 localhost podman[85990]: 2025-11-28 08:37:16.912389622 +0000 UTC m=+0.140747113 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, konflux.additional-tags=17.1.12 
17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, distribution-scope=public, config_id=tripleo_step4, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron) Nov 28 03:37:16 localhost podman[85990]: 2025-11-28 08:37:16.919265396 +0000 UTC m=+0.147622847 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_id=tripleo_step4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 
'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:37:16 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:37:16 localhost podman[85989]: 2025-11-28 08:37:16.999089925 +0000 UTC m=+0.230442798 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 
17.1 ceilometer-compute, config_id=tripleo_step4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public) Nov 28 03:37:17 localhost podman[85991]: 2025-11-28 08:37:16.876366774 +0000 UTC m=+0.103354221 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi) Nov 28 03:37:17 localhost podman[85989]: 2025-11-28 08:37:17.031234803 +0000 UTC m=+0.262587676 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, version=17.1.12, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ceilometer-compute) Nov 28 03:37:17 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:37:17 localhost podman[85991]: 2025-11-28 08:37:17.060768531 +0000 UTC m=+0.287755938 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, release=1761123044, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 28 03:37:17 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:37:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. 
Nov 28 03:37:17 localhost podman[86061]: 2025-11-28 08:37:17.87408345 +0000 UTC m=+0.069992996 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, url=https://www.redhat.com, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:37:18 localhost podman[86061]: 2025-11-28 08:37:18.238308272 +0000 UTC m=+0.434217818 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, version=17.1.12, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:37:18 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:37:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:37:19 localhost systemd[1]: tmp-crun.RCJsnB.mount: Deactivated successfully. 
Nov 28 03:37:19 localhost podman[86085]: 2025-11-28 08:37:19.853409193 +0000 UTC m=+0.092811734 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, version=17.1.12, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true) Nov 28 03:37:19 localhost podman[86085]: 2025-11-28 08:37:19.905562862 +0000 UTC m=+0.144965393 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, architecture=x86_64, 
config_id=tripleo_step5, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:37:19 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:37:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:37:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:37:24 localhost podman[86111]: 2025-11-28 08:37:24.842778572 +0000 UTC m=+0.080004216 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, tcib_managed=true, container_name=ovn_metadata_agent, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 28 03:37:24 localhost podman[86112]: 2025-11-28 08:37:24.896599183 +0000 UTC m=+0.131017480 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, container_name=ovn_controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, name=rhosp17/openstack-ovn-controller, architecture=x86_64, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, version=17.1.12, batch=17.1_20251118.1) Nov 28 03:37:24 localhost podman[86111]: 2025-11-28 08:37:24.905414127 +0000 UTC m=+0.142639751 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, release=1761123044, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64) Nov 28 03:37:24 localhost podman[86112]: 2025-11-28 08:37:24.91356343 +0000 UTC m=+0.147981707 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, tcib_managed=true, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 
'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=ovn_controller) Nov 28 03:37:24 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:37:24 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:37:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. 
Nov 28 03:37:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:37:41 localhost podman[86219]: 2025-11-28 08:37:41.86929636 +0000 UTC m=+0.099567213 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, distribution-scope=public) Nov 28 03:37:41 localhost systemd[1]: tmp-crun.kxMEaU.mount: Deactivated successfully. 
Nov 28 03:37:41 localhost podman[86218]: 2025-11-28 08:37:41.927194758 +0000 UTC m=+0.156934005 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4) Nov 28 03:37:41 localhost podman[86219]: 2025-11-28 08:37:41.932228925 +0000 UTC m=+0.162499758 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Nov 28 03:37:41 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. 
Nov 28 03:37:41 localhost podman[86218]: 2025-11-28 08:37:41.965421356 +0000 UTC m=+0.195160553 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, 
container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 28 03:37:41 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:37:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:37:44 localhost systemd[1]: tmp-crun.0edv5f.mount: Deactivated successfully. Nov 28 03:37:44 localhost podman[86273]: 2025-11-28 08:37:44.832062555 +0000 UTC m=+0.073303287 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, version=17.1.12, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, release=1761123044, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:37:45 localhost podman[86273]: 2025-11-28 08:37:45.025311177 +0000 UTC m=+0.266551879 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack 
Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd) Nov 28 03:37:45 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:37:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:37:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:37:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:37:47 localhost systemd[1]: tmp-crun.aUMGuR.mount: Deactivated successfully. Nov 28 03:37:47 localhost podman[86303]: 2025-11-28 08:37:47.846563407 +0000 UTC m=+0.076564288 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-type=git, config_id=tripleo_step4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, version=17.1.12, io.buildah.version=1.41.4, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container) Nov 28 03:37:47 localhost podman[86303]: 2025-11-28 08:37:47.858334623 +0000 UTC m=+0.088335534 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, 
build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, architecture=x86_64, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, container_name=logrotate_crond, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com) Nov 28 03:37:47 localhost systemd[1]: 
bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:37:47 localhost podman[86302]: 2025-11-28 08:37:47.910983668 +0000 UTC m=+0.141875028 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Nov 28 03:37:47 localhost podman[86302]: 2025-11-28 08:37:47.964697986 +0000 UTC m=+0.195589326 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, url=https://www.redhat.com) Nov 28 03:37:47 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 03:37:48 localhost podman[86304]: 2025-11-28 08:37:47.96898769 +0000 UTC m=+0.193719368 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
container_name=ceilometer_agent_ipmi, version=17.1.12, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1) Nov 28 03:37:48 localhost podman[86304]: 2025-11-28 08:37:48.053411712 +0000 UTC m=+0.278143410 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, distribution-scope=public, release=1761123044, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-type=git, maintainer=OpenStack TripleO Team) Nov 28 03:37:48 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:37:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:37:48 localhost systemd[1]: tmp-crun.YB4Skz.mount: Deactivated successfully. 
Nov 28 03:37:48 localhost podman[86376]: 2025-11-28 08:37:48.843642564 +0000 UTC m=+0.082854155 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1761123044, io.buildah.version=1.41.4, version=17.1.12, architecture=x86_64, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, tcib_managed=true, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:37:49 localhost podman[86376]: 2025-11-28 08:37:49.21734095 +0000 UTC m=+0.456552491 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, tcib_managed=true, release=1761123044) Nov 28 03:37:49 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:37:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. 
Nov 28 03:37:50 localhost podman[86400]: 2025-11-28 08:37:50.840945525 +0000 UTC m=+0.078779617 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute) Nov 28 03:37:50 localhost podman[86400]: 2025-11-28 08:37:50.892849067 +0000 UTC m=+0.130683129 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, version=17.1.12, container_name=nova_compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044) Nov 28 03:37:50 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:37:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:37:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:37:55 localhost podman[86426]: 2025-11-28 08:37:55.859642273 +0000 UTC m=+0.087333723 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 28 03:37:55 localhost podman[86427]: 2025-11-28 08:37:55.91392213 +0000 UTC m=+0.138658768 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Nov 28 03:37:55 localhost podman[86426]: 2025-11-28 08:37:55.931618989 +0000 UTC m=+0.159310469 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, tcib_managed=true, distribution-scope=public, release=1761123044, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, config_id=tripleo_step4, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team) Nov 28 03:37:55 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:37:55 localhost podman[86427]: 2025-11-28 08:37:55.990868369 +0000 UTC m=+0.215605037 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, vcs-type=git, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Nov 28 03:37:56 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:38:05 localhost systemd[83313]: Created slice User Background Tasks Slice. Nov 28 03:38:05 localhost systemd[83313]: Starting Cleanup of User's Temporary Files and Directories... Nov 28 03:38:05 localhost systemd[83313]: Finished Cleanup of User's Temporary Files and Directories. Nov 28 03:38:12 localhost sshd[86519]: main: sshd: ssh-rsa algorithm is disabled Nov 28 03:38:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:38:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:38:12 localhost systemd[1]: tmp-crun.bRlNdJ.mount: Deactivated successfully. 
Nov 28 03:38:12 localhost podman[86522]: 2025-11-28 08:38:12.847355097 +0000 UTC m=+0.086409104 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., container_name=collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Nov 28 03:38:12 localhost podman[86522]: 2025-11-28 08:38:12.861247878 +0000 UTC m=+0.100301865 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=collectd, vcs-type=git, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd) Nov 28 03:38:12 localhost podman[86521]: 2025-11-28 08:38:12.88577417 +0000 UTC m=+0.124314011 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red 
Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, container_name=iscsid) Nov 28 03:38:12 localhost podman[86521]: 2025-11-28 08:38:12.896248356 +0000 UTC m=+0.134788187 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step3, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid) Nov 28 03:38:12 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:38:12 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:38:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:38:15 localhost podman[86562]: 2025-11-28 08:38:15.83870339 +0000 UTC m=+0.077818687 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step1, vendor=Red Hat, Inc., vcs-type=git, release=1761123044) Nov 28 03:38:16 localhost podman[86562]: 2025-11-28 08:38:16.028420102 +0000 UTC m=+0.267535399 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, 
maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Nov 28 03:38:16 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:38:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:38:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:38:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:38:18 localhost systemd[1]: tmp-crun.lNQiGh.mount: Deactivated successfully. 
Nov 28 03:38:18 localhost podman[86591]: 2025-11-28 08:38:18.855472823 +0000 UTC m=+0.095987362 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, distribution-scope=public, container_name=ceilometer_agent_compute) Nov 28 03:38:18 localhost podman[86591]: 2025-11-28 08:38:18.886393423 +0000 UTC m=+0.126907972 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git) Nov 28 03:38:18 localhost systemd[1]: tmp-crun.YBFHk4.mount: Deactivated successfully. Nov 28 03:38:18 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 03:38:18 localhost podman[86593]: 2025-11-28 08:38:18.915496158 +0000 UTC m=+0.149315889 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, version=17.1.12, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, vcs-type=git) Nov 28 03:38:18 localhost podman[86593]: 2025-11-28 08:38:18.953484447 +0000 UTC m=+0.187304218 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 28 03:38:18 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 03:38:19 localhost podman[86592]: 2025-11-28 08:38:18.957457181 +0000 UTC m=+0.194720879 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_id=tripleo_step4, 
release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team) Nov 28 03:38:19 localhost podman[86592]: 2025-11-28 08:38:19.038485838 +0000 UTC m=+0.275749536 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, io.openshift.expose-services=, batch=17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Nov 28 03:38:19 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:38:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. 
Nov 28 03:38:19 localhost podman[86662]: 2025-11-28 08:38:19.847228454 +0000 UTC m=+0.085904038 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, tcib_managed=true, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, version=17.1.12, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git) Nov 28 03:38:20 localhost podman[86662]: 2025-11-28 08:38:20.236512295 +0000 UTC m=+0.475187829 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, version=17.1.12, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com) Nov 28 03:38:20 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:38:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. 
Nov 28 03:38:21 localhost podman[86684]: 2025-11-28 08:38:21.837805678 +0000 UTC m=+0.076065203 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, batch=17.1_20251118.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public) Nov 28 03:38:21 localhost podman[86684]: 2025-11-28 08:38:21.87133465 +0000 UTC m=+0.109594205 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, release=1761123044, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:38:21 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:38:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:38:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:38:26 localhost systemd[1]: tmp-crun.loXtch.mount: Deactivated successfully. 
Nov 28 03:38:26 localhost podman[86710]: 2025-11-28 08:38:26.852992507 +0000 UTC m=+0.090771290 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, architecture=x86_64, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1) Nov 28 03:38:26 localhost podman[86711]: 2025-11-28 08:38:26.903990941 +0000 UTC m=+0.137346977 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=, container_name=ovn_controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, 
Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Nov 28 03:38:26 localhost podman[86710]: 2025-11-28 08:38:26.921387871 +0000 UTC m=+0.159166674 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 28 03:38:26 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:38:26 localhost podman[86711]: 2025-11-28 08:38:26.976937206 +0000 UTC m=+0.210293242 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.buildah.version=1.41.4, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., 
maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 28 03:38:26 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:38:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 28 03:38:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3000.1 total, 600.0 interval#012Cumulative writes: 4899 writes, 22K keys, 4899 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4899 writes, 608 syncs, 8.06 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 358 writes, 1243 keys, 358 commit groups, 1.0 writes per commit group, ingest: 1.48 MB, 0.00 MB/s#012Interval WAL: 358 writes, 149 syncs, 2.40 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 28 03:38:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 28 03:38:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3000.3 total, 600.0 interval#012Cumulative writes: 5616 writes, 25K keys, 5616 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 5616 writes, 758 syncs, 7.41 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative 
stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 586 writes, 2486 keys, 586 commit groups, 1.0 writes per commit group, ingest: 3.16 MB, 0.01 MB/s#012Interval WAL: 586 writes, 195 syncs, 3.01 writes per sync, written: 0.00 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 28 03:38:42 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 28 03:38:42 localhost recover_tripleo_nova_virtqemud[86773]: 61397 Nov 28 03:38:42 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 28 03:38:42 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 28 03:38:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:38:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:38:43 localhost podman[86821]: 2025-11-28 08:38:43.844486798 +0000 UTC m=+0.078057924 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., container_name=iscsid, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=) Nov 28 03:38:43 localhost podman[86821]: 2025-11-28 08:38:43.88638937 +0000 UTC m=+0.119960456 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, name=rhosp17/openstack-iscsid, architecture=x86_64, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, config_id=tripleo_step3, 
distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044) Nov 28 03:38:43 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:38:43 localhost podman[86822]: 2025-11-28 08:38:43.895739251 +0000 UTC m=+0.128826252 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., container_name=collectd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, url=https://www.redhat.com, io.buildah.version=1.41.4, version=17.1.12) Nov 28 03:38:43 localhost podman[86822]: 2025-11-28 08:38:43.980431151 +0000 UTC m=+0.213518152 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step3, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, container_name=collectd, managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd) Nov 28 03:38:43 localhost systemd[1]: 
2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:38:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:38:46 localhost systemd[1]: tmp-crun.XfXIrz.mount: Deactivated successfully. Nov 28 03:38:46 localhost podman[86874]: 2025-11-28 08:38:46.859864778 +0000 UTC m=+0.097903801 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.4, container_name=metrics_qdr, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z) Nov 28 03:38:47 localhost podman[86874]: 2025-11-28 08:38:47.056422013 +0000 UTC m=+0.294461036 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, architecture=x86_64, tcib_managed=true, vcs-type=git) Nov 28 03:38:47 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:38:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:38:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. 
Nov 28 03:38:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:38:49 localhost podman[86905]: 2025-11-28 08:38:49.867926231 +0000 UTC m=+0.101825354 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4) Nov 28 03:38:49 localhost systemd[1]: tmp-crun.KrrWjT.mount: Deactivated successfully. Nov 28 03:38:49 localhost podman[86903]: 2025-11-28 08:38:49.931612868 +0000 UTC m=+0.170776195 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:38:49 localhost podman[86903]: 2025-11-28 08:38:49.961063293 +0000 UTC m=+0.200226630 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, url=https://www.redhat.com) Nov 28 03:38:49 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:38:49 localhost podman[86904]: 2025-11-28 08:38:49.983735977 +0000 UTC m=+0.221060267 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, config_id=tripleo_step4, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, tcib_managed=true, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.) 
Nov 28 03:38:49 localhost podman[86904]: 2025-11-28 08:38:49.991203819 +0000 UTC m=+0.228528109 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.4, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, io.openshift.expose-services=) Nov 28 03:38:50 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:38:50 localhost podman[86905]: 2025-11-28 08:38:50.048152918 +0000 UTC m=+0.282052041 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, distribution-scope=public, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible) Nov 28 03:38:50 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:38:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. 
Nov 28 03:38:50 localhost podman[86975]: 2025-11-28 08:38:50.838083021 +0000 UTC m=+0.076557569 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.expose-services=, vcs-type=git, build-date=2025-11-19T00:36:58Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, vendor=Red Hat, Inc.) Nov 28 03:38:50 localhost systemd[1]: tmp-crun.X9nNub.mount: Deactivated successfully. Nov 28 03:38:51 localhost podman[86975]: 2025-11-28 08:38:51.208390251 +0000 UTC m=+0.446864779 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:38:51 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:38:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:38:52 localhost systemd[1]: tmp-crun.J4XyRa.mount: Deactivated successfully. 
Nov 28 03:38:52 localhost podman[86998]: 2025-11-28 08:38:52.849388866 +0000 UTC m=+0.091149362 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:38:52 localhost podman[86998]: 2025-11-28 08:38:52.880339068 +0000 UTC m=+0.122099594 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, batch=17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, release=1761123044, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com) Nov 28 03:38:52 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:38:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:38:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:38:57 localhost systemd[1]: tmp-crun.u263jz.mount: Deactivated successfully. 
Nov 28 03:38:57 localhost podman[87025]: 2025-11-28 08:38:57.85188362 +0000 UTC m=+0.085035003 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, container_name=ovn_controller, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Nov 28 03:38:57 localhost podman[87024]: 2025-11-28 08:38:57.904318767 +0000 UTC m=+0.140703110 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, version=17.1.12, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 28 03:38:57 localhost podman[87025]: 2025-11-28 08:38:57.92662619 +0000 UTC m=+0.159777583 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, tcib_managed=true, distribution-scope=public, container_name=ovn_controller, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-18T23:34:05Z) Nov 28 03:38:57 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. 
Nov 28 03:38:57 localhost podman[87024]: 2025-11-28 08:38:57.979486772 +0000 UTC m=+0.215871155 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=) Nov 28 03:38:57 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:38:58 localhost systemd[1]: tmp-crun.uocFgP.mount: Deactivated successfully. Nov 28 03:39:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:39:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. 
Nov 28 03:39:14 localhost podman[87116]: 2025-11-28 08:39:14.857865596 +0000 UTC m=+0.087603301 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, vcs-type=git, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Nov 28 03:39:14 localhost podman[87116]: 2025-11-28 08:39:14.898783507 +0000 UTC m=+0.128521272 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=) Nov 28 03:39:14 localhost systemd[1]: tmp-crun.9gG2sI.mount: Deactivated successfully. Nov 28 03:39:14 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 03:39:14 localhost podman[87117]: 2025-11-28 08:39:14.921139932 +0000 UTC m=+0.148551445 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, distribution-scope=public) Nov 28 03:39:14 localhost podman[87117]: 2025-11-28 08:39:14.935415455 +0000 UTC m=+0.162826988 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, distribution-scope=public, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, container_name=collectd, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, 
io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd) Nov 28 03:39:14 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:39:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:39:17 localhost systemd[1]: tmp-crun.QcOmEs.mount: Deactivated successfully. 
Nov 28 03:39:17 localhost podman[87159]: 2025-11-28 08:39:17.856360401 +0000 UTC m=+0.094486005 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, container_name=metrics_qdr, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1) Nov 28 03:39:18 localhost podman[87159]: 2025-11-28 08:39:18.043592146 +0000 UTC m=+0.281717820 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, container_name=metrics_qdr, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z) Nov 28 03:39:18 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:39:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:39:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:39:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:39:20 localhost systemd[1]: tmp-crun.qZnROW.mount: Deactivated successfully. 
Nov 28 03:39:20 localhost podman[87186]: 2025-11-28 08:39:20.865099474 +0000 UTC m=+0.101104331 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:39:20 localhost podman[87187]: 2025-11-28 08:39:20.958807105 +0000 UTC m=+0.189180117 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, release=1761123044, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, container_name=logrotate_crond, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vcs-type=git, version=17.1.12, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron) Nov 28 03:39:20 localhost podman[87187]: 2025-11-28 08:39:20.967970219 +0000 UTC m=+0.198343261 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, vcs-type=git, version=17.1.12, 
name=rhosp17/openstack-cron, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z) Nov 28 03:39:20 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:39:21 localhost podman[87186]: 2025-11-28 08:39:21.024398822 +0000 UTC m=+0.260403669 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, vcs-type=git, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container) Nov 28 03:39:21 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:39:21 localhost podman[87188]: 2025-11-28 08:39:21.113465528 +0000 UTC m=+0.342456976 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, version=17.1.12, batch=17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container) Nov 28 03:39:21 localhost podman[87188]: 2025-11-28 08:39:21.144185212 +0000 UTC m=+0.373176620 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, architecture=x86_64, container_name=ceilometer_agent_ipmi, 
release=1761123044, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.component=openstack-ceilometer-ipmi-container) Nov 28 03:39:21 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:39:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:39:21 localhost podman[87259]: 2025-11-28 08:39:21.860120998 +0000 UTC m=+0.091496763 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1) Nov 28 03:39:22 localhost podman[87259]: 2025-11-28 08:39:22.257485208 +0000 UTC m=+0.488860943 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:39:22 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:39:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:39:23 localhost systemd[1]: tmp-crun.VGAhPV.mount: Deactivated successfully. 
Nov 28 03:39:23 localhost podman[87283]: 2025-11-28 08:39:23.853483065 +0000 UTC m=+0.090846162 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, container_name=nova_compute, io.openshift.expose-services=, io.buildah.version=1.41.4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64) Nov 28 03:39:23 localhost podman[87283]: 2025-11-28 08:39:23.88291539 +0000 UTC m=+0.120278487 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, vcs-type=git, description=Red Hat OpenStack Platform 
17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_id=tripleo_step5, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, tcib_managed=true, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:39:23 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:39:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:39:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:39:28 localhost systemd[1]: tmp-crun.sPJSFc.mount: Deactivated successfully. 
Nov 28 03:39:28 localhost podman[87310]: 2025-11-28 08:39:28.85921652 +0000 UTC m=+0.094745024 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:39:28 localhost podman[87310]: 2025-11-28 08:39:28.887345553 +0000 UTC m=+0.122874097 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, version=17.1.12, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, 
io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container) Nov 28 03:39:28 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:39:28 localhost systemd[1]: tmp-crun.sKhyOl.mount: Deactivated successfully. Nov 28 03:39:28 localhost podman[87309]: 2025-11-28 08:39:28.95578773 +0000 UTC m=+0.194436011 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.12, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, architecture=x86_64, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com) Nov 28 03:39:28 localhost podman[87309]: 2025-11-28 08:39:28.996936667 +0000 UTC m=+0.235584948 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, 
managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12) Nov 28 03:39:29 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:39:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:39:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:39:45 localhost systemd[1]: tmp-crun.iySGs3.mount: Deactivated successfully. 
Nov 28 03:39:45 localhost podman[87355]: 2025-11-28 08:39:45.847342215 +0000 UTC m=+0.084088102 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, distribution-scope=public, io.k8s.description=Red 
Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20251118.1, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4) Nov 28 03:39:45 localhost podman[87355]: 2025-11-28 08:39:45.856490409 +0000 UTC m=+0.093236326 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, distribution-scope=public, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step3, architecture=x86_64) Nov 28 03:39:45 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:39:45 localhost systemd[1]: tmp-crun.xXlFKd.mount: Deactivated successfully. 
Nov 28 03:39:45 localhost podman[87356]: 2025-11-28 08:39:45.961010636 +0000 UTC m=+0.194495762 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, container_name=collectd, batch=17.1_20251118.1) Nov 28 03:39:45 localhost podman[87356]: 2025-11-28 08:39:45.971910564 +0000 UTC m=+0.205395680 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 
'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., container_name=collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Nov 28 03:39:45 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:39:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:39:48 localhost podman[87523]: 2025-11-28 08:39:48.381180449 +0000 UTC m=+0.076304041 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, container_name=metrics_qdr, vcs-type=git, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step1, tcib_managed=true, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 28 03:39:48 localhost podman[87523]: 2025-11-28 08:39:48.568490577 +0000 UTC m=+0.263614219 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, container_name=metrics_qdr, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true) Nov 28 03:39:48 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:39:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:39:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:39:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:39:51 localhost systemd[1]: tmp-crun.wpbXhR.mount: Deactivated successfully. Nov 28 03:39:51 localhost systemd[1]: tmp-crun.c4oX9U.mount: Deactivated successfully. 
Nov 28 03:39:51 localhost podman[87553]: 2025-11-28 08:39:51.918347825 +0000 UTC m=+0.154360425 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=ceilometer_agent_compute, vcs-type=git, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, version=17.1.12) Nov 28 03:39:51 localhost podman[87554]: 2025-11-28 08:39:51.873526792 +0000 UTC m=+0.107476108 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, container_name=logrotate_crond, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:39:51 localhost podman[87554]: 2025-11-28 08:39:51.959538904 +0000 UTC m=+0.193488230 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, release=1761123044, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:39:51 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:39:51 localhost podman[87553]: 2025-11-28 08:39:51.974456287 +0000 UTC m=+0.210468917 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 28 03:39:52 localhost podman[87555]: 2025-11-28 08:39:52.019169816 +0000 UTC m=+0.246012171 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, tcib_managed=true, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 28 03:39:52 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 03:39:52 localhost podman[87555]: 2025-11-28 08:39:52.082093511 +0000 UTC m=+0.308935786 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, batch=17.1_20251118.1, release=1761123044) Nov 28 03:39:52 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:39:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:39:52 localhost podman[87628]: 2025-11-28 08:39:52.85131612 +0000 UTC m=+0.088788338 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:39:53 localhost podman[87628]: 2025-11-28 08:39:53.293485572 +0000 UTC m=+0.530957770 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., release=1761123044, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target) Nov 28 03:39:53 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:39:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:39:54 localhost systemd[1]: tmp-crun.ksQEil.mount: Deactivated successfully. Nov 28 03:39:54 localhost podman[87653]: 2025-11-28 08:39:54.870608054 +0000 UTC m=+0.095785196 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, container_name=nova_compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1) Nov 28 03:39:54 localhost podman[87653]: 2025-11-28 08:39:54.907344765 +0000 UTC m=+0.132521887 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:39:54 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:39:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:39:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:39:59 localhost systemd[1]: tmp-crun.eUI2SD.mount: Deactivated successfully. 
Nov 28 03:39:59 localhost podman[87679]: 2025-11-28 08:39:59.867156364 +0000 UTC m=+0.097314833 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, release=1761123044, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:39:59 localhost podman[87679]: 2025-11-28 08:39:59.917547189 +0000 UTC m=+0.147705668 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Nov 28 03:39:59 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:40:00 localhost podman[87680]: 2025-11-28 08:39:59.999544996 +0000 UTC m=+0.232130851 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-18T23:34:05Z) Nov 28 03:40:00 localhost podman[87680]: 2025-11-28 08:40:00.031499798 +0000 UTC m=+0.264085673 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ovn-controller, distribution-scope=public, build-date=2025-11-18T23:34:05Z, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, batch=17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:40:00 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:40:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:40:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:40:16 localhost systemd[1]: tmp-crun.7B4iGC.mount: Deactivated successfully. 
Nov 28 03:40:16 localhost podman[87773]: 2025-11-28 08:40:16.847376232 +0000 UTC m=+0.088367715 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, vendor=Red Hat, Inc., container_name=iscsid, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Nov 28 03:40:16 localhost podman[87774]: 2025-11-28 08:40:16.904163845 +0000 UTC m=+0.139820893 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, batch=17.1_20251118.1, container_name=collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, version=17.1.12, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, release=1761123044) Nov 28 03:40:16 localhost podman[87773]: 2025-11-28 08:40:16.933359573 +0000 UTC m=+0.174351076 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1761123044, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, vcs-type=git, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid) Nov 
28 03:40:16 localhost podman[87774]: 2025-11-28 08:40:16.942465815 +0000 UTC m=+0.178122833 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, container_name=collectd, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, tcib_managed=true, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd) Nov 28 03:40:16 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:40:16 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:40:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:40:18 localhost systemd[1]: tmp-crun.maPk1q.mount: Deactivated successfully. 
Nov 28 03:40:18 localhost podman[87814]: 2025-11-28 08:40:18.849838273 +0000 UTC m=+0.088969674 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, batch=17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.openshift.expose-services=, io.buildah.version=1.41.4, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z) Nov 28 03:40:19 localhost podman[87814]: 2025-11-28 08:40:19.082275602 +0000 UTC m=+0.321407063 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, url=https://www.redhat.com, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, io.buildah.version=1.41.4, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:40:19 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:40:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:40:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:40:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:40:22 localhost systemd[1]: tmp-crun.aQdWDT.mount: Deactivated successfully. 
Nov 28 03:40:22 localhost podman[87843]: 2025-11-28 08:40:22.846791478 +0000 UTC m=+0.084317660 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, tcib_managed=true, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z) Nov 28 03:40:22 localhost podman[87845]: 2025-11-28 08:40:22.889580796 +0000 UTC m=+0.119316757 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z) Nov 28 03:40:22 localhost podman[87844]: 2025-11-28 08:40:22.943294154 +0000 UTC m=+0.179816085 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, name=rhosp17/openstack-cron, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z) Nov 28 03:40:22 localhost podman[87844]: 
2025-11-28 08:40:22.948461675 +0000 UTC m=+0.184983606 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64) Nov 28 03:40:22 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:40:22 localhost podman[87845]: 2025-11-28 08:40:22.969582161 +0000 UTC m=+0.199318182 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, build-date=2025-11-19T00:12:45Z, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 
'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 28 03:40:22 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 03:40:23 localhost podman[87843]: 2025-11-28 08:40:23.021606636 +0000 UTC m=+0.259132828 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, tcib_managed=true, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, 
container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vendor=Red Hat, Inc.) Nov 28 03:40:23 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:40:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:40:23 localhost systemd[1]: tmp-crun.IvXF6I.mount: Deactivated successfully. 
Nov 28 03:40:23 localhost podman[87913]: 2025-11-28 08:40:23.848936701 +0000 UTC m=+0.080484920 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1) Nov 28 03:40:24 localhost podman[87913]: 2025-11-28 08:40:24.219417808 +0000 UTC m=+0.450966057 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, release=1761123044, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-type=git, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=) Nov 28 03:40:24 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:40:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:40:25 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 28 03:40:25 localhost recover_tripleo_nova_virtqemud[87940]: 61397 Nov 28 03:40:25 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 28 03:40:25 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Nov 28 03:40:25 localhost podman[87934]: 2025-11-28 08:40:25.837576903 +0000 UTC m=+0.078802218 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:40:25 localhost podman[87934]: 2025-11-28 08:40:25.893872441 +0000 UTC m=+0.135097706 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, tcib_managed=true, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, 
Inc., config_id=tripleo_step5, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044) Nov 28 03:40:25 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:40:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:40:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:40:30 localhost systemd[1]: tmp-crun.5vgE4e.mount: Deactivated successfully. 
Nov 28 03:40:30 localhost podman[87963]: 2025-11-28 08:40:30.858722678 +0000 UTC m=+0.091298957 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, 
distribution-scope=public, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:40:30 localhost podman[87963]: 2025-11-28 08:40:30.904633794 +0000 UTC m=+0.137210073 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 28 03:40:30 localhost podman[87962]: 2025-11-28 08:40:30.903819639 +0000 UTC m=+0.138556555 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 28 03:40:30 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. 
Nov 28 03:40:30 localhost podman[87962]: 2025-11-28 08:40:30.983846474 +0000 UTC m=+0.218583330 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1761123044, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1) Nov 28 03:40:30 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:40:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:40:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. 
Nov 28 03:40:47 localhost podman[88012]: 2025-11-28 08:40:47.853946851 +0000 UTC m=+0.089708647 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container) Nov 28 03:40:47 localhost podman[88012]: 2025-11-28 08:40:47.865522221 +0000 UTC m=+0.101284027 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vendor=Red Hat, Inc., container_name=iscsid, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:40:47 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 03:40:47 localhost podman[88013]: 2025-11-28 08:40:47.953279066 +0000 UTC m=+0.186588106 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-collectd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:40:47 localhost podman[88013]: 2025-11-28 08:40:47.962077429 +0000 UTC m=+0.195386429 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_id=tripleo_step3, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, architecture=x86_64, version=17.1.12, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.component=openstack-collectd-container, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:40:47 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:40:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:40:49 localhost systemd[1]: tmp-crun.hRtnyV.mount: Deactivated successfully. 
Nov 28 03:40:49 localhost podman[88112]: 2025-11-28 08:40:49.848854568 +0000 UTC m=+0.086259331 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, distribution-scope=public, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, managed_by=tripleo_ansible, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=) Nov 28 03:40:50 localhost podman[88112]: 2025-11-28 08:40:50.038494386 +0000 UTC m=+0.275899199 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=metrics_qdr) Nov 28 03:40:50 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:40:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:40:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:40:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:40:53 localhost systemd[1]: tmp-crun.NQZLjS.mount: Deactivated successfully. 
Nov 28 03:40:53 localhost podman[88157]: 2025-11-28 08:40:53.855570546 +0000 UTC m=+0.092137843 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, version=17.1.12, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute) Nov 28 03:40:53 localhost podman[88157]: 2025-11-28 08:40:53.889591932 +0000 UTC m=+0.126159179 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 
'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044) Nov 28 03:40:53 localhost podman[88159]: 2025-11-28 08:40:53.902302376 +0000 UTC m=+0.136451248 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, 
distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 28 03:40:53 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:40:53 localhost podman[88158]: 2025-11-28 08:40:53.957525242 +0000 UTC m=+0.192533081 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-11-18T22:49:32Z, vcs-type=git, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container) Nov 28 03:40:53 localhost podman[88158]: 2025-11-28 08:40:53.965889251 +0000 UTC m=+0.200897140 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, url=https://www.redhat.com) Nov 28 03:40:53 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:40:54 localhost podman[88159]: 2025-11-28 08:40:54.010842748 +0000 UTC m=+0.244991670 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, version=17.1.12, release=1761123044, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com) Nov 28 03:40:54 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:40:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:40:54 localhost systemd[1]: tmp-crun.LGOMcN.mount: Deactivated successfully. Nov 28 03:40:54 localhost podman[88228]: 2025-11-28 08:40:54.857933827 +0000 UTC m=+0.088608933 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, release=1761123044, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 03:40:55 localhost podman[88228]: 2025-11-28 08:40:55.222423407 +0000 UTC m=+0.453098503 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vendor=Red Hat, Inc., tcib_managed=true, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:40:55 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:40:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:40:56 localhost systemd[1]: tmp-crun.2lXeVo.mount: Deactivated successfully. Nov 28 03:40:56 localhost podman[88252]: 2025-11-28 08:40:56.853548605 +0000 UTC m=+0.088322005 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, build-date=2025-11-19T00:36:58Z, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
release=1761123044, io.buildah.version=1.41.4) Nov 28 03:40:56 localhost podman[88252]: 2025-11-28 08:40:56.884460995 +0000 UTC m=+0.119234385 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, tcib_managed=true, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container) Nov 28 03:40:56 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:41:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:41:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:41:01 localhost podman[88278]: 2025-11-28 08:41:01.831102525 +0000 UTC m=+0.068783478 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, description=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, tcib_managed=true, container_name=ovn_metadata_agent, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1) Nov 28 03:41:01 localhost systemd[1]: tmp-crun.NLC4mH.mount: Deactivated successfully. 
Nov 28 03:41:01 localhost podman[88278]: 2025-11-28 08:41:01.892609375 +0000 UTC m=+0.130290348 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, 
Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=) Nov 28 03:41:01 localhost podman[88279]: 2025-11-28 08:41:01.891924803 +0000 UTC m=+0.125723895 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, managed_by=tripleo_ansible, io.buildah.version=1.41.4, container_name=ovn_controller, release=1761123044, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 28 03:41:01 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. 
Nov 28 03:41:01 localhost podman[88279]: 2025-11-28 08:41:01.976230542 +0000 UTC m=+0.210029564 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, distribution-scope=public) Nov 28 03:41:01 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:41:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:41:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:41:18 localhost systemd[1]: tmp-crun.kwt0KY.mount: Deactivated successfully. Nov 28 03:41:18 localhost podman[88350]: 2025-11-28 08:41:18.85793997 +0000 UTC m=+0.092360339 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.buildah.version=1.41.4, release=1761123044, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, distribution-scope=public, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:41:18 localhost podman[88350]: 2025-11-28 08:41:18.869350105 +0000 UTC m=+0.103770504 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, 
konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, 
release=1761123044, vcs-type=git, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Nov 28 03:41:18 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:41:18 localhost podman[88349]: 2025-11-28 08:41:18.955379827 +0000 UTC m=+0.191263442 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, container_name=iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:41:18 localhost podman[88349]: 2025-11-28 08:41:18.964736347 +0000 UTC m=+0.200619992 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-iscsid, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.) Nov 28 03:41:18 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:41:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:41:20 localhost podman[88387]: 2025-11-28 08:41:20.844233759 +0000 UTC m=+0.081960577 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, release=1761123044, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, distribution-scope=public, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 28 03:41:21 localhost podman[88387]: 2025-11-28 08:41:21.034339543 +0000 UTC m=+0.272066371 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, release=1761123044, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 28 03:41:21 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:41:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:41:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:41:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:41:24 localhost systemd[1]: tmp-crun.Z6Piwy.mount: Deactivated successfully. 
Nov 28 03:41:24 localhost podman[88418]: 2025-11-28 08:41:24.850801511 +0000 UTC m=+0.090245173 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, 
batch=17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, tcib_managed=true, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Nov 28 03:41:24 localhost podman[88418]: 2025-11-28 08:41:24.890567136 +0000 UTC m=+0.130010788 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, batch=17.1_20251118.1, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:41:24 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:41:24 localhost podman[88417]: 2025-11-28 08:41:24.943411267 +0000 UTC m=+0.183158700 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, distribution-scope=public, url=https://www.redhat.com, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc.) Nov 28 03:41:24 localhost podman[88419]: 2025-11-28 08:41:24.894105646 +0000 UTC m=+0.127992706 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, config_id=tripleo_step4, managed_by=tripleo_ansible) Nov 28 03:41:24 localhost podman[88417]: 2025-11-28 08:41:24.997618271 +0000 UTC m=+0.237365693 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, batch=17.1_20251118.1, 
config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 28 03:41:25 localhost podman[88419]: 2025-11-28 08:41:25.02945063 +0000 UTC m=+0.263337740 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, architecture=x86_64, version=17.1.12) Nov 28 03:41:25 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:41:25 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:41:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. 
Nov 28 03:41:25 localhost podman[88488]: 2025-11-28 08:41:25.843691418 +0000 UTC m=+0.079892492 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, distribution-scope=public, io.buildah.version=1.41.4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, release=1761123044, architecture=x86_64, url=https://www.redhat.com, 
vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=) Nov 28 03:41:26 localhost podman[88488]: 2025-11-28 08:41:26.214382661 +0000 UTC m=+0.450583775 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, tcib_managed=true, distribution-scope=public, build-date=2025-11-19T00:36:58Z, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container) Nov 28 03:41:26 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:41:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:41:27 localhost systemd[1]: tmp-crun.fXYHIa.mount: Deactivated successfully. 
Nov 28 03:41:27 localhost podman[88511]: 2025-11-28 08:41:27.844274722 +0000 UTC m=+0.082666379 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, container_name=nova_compute, io.buildah.version=1.41.4, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, io.openshift.expose-services=) Nov 28 03:41:27 localhost podman[88511]: 2025-11-28 08:41:27.876365058 +0000 UTC m=+0.114756715 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, vcs-type=git, 
tcib_managed=true, io.openshift.expose-services=, container_name=nova_compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:41:27 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:41:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:41:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:41:32 localhost podman[88538]: 2025-11-28 08:41:32.830840472 +0000 UTC m=+0.073205435 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, release=1761123044, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.buildah.version=1.41.4) Nov 28 
03:41:32 localhost podman[88538]: 2025-11-28 08:41:32.877454769 +0000 UTC m=+0.119819682 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, batch=17.1_20251118.1, release=1761123044, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Nov 28 03:41:32 localhost systemd[1]: tmp-crun.uBtEVH.mount: Deactivated successfully. Nov 28 03:41:32 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. 
Nov 28 03:41:32 localhost podman[88539]: 2025-11-28 08:41:32.901595479 +0000 UTC m=+0.139682369 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, vendor=Red Hat, Inc.) Nov 28 03:41:32 localhost podman[88539]: 2025-11-28 08:41:32.95348362 +0000 UTC m=+0.191570510 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, config_id=tripleo_step4, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1761123044, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:41:32 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:41:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:41:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:41:49 localhost systemd[1]: tmp-crun.qZuRae.mount: Deactivated successfully. Nov 28 03:41:49 localhost systemd[1]: tmp-crun.aNkFPe.mount: Deactivated successfully. 
Nov 28 03:41:49 localhost podman[88585]: 2025-11-28 08:41:49.892942138 +0000 UTC m=+0.129248455 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, container_name=iscsid, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Nov 28 03:41:49 localhost podman[88586]: 2025-11-28 08:41:49.866543799 +0000 UTC m=+0.098679897 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, url=https://www.redhat.com, container_name=collectd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd) Nov 28 03:41:49 localhost podman[88585]: 2025-11-28 08:41:49.926200281 +0000 UTC m=+0.162506618 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack 
Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git) Nov 28 03:41:49 localhost systemd[1]: 
08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:41:49 localhost podman[88586]: 2025-11-28 08:41:49.951451606 +0000 UTC m=+0.183587674 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:41:49 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:41:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:41:51 localhost podman[88725]: 2025-11-28 08:41:51.08531269 +0000 UTC m=+0.101883745 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, description=Red Hat Ceph Storage 7, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, io.buildah.version=1.33.12, name=rhceph, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , version=7, RELEASE=main, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_BRANCH=main, ceph=True, CEPH_POINT_RELEASE=, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.openshift.tags=rhceph ceph) Nov 28 03:41:51 localhost systemd[1]: tmp-crun.OpDrWk.mount: Deactivated successfully. 
Nov 28 03:41:51 localhost podman[88725]: 2025-11-28 08:41:51.190350932 +0000 UTC m=+0.206921967 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, name=rhceph, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, release=553, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, version=7, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, GIT_CLEAN=True, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 28 03:41:51 localhost podman[88744]: 2025-11-28 08:41:51.193374826 +0000 UTC m=+0.100019286 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, container_name=metrics_qdr, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible) Nov 28 03:41:51 localhost podman[88744]: 2025-11-28 
08:41:51.356455091 +0000 UTC m=+0.263099491 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true) Nov 28 03:41:51 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:41:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:41:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:41:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:41:55 localhost systemd[1]: tmp-crun.HxaACk.mount: Deactivated successfully. 
Nov 28 03:41:55 localhost podman[88901]: 2025-11-28 08:41:55.85392472 +0000 UTC m=+0.087334473 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, release=1761123044, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 28 03:41:55 localhost podman[88899]: 2025-11-28 08:41:55.900564429 +0000 UTC m=+0.136996796 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, config_id=tripleo_step4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=ceilometer_agent_compute) Nov 28 03:41:55 localhost podman[88901]: 2025-11-28 08:41:55.90834969 +0000 UTC m=+0.141759423 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) 
Nov 28 03:41:55 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:41:55 localhost podman[88899]: 2025-11-28 08:41:55.975384742 +0000 UTC m=+0.211817049 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, batch=17.1_20251118.1, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, container_name=ceilometer_agent_compute, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4) Nov 28 03:41:55 localhost podman[88900]: 2025-11-28 08:41:55.973127223 +0000 UTC m=+0.209344613 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, io.buildah.version=1.41.4, url=https://www.redhat.com, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Nov 28 03:41:55 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 03:41:56 localhost podman[88900]: 2025-11-28 08:41:56.060528707 +0000 UTC m=+0.296746137 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, 
vcs-type=git, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, container_name=logrotate_crond) Nov 28 03:41:56 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:41:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:41:56 localhost podman[88973]: 2025-11-28 08:41:56.84146896 +0000 UTC m=+0.081872404 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64) Nov 28 03:41:57 localhost podman[88973]: 2025-11-28 08:41:57.211378818 +0000 UTC m=+0.451782312 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat 
OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, distribution-scope=public, build-date=2025-11-19T00:36:58Z, release=1761123044, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.12, config_id=tripleo_step4, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:41:57 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:41:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:41:58 localhost podman[88997]: 2025-11-28 08:41:58.839292506 +0000 UTC m=+0.078691155 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, maintainer=OpenStack TripleO Team) Nov 28 03:41:58 localhost podman[88997]: 2025-11-28 08:41:58.899110914 +0000 UTC m=+0.138509533 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, 
com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, release=1761123044, name=rhosp17/openstack-nova-compute, distribution-scope=public, build-date=2025-11-19T00:36:58Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true) Nov 28 03:41:58 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:42:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:42:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:42:03 localhost systemd[1]: tmp-crun.26yWzQ.mount: Deactivated successfully. 
Nov 28 03:42:03 localhost podman[89024]: 2025-11-28 08:42:03.842995567 +0000 UTC m=+0.080504902 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, release=1761123044, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=) Nov 28 03:42:03 localhost podman[89024]: 2025-11-28 08:42:03.889057067 +0000 UTC m=+0.126566442 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, 
io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-19T00:14:25Z, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:42:03 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:42:03 localhost podman[89025]: 2025-11-28 08:42:03.904031042 +0000 UTC m=+0.138851083 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, batch=17.1_20251118.1, version=17.1.12, release=1761123044, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:42:03 localhost podman[89025]: 2025-11-28 08:42:03.983461349 +0000 UTC m=+0.218281370 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, distribution-scope=public, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.expose-services=) Nov 28 03:42:03 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:42:05 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 28 03:42:05 localhost recover_tripleo_nova_virtqemud[89075]: 61397 Nov 28 03:42:05 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 28 03:42:05 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 28 03:42:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:42:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:42:20 localhost systemd[1]: tmp-crun.zY3GfN.mount: Deactivated successfully. 
Nov 28 03:42:20 localhost podman[89077]: 2025-11-28 08:42:20.852069943 +0000 UTC m=+0.086085734 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.4, container_name=collectd, com.redhat.component=openstack-collectd-container, release=1761123044, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step3, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vendor=Red Hat, Inc., architecture=x86_64) Nov 28 03:42:20 localhost podman[89077]: 2025-11-28 08:42:20.863364203 +0000 UTC m=+0.097379964 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, batch=17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64) Nov 28 03:42:20 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. 
Nov 28 03:42:20 localhost podman[89076]: 2025-11-28 08:42:20.95886524 +0000 UTC m=+0.191562301 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, release=1761123044, url=https://www.redhat.com, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, build-date=2025-11-18T23:44:13Z, vcs-type=git, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:42:20 localhost podman[89076]: 2025-11-28 08:42:20.998511811 +0000 UTC m=+0.231208852 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, release=1761123044, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, vcs-type=git, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:42:21 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:42:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:42:21 localhost systemd[1]: tmp-crun.xATJQM.mount: Deactivated successfully. 
Nov 28 03:42:21 localhost podman[89117]: 2025-11-28 08:42:21.858079787 +0000 UTC m=+0.091795102 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, container_name=metrics_qdr, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 28 03:42:22 localhost podman[89117]: 2025-11-28 08:42:22.047243492 +0000 UTC m=+0.280958837 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:42:22 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:42:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:42:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:42:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:42:26 localhost systemd[1]: tmp-crun.mdgcNu.mount: Deactivated successfully. 
Nov 28 03:42:26 localhost podman[89148]: 2025-11-28 08:42:26.860231019 +0000 UTC m=+0.100476761 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.12, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z) Nov 28 03:42:26 localhost podman[89148]: 2025-11-28 08:42:26.890302744 +0000 UTC m=+0.130548456 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, distribution-scope=public, architecture=x86_64) Nov 28 03:42:26 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 03:42:26 localhost podman[89150]: 2025-11-28 08:42:26.910976345 +0000 UTC m=+0.141869866 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-type=git, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true) Nov 28 03:42:26 localhost podman[89149]: 2025-11-28 08:42:26.959981958 +0000 UTC m=+0.194598295 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-cron, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:42:26 localhost podman[89150]: 2025-11-28 08:42:26.969685729 +0000 UTC m=+0.200579230 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, vcs-type=git, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, config_id=tripleo_step4, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 28 03:42:26 localhost systemd[1]: 
f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:42:26 localhost podman[89149]: 2025-11-28 08:42:26.99322111 +0000 UTC m=+0.227837417 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, container_name=logrotate_crond, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, 
konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, config_id=tripleo_step4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.) Nov 28 03:42:27 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:42:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:42:27 localhost podman[89220]: 2025-11-28 08:42:27.845284753 +0000 UTC m=+0.082191094 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, container_name=nova_migration_target, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, distribution-scope=public, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, tcib_managed=true, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com) Nov 28 03:42:28 localhost podman[89220]: 2025-11-28 08:42:28.208382979 +0000 UTC m=+0.445289330 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, release=1761123044, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat 
OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:42:28 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:42:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:42:29 localhost podman[89243]: 2025-11-28 08:42:29.873336469 +0000 UTC m=+0.110596526 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=) Nov 28 03:42:29 localhost podman[89243]: 2025-11-28 08:42:29.898370247 +0000 UTC m=+0.135630264 container exec_died 
c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, tcib_managed=true, container_name=nova_compute) Nov 28 03:42:29 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:42:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:42:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:42:34 localhost podman[89292]: 2025-11-28 08:42:34.841886078 +0000 UTC m=+0.078485718 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 28 03:42:34 localhost podman[89292]: 2025-11-28 08:42:34.888393363 +0000 UTC m=+0.124992953 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, build-date=2025-11-19T00:14:25Z) Nov 28 03:42:34 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:42:34 localhost systemd[1]: tmp-crun.YejIWT.mount: Deactivated successfully. Nov 28 03:42:34 localhost podman[89293]: 2025-11-28 08:42:34.972170125 +0000 UTC m=+0.204657637 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-type=git, tcib_managed=true, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, distribution-scope=public, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, release=1761123044, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com) Nov 28 03:42:35 localhost podman[89293]: 2025-11-28 08:42:35.024406107 +0000 UTC m=+0.256893649 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, tcib_managed=true, release=1761123044, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z) Nov 28 03:42:35 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:42:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:42:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. 
Nov 28 03:42:51 localhost podman[89340]: 2025-11-28 08:42:51.855153697 +0000 UTC m=+0.083927846 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, version=17.1.12, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, container_name=iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 28 03:42:51 localhost podman[89340]: 2025-11-28 08:42:51.861355341 +0000 UTC m=+0.090129420 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vcs-type=git, version=17.1.12, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid) Nov 28 03:42:51 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 03:42:51 localhost podman[89341]: 2025-11-28 08:42:51.904983135 +0000 UTC m=+0.128463121 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, config_id=tripleo_step3, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Nov 28 03:42:51 localhost podman[89341]: 2025-11-28 08:42:51.944590065 +0000 UTC m=+0.168070021 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, container_name=collectd, batch=17.1_20251118.1, description=Red Hat 
OpenStack Platform 17.1 collectd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12) Nov 28 03:42:51 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:42:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:42:52 localhost podman[89379]: 2025-11-28 08:42:52.882757492 +0000 UTC m=+0.082031229 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd) Nov 28 03:42:53 localhost podman[89379]: 2025-11-28 08:42:53.078316205 +0000 UTC m=+0.277589942 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, release=1761123044, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, version=17.1.12) Nov 28 03:42:53 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:42:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:42:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:42:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. 
Nov 28 03:42:57 localhost podman[89486]: 2025-11-28 08:42:57.888317141 +0000 UTC m=+0.125761067 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, 
io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:42:57 localhost systemd[1]: tmp-crun.nv47X8.mount: Deactivated successfully. Nov 28 03:42:57 localhost podman[89486]: 2025-11-28 08:42:57.949818291 +0000 UTC m=+0.187262217 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, release=1761123044, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 28 03:42:57 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 03:42:57 localhost podman[89488]: 2025-11-28 08:42:57.993112185 +0000 UTC m=+0.225112222 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, release=1761123044, config_id=tripleo_step4, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true) Nov 28 03:42:58 localhost podman[89487]: 2025-11-28 08:42:57.954823107 +0000 UTC m=+0.191234140 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, release=1761123044, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:42:58 localhost podman[89488]: 2025-11-28 08:42:58.017758031 +0000 UTC m=+0.249758068 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20251118.1, version=17.1.12, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, tcib_managed=true, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.) Nov 28 03:42:58 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 03:42:58 localhost podman[89487]: 2025-11-28 08:42:58.035344398 +0000 UTC m=+0.271755461 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, 
io.buildah.version=1.41.4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git) Nov 28 03:42:58 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:42:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:42:58 localhost podman[89559]: 2025-11-28 08:42:58.843985991 +0000 UTC m=+0.078324883 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step4, release=1761123044, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:42:59 localhost podman[89559]: 2025-11-28 08:42:59.255966246 +0000 UTC m=+0.490305168 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, 
name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step4, architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:42:59 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:43:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:43:00 localhost podman[89583]: 2025-11-28 08:43:00.803706846 +0000 UTC m=+0.049143338 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, distribution-scope=public, config_id=tripleo_step5, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, version=17.1.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 28 03:43:00 localhost systemd[1]: tmp-crun.3KOeZk.mount: Deactivated successfully. 
Nov 28 03:43:00 localhost podman[89583]: 2025-11-28 08:43:00.824434329 +0000 UTC m=+0.069870831 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, batch=17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, container_name=nova_compute) Nov 28 03:43:00 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:43:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:43:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:43:05 localhost podman[89611]: 2025-11-28 08:43:05.84197138 +0000 UTC m=+0.082003228 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 28 03:43:05 localhost podman[89612]: 2025-11-28 08:43:05.901443707 +0000 UTC m=+0.139121001 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, 
url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, build-date=2025-11-18T23:34:05Z) Nov 28 03:43:05 localhost podman[89611]: 2025-11-28 08:43:05.917509196 +0000 UTC m=+0.157541054 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, release=1761123044, distribution-scope=public, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 28 03:43:05 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:43:05 localhost podman[89612]: 2025-11-28 08:43:05.944506505 +0000 UTC m=+0.182183749 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1761123044, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, container_name=ovn_controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 28 03:43:05 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:43:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:43:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. 
Nov 28 03:43:22 localhost podman[89657]: 2025-11-28 08:43:22.865325698 +0000 UTC m=+0.100224094 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, container_name=iscsid, release=1761123044, version=17.1.12, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:43:22 localhost podman[89657]: 2025-11-28 08:43:22.87280066 +0000 UTC m=+0.107699006 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, architecture=x86_64, config_id=tripleo_step3, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, container_name=iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:43:22 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:43:22 localhost systemd[1]: tmp-crun.6XC80B.mount: Deactivated successfully. 
Nov 28 03:43:22 localhost podman[89658]: 2025-11-28 08:43:22.994127718 +0000 UTC m=+0.226231807 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., container_name=collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044) Nov 28 03:43:23 localhost podman[89658]: 2025-11-28 08:43:23.009537426 +0000 UTC m=+0.241641485 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, release=1761123044, batch=17.1_20251118.1) Nov 28 03:43:23 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:43:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:43:23 localhost podman[89696]: 2025-11-28 08:43:23.849236025 +0000 UTC m=+0.088257381 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, 
Inc., tcib_managed=true, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:43:24 localhost podman[89696]: 2025-11-28 08:43:24.040414313 +0000 UTC m=+0.279435629 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd) Nov 28 03:43:24 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:43:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:43:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:43:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:43:28 localhost systemd[1]: tmp-crun.EMP0N2.mount: Deactivated successfully. 
Nov 28 03:43:28 localhost podman[89727]: 2025-11-28 08:43:28.831199164 +0000 UTC m=+0.070431609 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_id=tripleo_step4) Nov 28 03:43:28 localhost podman[89727]: 2025-11-28 08:43:28.841411601 +0000 UTC m=+0.080644106 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, release=1761123044, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:43:28 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:43:28 localhost podman[89728]: 2025-11-28 08:43:28.917828075 +0000 UTC m=+0.147919846 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 28 03:43:28 localhost podman[89728]: 2025-11-28 08:43:28.944554494 +0000 UTC m=+0.174646235 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, release=1761123044, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 28 03:43:28 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 03:43:29 localhost podman[89726]: 2025-11-28 08:43:28.897208904 +0000 UTC m=+0.136997046 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4) Nov 28 03:43:29 localhost podman[89726]: 2025-11-28 08:43:29.030824774 +0000 UTC m=+0.270612846 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.12, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step4, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, tcib_managed=true, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 28 03:43:29 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:43:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:43:29 localhost systemd[1]: tmp-crun.XxYrnc.mount: Deactivated successfully. Nov 28 03:43:29 localhost systemd[1]: tmp-crun.WjxMSG.mount: Deactivated successfully. 
Nov 28 03:43:29 localhost podman[89796]: 2025-11-28 08:43:29.857922931 +0000 UTC m=+0.091413761 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12) Nov 28 03:43:30 localhost podman[89796]: 2025-11-28 08:43:30.276968646 +0000 UTC m=+0.510459446 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, version=17.1.12, architecture=x86_64, tcib_managed=true) Nov 28 03:43:30 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:43:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:43:31 localhost systemd[1]: tmp-crun.SmxqFL.mount: Deactivated successfully. 
Nov 28 03:43:31 localhost podman[89820]: 2025-11-28 08:43:31.850871686 +0000 UTC m=+0.085900139 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, distribution-scope=public, config_id=tripleo_step5, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, container_name=nova_compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.12) Nov 28 03:43:31 localhost podman[89820]: 2025-11-28 08:43:31.908523066 +0000 UTC m=+0.143551529 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-nova-compute, config_id=tripleo_step5, vcs-type=git, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, container_name=nova_compute, distribution-scope=public) Nov 28 03:43:31 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:43:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:43:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:43:36 localhost systemd[1]: tmp-crun.92iz7M.mount: Deactivated successfully. 
Nov 28 03:43:36 localhost podman[89848]: 2025-11-28 08:43:36.849919185 +0000 UTC m=+0.086611091 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Nov 28 03:43:36 localhost podman[89848]: 2025-11-28 08:43:36.910342262 +0000 UTC m=+0.147034158 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, distribution-scope=public, url=https://www.redhat.com, release=1761123044, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, tcib_managed=true, vcs-type=git) Nov 28 03:43:36 localhost podman[89849]: 2025-11-28 08:43:36.910558458 +0000 UTC m=+0.143208648 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team) Nov 28 03:43:36 localhost podman[89849]: 2025-11-28 08:43:36.932769698 +0000 UTC m=+0.165419888 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, vcs-type=git, version=17.1.12, vendor=Red Hat, Inc., container_name=ovn_controller, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 28 03:43:36 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:43:36 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:43:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:43:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. 
Nov 28 03:43:53 localhost podman[89897]: 2025-11-28 08:43:53.850797417 +0000 UTC m=+0.087928703 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.12, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, tcib_managed=true, com.redhat.component=openstack-collectd-container, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, container_name=collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:43:53 localhost podman[89897]: 2025-11-28 08:43:53.891629594 +0000 UTC m=+0.128760850 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, batch=17.1_20251118.1, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-collectd-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team) Nov 28 03:43:53 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. 
Nov 28 03:43:53 localhost podman[89896]: 2025-11-28 08:43:53.941163403 +0000 UTC m=+0.181618512 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, 
maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, version=17.1.12) Nov 28 03:43:53 localhost podman[89896]: 2025-11-28 08:43:53.949132371 +0000 UTC m=+0.189587510 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, distribution-scope=public, version=17.1.12, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step3, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Nov 28 03:43:53 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:43:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:43:54 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... 
Nov 28 03:43:54 localhost podman[89948]: 2025-11-28 08:43:54.817608503 +0000 UTC m=+0.093341349 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack 
Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12) Nov 28 03:43:54 localhost recover_tripleo_nova_virtqemud[89983]: 61397 Nov 28 03:43:54 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 28 03:43:54 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 28 03:43:55 localhost podman[89948]: 2025-11-28 08:43:55.006146588 +0000 UTC m=+0.281879384 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, managed_by=tripleo_ansible, 
batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044) Nov 28 03:43:55 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:43:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:43:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. 
Nov 28 03:43:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:43:59 localhost podman[90045]: 2025-11-28 08:43:59.864994522 +0000 UTC m=+0.091928636 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, 
name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 28 03:43:59 localhost podman[90045]: 2025-11-28 08:43:59.887960995 +0000 UTC m=+0.114895119 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, 
batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, tcib_managed=true) Nov 28 03:43:59 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 03:43:59 localhost podman[90044]: 2025-11-28 08:43:59.902158286 +0000 UTC m=+0.132998632 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1761123044, version=17.1.12, config_id=tripleo_step4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, 
tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., io.buildah.version=1.41.4, container_name=logrotate_crond, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Nov 28 03:43:59 localhost systemd[1]: tmp-crun.DuLpQj.mount: Deactivated successfully. Nov 28 03:43:59 localhost podman[90043]: 2025-11-28 08:43:59.963271614 +0000 UTC m=+0.195115991 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, container_name=ceilometer_agent_compute, version=17.1.12) Nov 28 03:43:59 localhost podman[90044]: 2025-11-28 08:43:59.98534604 +0000 UTC m=+0.216186426 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, release=1761123044, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git) Nov 28 03:43:59 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated 
successfully. Nov 28 03:44:00 localhost podman[90043]: 2025-11-28 08:44:00.041356669 +0000 UTC m=+0.273201036 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4) Nov 28 03:44:00 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:44:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. 
Nov 28 03:44:00 localhost podman[90116]: 2025-11-28 08:44:00.844219454 +0000 UTC m=+0.075328251 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 28 03:44:01 localhost podman[90116]: 2025-11-28 08:44:01.22017797 +0000 UTC m=+0.451286767 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12) Nov 28 03:44:01 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:44:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:44:02 localhost systemd[1]: tmp-crun.2J5WrK.mount: Deactivated successfully. 
Nov 28 03:44:02 localhost podman[90137]: 2025-11-28 08:44:02.85391273 +0000 UTC m=+0.082705880 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, vcs-type=git) Nov 28 03:44:02 localhost podman[90137]: 2025-11-28 08:44:02.878434801 +0000 UTC m=+0.107227941 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': 
'/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=nova_compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=) Nov 28 03:44:02 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:44:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:44:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:44:07 localhost podman[90165]: 2025-11-28 08:44:07.839825358 +0000 UTC m=+0.078651623 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step4, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:44:07 localhost podman[90165]: 2025-11-28 08:44:07.891637528 +0000 UTC m=+0.130463773 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, container_name=ovn_controller, distribution-scope=public, name=rhosp17/openstack-ovn-controller) Nov 28 03:44:07 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:44:07 localhost podman[90164]: 2025-11-28 08:44:07.902195636 +0000 UTC m=+0.142861328 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, tcib_managed=true, version=17.1.12, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044) Nov 28 03:44:07 localhost podman[90164]: 2025-11-28 08:44:07.948106261 +0000 UTC m=+0.188771913 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_id=tripleo_step4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Nov 28 03:44:07 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:44:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:44:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. 
Nov 28 03:44:24 localhost podman[90211]: 2025-11-28 08:44:24.854901901 +0000 UTC m=+0.089393438 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z) Nov 28 03:44:24 localhost podman[90211]: 2025-11-28 08:44:24.865691806 +0000 UTC m=+0.100183313 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, container_name=collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, version=17.1.12) Nov 28 03:44:24 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. 
Nov 28 03:44:24 localhost podman[90210]: 2025-11-28 08:44:24.951135909 +0000 UTC m=+0.186040298 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, architecture=x86_64, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_id=tripleo_step3, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team) Nov 28 03:44:24 localhost podman[90210]: 2025-11-28 08:44:24.959379025 +0000 UTC m=+0.194283414 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, container_name=iscsid, vcs-type=git, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-18T23:44:13Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 28 03:44:24 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:44:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:44:25 localhost podman[90249]: 2025-11-28 08:44:25.833744451 +0000 UTC m=+0.075098384 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:44:26 localhost podman[90249]: 2025-11-28 08:44:26.004804284 +0000 UTC m=+0.246158197 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, config_id=tripleo_step1, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, release=1761123044, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.12, name=rhosp17/openstack-qdrouterd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=) Nov 28 03:44:26 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:44:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:44:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:44:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. 
Nov 28 03:44:30 localhost podman[90279]: 2025-11-28 08:44:30.835451061 +0000 UTC m=+0.072383879 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, version=17.1.12, com.redhat.component=openstack-cron-container, release=1761123044, name=rhosp17/openstack-cron, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=) Nov 28 03:44:30 localhost podman[90279]: 2025-11-28 08:44:30.845247495 +0000 UTC m=+0.082180263 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, distribution-scope=public, release=1761123044, managed_by=tripleo_ansible, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 
'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Nov 28 03:44:30 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:44:30 localhost podman[90278]: 2025-11-28 08:44:30.88564094 +0000 UTC m=+0.130078832 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, tcib_managed=true) Nov 28 03:44:30 localhost podman[90278]: 2025-11-28 08:44:30.908264642 +0000 UTC m=+0.152702504 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git) Nov 28 03:44:30 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 03:44:30 localhost podman[90285]: 2025-11-28 08:44:30.998916667 +0000 UTC m=+0.232954445 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 28 03:44:31 localhost podman[90285]: 2025-11-28 08:44:31.029358473 +0000 UTC m=+0.263396261 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, tcib_managed=true) Nov 28 03:44:31 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:44:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:44:31 localhost systemd[1]: tmp-crun.9yCsDL.mount: Deactivated successfully. 
Nov 28 03:44:31 localhost podman[90352]: 2025-11-28 08:44:31.84743399 +0000 UTC m=+0.085477225 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com) Nov 28 03:44:32 localhost podman[90352]: 2025-11-28 08:44:32.228406652 +0000 UTC m=+0.466449887 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:44:32 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:44:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:44:33 localhost systemd[1]: tmp-crun.XNcEjy.mount: Deactivated successfully. 
Nov 28 03:44:33 localhost podman[90375]: 2025-11-28 08:44:33.848480076 +0000 UTC m=+0.086272639 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, distribution-scope=public, config_id=tripleo_step5, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, tcib_managed=true, version=17.1.12) Nov 28 03:44:33 localhost podman[90375]: 2025-11-28 08:44:33.880420319 +0000 UTC m=+0.118212862 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': 
'/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, tcib_managed=true, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4) Nov 28 03:44:33 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:44:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:44:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:44:38 localhost podman[90402]: 2025-11-28 08:44:38.841396004 +0000 UTC m=+0.077865930 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, 
io.openshift.expose-services=, url=https://www.redhat.com) Nov 28 03:44:38 localhost systemd[1]: tmp-crun.freZ7Q.mount: Deactivated successfully. Nov 28 03:44:38 localhost podman[90403]: 2025-11-28 08:44:38.889535038 +0000 UTC m=+0.122844326 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, version=17.1.12, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller) Nov 28 03:44:38 localhost podman[90402]: 2025-11-28 08:44:38.902501171 +0000 UTC m=+0.138971007 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step4, tcib_managed=true, vcs-type=git, build-date=2025-11-19T00:14:25Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, release=1761123044, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Nov 28 03:44:38 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. 
Nov 28 03:44:38 localhost podman[90403]: 2025-11-28 08:44:38.953128574 +0000 UTC m=+0.186437942 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, name=rhosp17/openstack-ovn-controller, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., container_name=ovn_controller, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red 
Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 28 03:44:38 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:44:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:44:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:44:55 localhost podman[90451]: 2025-11-28 08:44:55.84311033 +0000 UTC m=+0.077216600 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, version=17.1.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step3) Nov 28 03:44:55 localhost podman[90452]: 2025-11-28 08:44:55.895256609 +0000 UTC m=+0.128072568 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, url=https://www.redhat.com, version=17.1.12, io.buildah.version=1.41.4, container_name=collectd, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, config_data={'cap_add': 
['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, architecture=x86_64, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack 
Platform 17.1 collectd) Nov 28 03:44:55 localhost podman[90451]: 2025-11-28 08:44:55.929255205 +0000 UTC m=+0.163361515 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, config_id=tripleo_step3, container_name=iscsid, version=17.1.12, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.) Nov 28 03:44:55 localhost podman[90452]: 2025-11-28 08:44:55.930448532 +0000 UTC m=+0.163264521 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, vendor=Red Hat, Inc., config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com) Nov 28 03:44:55 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:44:55 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:44:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:44:56 localhost podman[90506]: 2025-11-28 08:44:56.407369134 +0000 UTC m=+0.084443194 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, io.buildah.version=1.41.4, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, 
build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, distribution-scope=public) Nov 28 03:44:56 localhost podman[90506]: 2025-11-28 08:44:56.623482845 +0000 UTC m=+0.300556905 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, batch=17.1_20251118.1, container_name=metrics_qdr, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc.) Nov 28 03:44:56 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:45:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:45:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:45:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:45:01 localhost systemd[1]: tmp-crun.geOwC9.mount: Deactivated successfully. 
Nov 28 03:45:01 localhost podman[90597]: 2025-11-28 08:45:01.860885266 +0000 UTC m=+0.089261814 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, release=1761123044, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
cron, version=17.1.12, batch=17.1_20251118.1, container_name=logrotate_crond, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, url=https://www.redhat.com) Nov 28 03:45:01 localhost podman[90597]: 2025-11-28 08:45:01.870736281 +0000 UTC m=+0.099112799 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vcs-type=git, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12) Nov 28 03:45:01 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:45:01 localhost podman[90596]: 2025-11-28 08:45:01.958226069 +0000 UTC m=+0.187834184 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:45:02 localhost podman[90596]: 2025-11-28 08:45:02.012737912 +0000 UTC m=+0.242346057 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:45:02 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 03:45:02 localhost podman[90598]: 2025-11-28 08:45:02.028488261 +0000 UTC m=+0.253880056 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, 
tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64) Nov 28 03:45:02 localhost podman[90598]: 2025-11-28 08:45:02.057852653 +0000 UTC m=+0.283244428 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, vendor=Red Hat, Inc., 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 28 03:45:02 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:45:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. 
Nov 28 03:45:02 localhost podman[90667]: 2025-11-28 08:45:02.850223331 +0000 UTC m=+0.085454045 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, io.openshift.expose-services=, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, config_id=tripleo_step4, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 
17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-type=git, container_name=nova_migration_target) Nov 28 03:45:03 localhost podman[90667]: 2025-11-28 08:45:03.211081378 +0000 UTC m=+0.446312102 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-type=git, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, release=1761123044, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:45:03 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:45:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. 
Nov 28 03:45:04 localhost podman[90689]: 2025-11-28 08:45:04.847836562 +0000 UTC m=+0.084377172 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, vendor=Red Hat, Inc., container_name=nova_compute, url=https://www.redhat.com, release=1761123044, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 03:45:04 localhost podman[90689]: 2025-11-28 08:45:04.880324651 +0000 UTC m=+0.116865241 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, distribution-scope=public, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute) Nov 28 03:45:04 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:45:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:45:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:45:09 localhost systemd[1]: tmp-crun.Q0bjyz.mount: Deactivated successfully. 
Nov 28 03:45:09 localhost podman[90715]: 2025-11-28 08:45:09.866354204 +0000 UTC m=+0.096701445 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, url=https://www.redhat.com, config_id=tripleo_step4, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 28 03:45:09 localhost podman[90715]: 2025-11-28 08:45:09.909200204 +0000 UTC m=+0.139547475 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, release=1761123044, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-type=git, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, io.buildah.version=1.41.4) Nov 28 03:45:09 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:45:09 localhost podman[90716]: 2025-11-28 08:45:09.916256183 +0000 UTC m=+0.143377104 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.4) Nov 28 03:45:09 localhost podman[90716]: 2025-11-28 08:45:09.999641992 +0000 UTC m=+0.226762873 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, 
name=rhosp17/openstack-ovn-controller, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Nov 28 03:45:10 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:45:19 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 28 03:45:19 localhost recover_tripleo_nova_virtqemud[90766]: 61397 Nov 28 03:45:19 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 28 03:45:19 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 28 03:45:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:45:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:45:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:45:26 localhost systemd[1]: tmp-crun.cv1f5f.mount: Deactivated successfully. 
Nov 28 03:45:26 localhost podman[90768]: 2025-11-28 08:45:26.921187008 +0000 UTC m=+0.145674113 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, release=1761123044, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true) Nov 28 03:45:26 localhost podman[90768]: 2025-11-28 08:45:26.963940494 +0000 UTC m=+0.188427539 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container) Nov 28 03:45:26 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. 
Nov 28 03:45:26 localhost podman[90767]: 2025-11-28 08:45:26.978465688 +0000 UTC m=+0.197535493 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, release=1761123044, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, container_name=iscsid, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:45:26 localhost podman[90769]: 2025-11-28 08:45:26.882107327 +0000 UTC m=+0.103717002 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, url=https://www.redhat.com, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 28 03:45:26 localhost podman[90767]: 2025-11-28 08:45:26.990311928 +0000 UTC m=+0.209381763 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step3, container_name=iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, 
managed_by=tripleo_ansible, version=17.1.12, release=1761123044, architecture=x86_64, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, vcs-type=git) Nov 28 03:45:27 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 03:45:27 localhost podman[90769]: 2025-11-28 08:45:27.079319739 +0000 UTC m=+0.300929344 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, release=1761123044, vcs-type=git, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Nov 28 03:45:27 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:45:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:45:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:45:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:45:32 localhost systemd[1]: tmp-crun.XbEmfv.mount: Deactivated successfully. 
Nov 28 03:45:32 localhost podman[90837]: 2025-11-28 08:45:32.866552034 +0000 UTC m=+0.096062722 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, release=1761123044) Nov 28 03:45:32 localhost podman[90837]: 2025-11-28 08:45:32.895393045 +0000 UTC m=+0.124903713 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.12, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 28 03:45:32 localhost systemd[1]: tmp-crun.iJ2a4J.mount: Deactivated successfully. Nov 28 03:45:32 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 03:45:32 localhost podman[90836]: 2025-11-28 08:45:32.915527824 +0000 UTC m=+0.147246782 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, name=rhosp17/openstack-cron, release=1761123044, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, version=17.1.12) Nov 28 03:45:32 localhost podman[90836]: 2025-11-28 08:45:32.923471172 +0000 UTC m=+0.155190150 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, version=17.1.12, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:45:32 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:45:33 localhost podman[90835]: 2025-11-28 08:45:33.027889255 +0000 UTC m=+0.262413510 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 28 03:45:33 localhost podman[90835]: 2025-11-28 08:45:33.06738455 +0000 UTC m=+0.301908855 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-type=git, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute) Nov 28 03:45:33 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:45:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. 
Nov 28 03:45:33 localhost podman[90907]: 2025-11-28 08:45:33.845301946 +0000 UTC m=+0.079630578 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, tcib_managed=true, version=17.1.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, architecture=x86_64, vendor=Red Hat, Inc.) Nov 28 03:45:34 localhost podman[90907]: 2025-11-28 08:45:34.454524053 +0000 UTC m=+0.688852675 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, version=17.1.12, build-date=2025-11-19T00:36:58Z) Nov 28 03:45:34 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:45:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. 
Nov 28 03:45:35 localhost podman[90930]: 2025-11-28 08:45:35.844809705 +0000 UTC m=+0.080902789 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, version=17.1.12, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.expose-services=, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:45:35 localhost podman[90930]: 2025-11-28 08:45:35.877319212 +0000 UTC m=+0.113412236 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vcs-type=git, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, container_name=nova_compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, 
summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, config_id=tripleo_step5, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:45:35 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:45:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:45:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:45:40 localhost podman[90956]: 2025-11-28 08:45:40.820543943 +0000 UTC m=+0.063121323 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, 
managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
version=17.1.12, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1) Nov 28 03:45:40 localhost systemd[1]: tmp-crun.khAJw5.mount: Deactivated successfully. Nov 28 03:45:40 localhost podman[90957]: 2025-11-28 08:45:40.874743907 +0000 UTC m=+0.114830289 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_step4, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, release=1761123044, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team) Nov 28 03:45:40 localhost podman[90956]: 2025-11-28 08:45:40.889519519 +0000 UTC m=+0.132096899 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, container_name=ovn_metadata_agent) Nov 28 03:45:40 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. 
Nov 28 03:45:40 localhost podman[90957]: 2025-11-28 08:45:40.903154785 +0000 UTC m=+0.143241187 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
config_id=tripleo_step4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible) Nov 28 03:45:40 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Deactivated successfully. Nov 28 03:45:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:45:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:45:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:45:57 localhost podman[91005]: 2025-11-28 08:45:57.852073112 +0000 UTC m=+0.081290181 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:45:57 localhost podman[91003]: 2025-11-28 08:45:57.913220353 +0000 UTC m=+0.147333115 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, 
tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64) Nov 28 03:45:57 localhost podman[91003]: 2025-11-28 
08:45:57.965521737 +0000 UTC m=+0.199634499 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, version=17.1.12, managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, tcib_managed=true, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc.) Nov 28 03:45:57 localhost systemd[1]: tmp-crun.vgGeWk.mount: Deactivated successfully. Nov 28 03:45:57 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:45:58 localhost podman[91004]: 2025-11-28 08:45:57.974789147 +0000 UTC m=+0.208744184 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_id=tripleo_step3, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, name=rhosp17/openstack-collectd, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container) Nov 28 03:45:58 localhost podman[91004]: 2025-11-28 08:45:58.059066571 +0000 UTC m=+0.293021558 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, url=https://www.redhat.com, architecture=x86_64, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, batch=17.1_20251118.1, vendor=Red Hat, Inc., container_name=collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-type=git) Nov 28 03:45:58 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:45:58 localhost podman[91005]: 2025-11-28 08:45:58.11185755 +0000 UTC m=+0.341074619 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, container_name=metrics_qdr, io.buildah.version=1.41.4, release=1761123044, managed_by=tripleo_ansible, config_id=tripleo_step1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, build-date=2025-11-18T22:49:46Z, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1) Nov 28 03:45:58 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:46:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:46:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:46:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. 
Nov 28 03:46:03 localhost podman[91148]: 2025-11-28 08:46:03.852751166 +0000 UTC m=+0.081654352 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:46:03 localhost podman[91148]: 2025-11-28 08:46:03.890760873 +0000 UTC m=+0.119664059 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron) Nov 28 03:46:03 localhost podman[91149]: 2025-11-28 08:46:03.906390642 +0000 UTC m=+0.131797969 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, architecture=x86_64, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, release=1761123044, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 28 03:46:03 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:46:03 localhost systemd[1]: tmp-crun.HOeSMz.mount: Deactivated successfully. 
Nov 28 03:46:03 localhost podman[91149]: 2025-11-28 08:46:03.967377538 +0000 UTC m=+0.192784825 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team) Nov 28 03:46:03 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:46:04 localhost podman[91147]: 2025-11-28 08:46:03.970172655 +0000 UTC m=+0.200631970 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 28 03:46:04 localhost podman[91147]: 2025-11-28 08:46:04.053326443 +0000 UTC m=+0.283785798 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO 
Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20251118.1, 
name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 28 03:46:04 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:46:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:46:04 localhost podman[91220]: 2025-11-28 08:46:04.85008134 +0000 UTC m=+0.082681775 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, version=17.1.12, architecture=x86_64, container_name=nova_migration_target, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, vcs-type=git) Nov 28 03:46:05 localhost podman[91220]: 2025-11-28 08:46:05.225266343 +0000 UTC m=+0.457866768 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, 
tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, release=1761123044, url=https://www.redhat.com) Nov 28 03:46:05 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. 
Nov 28 03:46:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:46:06 localhost podman[91244]: 2025-11-28 08:46:06.870223933 +0000 UTC m=+0.080222858 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, container_name=nova_compute, config_id=tripleo_step5, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:46:06 localhost podman[91244]: 2025-11-28 08:46:06.901382637 +0000 UTC m=+0.111381572 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, 
tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc.) Nov 28 03:46:06 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:46:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:46:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:46:11 localhost systemd[1]: tmp-crun.FYSJLZ.mount: Deactivated successfully. 
Nov 28 03:46:11 localhost podman[91272]: 2025-11-28 08:46:11.882237884 +0000 UTC m=+0.114679395 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-18T23:34:05Z, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, description=Red 
Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team) Nov 28 03:46:11 localhost podman[91271]: 2025-11-28 08:46:11.937891783 +0000 UTC m=+0.172836052 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 28 03:46:11 localhost podman[91272]: 2025-11-28 08:46:11.938478531 +0000 UTC m=+0.170920052 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, version=17.1.12, tcib_managed=true) Nov 28 03:46:11 localhost podman[91272]: unhealthy Nov 28 03:46:12 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:46:12 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. 
Nov 28 03:46:12 localhost podman[91271]: 2025-11-28 08:46:12.013358141 +0000 UTC m=+0.248302480 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, architecture=x86_64, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git) Nov 28 03:46:12 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:46:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:46:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:46:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:46:28 localhost podman[91321]: 2025-11-28 08:46:28.83631154 +0000 UTC m=+0.070565596 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, tcib_managed=true, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, distribution-scope=public, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12) Nov 28 03:46:28 localhost podman[91322]: 2025-11-28 08:46:28.90032977 +0000 UTC m=+0.130453047 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, distribution-scope=public, architecture=x86_64, container_name=collectd, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, tcib_managed=true, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd) Nov 28 03:46:28 localhost podman[91322]: 2025-11-28 08:46:28.93838128 +0000 UTC m=+0.168504577 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, version=17.1.12, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, container_name=collectd, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, tcib_managed=true) Nov 28 03:46:28 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:46:28 localhost podman[91323]: 2025-11-28 08:46:28.953217863 +0000 UTC m=+0.180121499 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, container_name=metrics_qdr, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, release=1761123044, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Nov 28 03:46:28 localhost podman[91321]: 2025-11-28 08:46:28.972059182 +0000 UTC m=+0.206313298 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, build-date=2025-11-18T23:44:13Z, container_name=iscsid, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step3, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, vcs-type=git, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Nov 28 03:46:28 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 03:46:29 localhost podman[91323]: 2025-11-28 08:46:29.154434981 +0000 UTC m=+0.381338647 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 28 03:46:29 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:46:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:46:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:46:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. 
Nov 28 03:46:34 localhost podman[91389]: 2025-11-28 08:46:34.848614737 +0000 UTC m=+0.088806125 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, vcs-type=git, container_name=ceilometer_agent_compute, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:46:34 localhost podman[91391]: 2025-11-28 08:46:34.895254815 +0000 UTC m=+0.130412607 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.12, managed_by=tripleo_ansible, vcs-type=git, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, config_id=tripleo_step4) Nov 28 03:46:34 localhost podman[91389]: 2025-11-28 08:46:34.906546057 +0000 UTC m=+0.146737425 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, 
batch=17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, config_id=tripleo_step4, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, 
release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 28 03:46:34 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:46:34 localhost systemd[1]: tmp-crun.wM0B8c.mount: Deactivated successfully. Nov 28 03:46:34 localhost podman[91390]: 2025-11-28 08:46:34.951881044 +0000 UTC m=+0.187349156 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, container_name=logrotate_crond, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Nov 28 03:46:34 localhost podman[91391]: 2025-11-28 08:46:34.956356694 +0000 UTC m=+0.191514476 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 28 03:46:34 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 03:46:34 localhost podman[91390]: 2025-11-28 08:46:34.980426365 +0000 UTC m=+0.215894477 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, version=17.1.12, container_name=logrotate_crond, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z) Nov 28 03:46:34 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:46:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:46:35 localhost podman[91462]: 2025-11-28 08:46:35.840048686 +0000 UTC m=+0.072183287 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, maintainer=OpenStack TripleO Team, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, distribution-scope=public, container_name=nova_migration_target, vcs-type=git, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:46:36 localhost podman[91462]: 2025-11-28 08:46:36.201409527 +0000 UTC m=+0.433544148 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, release=1761123044, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, 
version=17.1.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:46:36 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:46:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:46:37 localhost podman[91483]: 2025-11-28 08:46:37.84867927 +0000 UTC m=+0.082019574 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1761123044, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12) Nov 28 03:46:37 localhost podman[91483]: 2025-11-28 08:46:37.87587648 +0000 UTC m=+0.109216784 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, 
maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1) Nov 28 03:46:37 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:46:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:46:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:46:42 localhost systemd[1]: tmp-crun.NmEX8q.mount: Deactivated successfully. 
Nov 28 03:46:42 localhost podman[91509]: 2025-11-28 08:46:42.857675997 +0000 UTC m=+0.090029054 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible) Nov 28 03:46:42 localhost podman[91510]: 2025-11-28 08:46:42.905265754 +0000 UTC m=+0.133998998 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, config_id=tripleo_step4, 
container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1) Nov 28 03:46:42 localhost podman[91509]: 2025-11-28 08:46:42.911631073 +0000 UTC m=+0.143984130 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 28 03:46:42 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Deactivated successfully. Nov 28 03:46:42 localhost podman[91510]: 2025-11-28 08:46:42.929890654 +0000 UTC m=+0.158623888 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1761123044, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Nov 28 03:46:42 localhost podman[91510]: unhealthy Nov 28 03:46:42 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:46:42 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. Nov 28 03:46:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:46:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:46:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:46:59 localhost systemd[1]: tmp-crun.FOuSQa.mount: Deactivated successfully. 
Nov 28 03:46:59 localhost podman[91576]: 2025-11-28 08:46:59.647170722 +0000 UTC m=+0.102563106 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, container_name=collectd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, version=17.1.12) Nov 28 03:46:59 localhost podman[91576]: 2025-11-28 08:46:59.66150799 +0000 UTC m=+0.116900384 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, version=17.1.12, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd) Nov 28 03:46:59 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:46:59 localhost systemd[1]: tmp-crun.VPukj1.mount: Deactivated successfully. 
Nov 28 03:46:59 localhost podman[91577]: 2025-11-28 08:46:59.753463863 +0000 UTC m=+0.206288586 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, architecture=x86_64, container_name=metrics_qdr, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, batch=17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible) Nov 28 03:46:59 localhost podman[91574]: 2025-11-28 08:46:59.785869446 +0000 UTC m=+0.244730908 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, release=1761123044, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64) Nov 28 03:46:59 localhost podman[91574]: 2025-11-28 08:46:59.800592216 +0000 UTC m=+0.259453628 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, distribution-scope=public, config_id=tripleo_step3, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, 
container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git) Nov 28 03:46:59 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 03:46:59 localhost podman[91577]: 2025-11-28 08:46:59.993302828 +0000 UTC m=+0.446127541 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, release=1761123044, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:47:00 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:47:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:47:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:47:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. 
Nov 28 03:47:05 localhost podman[91702]: 2025-11-28 08:47:05.873257419 +0000 UTC m=+0.091119419 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=) Nov 28 03:47:05 localhost systemd[1]: tmp-crun.0zIsXx.mount: Deactivated successfully. Nov 28 03:47:05 localhost podman[91701]: 2025-11-28 08:47:05.912851976 +0000 UTC m=+0.131882472 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, vcs-type=git, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team) Nov 28 03:47:05 localhost podman[91703]: 2025-11-28 08:47:05.972691735 +0000 UTC m=+0.188164301 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
name=rhosp17/openstack-ceilometer-ipmi) Nov 28 03:47:05 localhost podman[91701]: 2025-11-28 08:47:05.973531952 +0000 UTC m=+0.192562538 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, container_name=ceilometer_agent_compute, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:47:05 localhost podman[91702]: 2025-11-28 08:47:05.989205022 +0000 UTC m=+0.207067022 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_id=tripleo_step4, version=17.1.12, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, distribution-scope=public, release=1761123044, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Nov 28 03:47:05 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:47:06 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:47:06 localhost podman[91703]: 2025-11-28 08:47:06.053070127 +0000 UTC m=+0.268542632 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044) Nov 28 03:47:06 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:47:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:47:06 localhost podman[91773]: 2025-11-28 08:47:06.851099354 +0000 UTC m=+0.085710979 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, version=17.1.12, name=rhosp17/openstack-nova-compute, architecture=x86_64, distribution-scope=public, release=1761123044, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team) Nov 28 03:47:07 localhost podman[91773]: 2025-11-28 08:47:07.219463295 +0000 UTC m=+0.454074880 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-type=git, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4) Nov 28 03:47:07 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:47:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:47:08 localhost podman[91797]: 2025-11-28 08:47:08.839226758 +0000 UTC m=+0.078100272 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1761123044, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, version=17.1.12) Nov 28 03:47:08 localhost podman[91797]: 2025-11-28 08:47:08.870393741 +0000 UTC m=+0.109267255 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, 
name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, tcib_managed=true, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, vcs-type=git, name=rhosp17/openstack-nova-compute, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 28 03:47:08 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:47:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:47:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:47:13 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 28 03:47:13 localhost recover_tripleo_nova_virtqemud[91836]: 61397 Nov 28 03:47:13 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 28 03:47:13 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 28 03:47:13 localhost systemd[1]: tmp-crun.JxlQYc.mount: Deactivated successfully. 
Nov 28 03:47:13 localhost podman[91823]: 2025-11-28 08:47:13.854263622 +0000 UTC m=+0.095041540 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=ovn_metadata_agent, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, distribution-scope=public, config_id=tripleo_step4, batch=17.1_20251118.1) Nov 28 03:47:13 localhost podman[91823]: 2025-11-28 08:47:13.87145951 +0000 UTC m=+0.112237478 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc.) Nov 28 03:47:13 localhost podman[91823]: unhealthy Nov 28 03:47:13 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:47:13 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 03:47:13 localhost podman[91824]: 2025-11-28 08:47:13.957206869 +0000 UTC m=+0.195433038 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, release=1761123044, 
maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, container_name=ovn_controller, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4) Nov 28 03:47:13 localhost podman[91824]: 2025-11-28 08:47:13.970934928 +0000 UTC m=+0.209161087 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:47:13 localhost podman[91824]: unhealthy Nov 28 03:47:13 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:47:13 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. Nov 28 03:47:14 localhost systemd[1]: tmp-crun.uwgKKx.mount: Deactivated successfully. Nov 28 03:47:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. 
Nov 28 03:47:29 localhost podman[91865]: 2025-11-28 08:47:29.850568401 +0000 UTC m=+0.085150452 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, container_name=collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:47:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:47:29 localhost podman[91865]: 2025-11-28 08:47:29.862346698 +0000 UTC m=+0.096928749 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public) Nov 28 03:47:29 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. 
Nov 28 03:47:29 localhost systemd[1]: tmp-crun.euHbTd.mount: Deactivated successfully. Nov 28 03:47:29 localhost podman[91885]: 2025-11-28 08:47:29.954196599 +0000 UTC m=+0.083847211 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., container_name=iscsid, version=17.1.12, 
com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Nov 28 03:47:29 localhost podman[91885]: 2025-11-28 08:47:29.963452888 +0000 UTC m=+0.093103520 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, version=17.1.12, release=1761123044, managed_by=tripleo_ansible, config_id=tripleo_step3, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1) Nov 28 03:47:29 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:47:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:47:30 localhost systemd[1]: tmp-crun.yEUeO7.mount: Deactivated successfully. 
Nov 28 03:47:30 localhost podman[91904]: 2025-11-28 08:47:30.869550331 +0000 UTC m=+0.102834514 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:47:31 localhost podman[91904]: 2025-11-28 08:47:31.065557566 +0000 UTC m=+0.298841759 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible) Nov 28 03:47:31 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:47:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:47:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:47:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. 
Nov 28 03:47:36 localhost podman[91936]: 2025-11-28 08:47:36.858879042 +0000 UTC m=+0.088045003 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-type=git, version=17.1.12, distribution-scope=public, release=1761123044, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 28 03:47:36 localhost podman[91936]: 2025-11-28 08:47:36.893436971 +0000 UTC m=+0.122602902 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-type=git, distribution-scope=public, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.expose-services=, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container) Nov 28 03:47:36 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 03:47:36 localhost podman[91935]: 2025-11-28 08:47:36.916319196 +0000 UTC m=+0.145982963 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, architecture=x86_64, release=1761123044, build-date=2025-11-18T22:49:32Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, vcs-type=git) Nov 28 03:47:36 localhost podman[91935]: 2025-11-28 08:47:36.949456312 +0000 UTC m=+0.179120129 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, url=https://www.redhat.com, container_name=logrotate_crond, managed_by=tripleo_ansible, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64) Nov 28 03:47:36 localhost podman[91934]: 2025-11-28 08:47:36.964391948 +0000 UTC m=+0.197441870 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044) Nov 28 03:47:36 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:47:37 localhost podman[91934]: 2025-11-28 08:47:37.0185152 +0000 UTC m=+0.251565082 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, tcib_managed=true, release=1761123044, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4) Nov 28 03:47:37 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:47:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:47:37 localhost podman[92006]: 2025-11-28 08:47:37.856132263 +0000 UTC m=+0.091003275 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, container_name=nova_migration_target, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat 
OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git) Nov 28 03:47:38 localhost podman[92006]: 2025-11-28 08:47:38.265533585 +0000 UTC m=+0.500404577 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 
nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:47:38 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:47:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:47:39 localhost podman[92029]: 2025-11-28 08:47:39.845581037 +0000 UTC m=+0.079696732 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, release=1761123044, name=rhosp17/openstack-nova-compute, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 
'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:47:39 localhost podman[92029]: 2025-11-28 08:47:39.8815233 +0000 UTC m=+0.115638985 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, architecture=x86_64, config_id=tripleo_step5) Nov 28 03:47:39 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:47:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:47:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:47:44 localhost podman[92055]: 2025-11-28 08:47:44.852670365 +0000 UTC m=+0.084565514 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, batch=17.1_20251118.1) Nov 28 03:47:44 localhost podman[92055]: 2025-11-28 08:47:44.868540721 +0000 UTC m=+0.100435830 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, vcs-type=git, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, 
config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.41.4) Nov 28 03:47:44 localhost podman[92055]: unhealthy Nov 28 03:47:44 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:47:44 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 03:47:44 localhost systemd[1]: tmp-crun.ALqc1q.mount: Deactivated successfully. Nov 28 03:47:44 localhost podman[92056]: 2025-11-28 08:47:44.969133434 +0000 UTC m=+0.197901665 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, 
name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, release=1761123044) Nov 28 03:47:45 localhost podman[92056]: 2025-11-28 08:47:45.041594858 +0000 UTC m=+0.270363109 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1761123044, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, 
architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Nov 28 03:47:45 localhost podman[92056]: unhealthy Nov 28 03:47:45 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:47:45 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. Nov 28 03:48:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:48:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. 
Nov 28 03:48:00 localhost podman[92097]: 2025-11-28 08:48:00.848640894 +0000 UTC m=+0.081275131 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, url=https://www.redhat.com, distribution-scope=public, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, io.openshift.expose-services=, vcs-type=git) Nov 28 03:48:00 localhost podman[92097]: 2025-11-28 08:48:00.857423798 +0000 UTC m=+0.090057995 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, architecture=x86_64, io.openshift.expose-services=, release=1761123044, config_id=tripleo_step3, batch=17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, maintainer=OpenStack TripleO Team) Nov 28 03:48:00 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. 
Nov 28 03:48:00 localhost podman[92096]: 2025-11-28 08:48:00.899797043 +0000 UTC m=+0.134748573 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, version=17.1.12, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, io.buildah.version=1.41.4, 
vcs-type=git, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z) Nov 28 03:48:00 localhost podman[92096]: 2025-11-28 08:48:00.934866448 +0000 UTC m=+0.169817938 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, com.redhat.component=openstack-iscsid-container, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:48:00 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:48:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:48:01 localhost podman[92135]: 2025-11-28 08:48:01.835902783 +0000 UTC m=+0.076742979 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, version=17.1.12, summary=Red 
Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, maintainer=OpenStack TripleO Team) Nov 28 03:48:02 localhost podman[92135]: 2025-11-28 08:48:02.050950802 +0000 UTC m=+0.291791008 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, release=1761123044, vendor=Red Hat, Inc., container_name=metrics_qdr, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, version=17.1.12, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Nov 28 03:48:02 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:48:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:48:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:48:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. 
Nov 28 03:48:07 localhost podman[92277]: 2025-11-28 08:48:07.861454174 +0000 UTC m=+0.087656761 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, version=17.1.12) Nov 28 03:48:07 localhost podman[92275]: 2025-11-28 08:48:07.912976863 +0000 UTC m=+0.146608741 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, distribution-scope=public) Nov 28 03:48:07 localhost podman[92277]: 2025-11-28 08:48:07.919129216 +0000 UTC m=+0.145331823 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public) 
Nov 28 03:48:07 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:48:07 localhost podman[92275]: 2025-11-28 08:48:07.948484533 +0000 UTC m=+0.182116401 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:48:07 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:48:08 localhost podman[92276]: 2025-11-28 08:48:08.014037291 +0000 UTC m=+0.242238330 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, architecture=x86_64, vendor=Red Hat, 
Inc., build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container) Nov 28 03:48:08 localhost podman[92276]: 2025-11-28 08:48:08.023876789 +0000 UTC m=+0.252077748 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, container_name=logrotate_crond, release=1761123044, vcs-type=git, io.openshift.expose-services=, 
version=17.1.12, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.4, 
name=rhosp17/openstack-cron) Nov 28 03:48:08 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:48:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:48:08 localhost podman[92347]: 2025-11-28 08:48:08.832540587 +0000 UTC m=+0.073224459 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, distribution-scope=public, config_id=tripleo_step4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container) Nov 28 03:48:09 localhost podman[92347]: 2025-11-28 08:48:09.175469533 +0000 UTC m=+0.416153405 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, tcib_managed=true, version=17.1.12, config_id=tripleo_step4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc.) Nov 28 03:48:09 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:48:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. 
Nov 28 03:48:10 localhost podman[92385]: 2025-11-28 08:48:10.854410115 +0000 UTC m=+0.086525555 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, vendor=Red Hat, Inc., release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:48:10 localhost podman[92385]: 2025-11-28 08:48:10.890506133 +0000 UTC m=+0.122621623 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 
nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.) Nov 28 03:48:10 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:48:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:48:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:48:15 localhost podman[92413]: 2025-11-28 08:48:15.856809807 +0000 UTC m=+0.084405799 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, version=17.1.12, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, 
release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true) Nov 28 03:48:15 localhost podman[92413]: 2025-11-28 08:48:15.87355124 +0000 UTC m=+0.101147222 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, version=17.1.12, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': 
'/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, config_id=tripleo_step4, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Nov 28 03:48:15 localhost podman[92413]: unhealthy Nov 28 03:48:15 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:48:15 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. Nov 28 03:48:15 localhost systemd[1]: tmp-crun.vtF9Jz.mount: Deactivated successfully. 
Nov 28 03:48:15 localhost podman[92412]: 2025-11-28 08:48:15.975711692 +0000 UTC m=+0.205195223 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, vcs-type=git, release=1761123044, distribution-scope=public, architecture=x86_64, build-date=2025-11-19T00:14:25Z) Nov 28 03:48:15 localhost podman[92412]: 2025-11-28 08:48:15.994504769 +0000 UTC m=+0.223988260 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, tcib_managed=true, version=17.1.12, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4) Nov 28 03:48:16 localhost podman[92412]: unhealthy Nov 28 03:48:16 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:48:16 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 03:48:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:48:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:48:31 localhost podman[92453]: 2025-11-28 08:48:31.844903699 +0000 UTC m=+0.084309865 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, vcs-type=git, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, release=1761123044, distribution-scope=public, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Nov 28 03:48:31 localhost podman[92453]: 2025-11-28 08:48:31.88241155 +0000 UTC m=+0.121817686 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, tcib_managed=true, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:48:31 localhost systemd[1]: 
08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:48:31 localhost podman[92454]: 2025-11-28 08:48:31.902068285 +0000 UTC m=+0.140734459 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_id=tripleo_step3, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4) Nov 28 03:48:31 localhost podman[92454]: 2025-11-28 08:48:31.915411272 +0000 UTC m=+0.154077406 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, version=17.1.12, release=1761123044, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4) Nov 28 03:48:31 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. 
Nov 28 03:48:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:48:32 localhost systemd[1]: tmp-crun.lY7M7V.mount: Deactivated successfully. Nov 28 03:48:32 localhost podman[92490]: 2025-11-28 08:48:32.850357806 +0000 UTC m=+0.081038073 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 
'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, vendor=Red Hat, Inc.) Nov 28 03:48:33 localhost podman[92490]: 2025-11-28 08:48:33.073332343 +0000 UTC m=+0.304012560 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, release=1761123044, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, 
io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, container_name=metrics_qdr) Nov 28 03:48:33 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. 
Nov 28 03:48:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 28 03:48:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3600.1 total, 600.0 interval#012Cumulative writes: 4899 writes, 22K keys, 4899 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4899 writes, 608 syncs, 8.06 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 28 03:48:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:48:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:48:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. 
Nov 28 03:48:38 localhost podman[92521]: 2025-11-28 08:48:38.845736325 +0000 UTC m=+0.081621971 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, tcib_managed=true, container_name=logrotate_crond, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Nov 28 03:48:38 localhost podman[92521]: 2025-11-28 08:48:38.858295687 +0000 UTC m=+0.094181333 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, container_name=logrotate_crond, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true) Nov 28 03:48:38 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:48:38 localhost systemd[1]: tmp-crun.MzVwCQ.mount: Deactivated successfully. 
Nov 28 03:48:38 localhost podman[92520]: 2025-11-28 08:48:38.96525619 +0000 UTC m=+0.204791381 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, tcib_managed=true, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com) Nov 28 03:48:39 localhost podman[92520]: 2025-11-28 08:48:39.019395191 +0000 UTC m=+0.258930322 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, release=1761123044) Nov 28 03:48:39 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 03:48:39 localhost podman[92522]: 2025-11-28 08:48:39.106690929 +0000 UTC m=+0.338565190 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, release=1761123044, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12)
Nov 28 03:48:39 localhost systemd[1]: tmp-crun.iAk3uG.mount: Deactivated successfully.
Nov 28 03:48:39 localhost podman[92522]: 2025-11-28 08:48:39.137346737 +0000 UTC m=+0.369221038 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, release=1761123044, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, url=https://www.redhat.com)
Nov 28 03:48:39 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 03:48:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 03:48:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3600.3 total, 600.0 interval#012Cumulative writes: 5616 writes, 25K keys, 5616 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 5616 writes, 758 syncs, 7.41 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 03:48:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 03:48:39 localhost podman[92594]: 2025-11-28 08:48:39.849162849 +0000 UTC m=+0.080277520 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 03:48:40 localhost podman[92594]: 2025-11-28 08:48:40.22361041 +0000 UTC m=+0.454725091 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team)
Nov 28 03:48:40 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully.
Nov 28 03:48:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 03:48:41 localhost systemd[1]: tmp-crun.a90Vaj.mount: Deactivated successfully.
Nov 28 03:48:41 localhost podman[92617]: 2025-11-28 08:48:41.844763526 +0000 UTC m=+0.082898811 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 03:48:41 localhost podman[92617]: 2025-11-28 08:48:41.875351752 +0000 UTC m=+0.113487027 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, tcib_managed=true, container_name=nova_compute, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 03:48:41 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully.
Nov 28 03:48:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.
Nov 28 03:48:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.
Nov 28 03:48:46 localhost systemd[1]: tmp-crun.KumBMN.mount: Deactivated successfully.
Nov 28 03:48:46 localhost podman[92643]: 2025-11-28 08:48:46.861445861 +0000 UTC m=+0.084777379 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step4, release=1761123044, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Nov 28 03:48:46 localhost podman[92643]: 2025-11-28 08:48:46.869045209 +0000 UTC m=+0.092376747 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, vcs-type=git, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, tcib_managed=true, container_name=ovn_metadata_agent, config_id=tripleo_step4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public)
Nov 28 03:48:46 localhost podman[92643]: unhealthy
Nov 28 03:48:46 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 03:48:46 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'.
Nov 28 03:48:46 localhost podman[92644]: 2025-11-28 08:48:46.953512248 +0000 UTC m=+0.173445180 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1)
Nov 28 03:48:46 localhost podman[92644]: 2025-11-28 08:48:46.972319016 +0000 UTC m=+0.192251918 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step4, release=1761123044, url=https://www.redhat.com, container_name=ovn_controller, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1)
Nov 28 03:48:46 localhost podman[92644]: unhealthy
Nov 28 03:48:46 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 03:48:46 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 03:49:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 03:49:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 03:49:02 localhost podman[92685]: 2025-11-28 08:49:02.841483862 +0000 UTC m=+0.084885824 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, container_name=iscsid, name=rhosp17/openstack-iscsid, version=17.1.12, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true)
Nov 28 03:49:02 localhost podman[92685]: 2025-11-28 08:49:02.877261829 +0000 UTC m=+0.120663781 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, batch=17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-18T23:44:13Z, vcs-type=git, config_id=tripleo_step3, managed_by=tripleo_ansible)
Nov 28 03:49:02 localhost systemd[1]: tmp-crun.kdFX3A.mount: Deactivated successfully.
Nov 28 03:49:02 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 03:49:02 localhost podman[92686]: 2025-11-28 08:49:02.90063919 +0000 UTC m=+0.140940735 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=collectd, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, version=17.1.12)
Nov 28 03:49:02 localhost podman[92686]: 2025-11-28 08:49:02.910408655 +0000 UTC m=+0.150710210 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, config_id=tripleo_step3, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 03:49:02 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 03:49:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 03:49:03 localhost podman[92726]: 2025-11-28 08:49:03.845397301 +0000 UTC m=+0.079997481 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., container_name=metrics_qdr, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=) Nov 28 03:49:04 localhost podman[92726]: 2025-11-28 08:49:04.073502528 +0000 UTC m=+0.308102688 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, container_name=metrics_qdr, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., 
config_id=tripleo_step1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Nov 28 03:49:04 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:49:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:49:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:49:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:49:09 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... 
Nov 28 03:49:09 localhost recover_tripleo_nova_virtqemud[92786]: 61397 Nov 28 03:49:09 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 28 03:49:09 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 28 03:49:09 localhost systemd[1]: tmp-crun.s31sHo.mount: Deactivated successfully. Nov 28 03:49:09 localhost podman[92770]: 2025-11-28 08:49:09.760481531 +0000 UTC m=+0.103753833 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, version=17.1.12) Nov 28 03:49:09 localhost podman[92772]: 2025-11-28 08:49:09.821661002 +0000 UTC m=+0.162848040 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, architecture=x86_64, 
io.buildah.version=1.41.4, distribution-scope=public, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, version=17.1.12, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:49:09 localhost podman[92772]: 2025-11-28 08:49:09.830706405 +0000 UTC m=+0.171893402 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, 
name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Nov 28 03:49:09 localhost podman[92773]: 2025-11-28 08:49:09.860132795 +0000 UTC m=+0.196128089 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-19T00:12:45Z, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4) Nov 28 03:49:09 localhost podman[92770]: 2025-11-28 08:49:09.891285668 +0000 UTC m=+0.234557960 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.openshift.expose-services=, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true) Nov 28 03:49:09 localhost podman[92773]: 2025-11-28 08:49:09.889399229 +0000 UTC m=+0.225394563 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container) Nov 28 03:49:09 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:49:09 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:49:09 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:49:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:49:10 localhost podman[92905]: 2025-11-28 08:49:10.669499135 +0000 UTC m=+0.086823454 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vendor=Red Hat, Inc., container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, io.buildah.version=1.41.4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, build-date=2025-11-19T00:36:58Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:49:11 localhost podman[92905]: 2025-11-28 08:49:11.042250123 +0000 UTC m=+0.459574392 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, io.buildah.version=1.41.4, 
batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:49:11 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:49:11 localhost podman[92985]: Nov 28 03:49:11 localhost podman[92985]: 2025-11-28 08:49:11.219534112 +0000 UTC m=+0.083168009 container create f38c474b3edba19cc96451ffef18fff014c31ffa8f1a2f2234aaf3a23bf9a4fa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_varahamihira, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , ceph=True, vendor=Red Hat, Inc., RELEASE=main, CEPH_POINT_RELEASE=, GIT_BRANCH=main, release=553, name=rhceph, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, architecture=x86_64, vcs-type=git) Nov 28 03:49:11 localhost systemd[1]: Started libpod-conmon-f38c474b3edba19cc96451ffef18fff014c31ffa8f1a2f2234aaf3a23bf9a4fa.scope. Nov 28 03:49:11 localhost podman[92985]: 2025-11-28 08:49:11.185932282 +0000 UTC m=+0.049566179 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 03:49:11 localhost systemd[1]: Started libcrun container. 
Nov 28 03:49:11 localhost podman[92985]: 2025-11-28 08:49:11.301749282 +0000 UTC m=+0.165383169 container init f38c474b3edba19cc96451ffef18fff014c31ffa8f1a2f2234aaf3a23bf9a4fa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_varahamihira, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, version=7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, maintainer=Guillaume Abrioux , vcs-type=git, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, distribution-scope=public, RELEASE=main) Nov 28 03:49:11 localhost podman[92985]: 2025-11-28 08:49:11.314759298 +0000 UTC m=+0.178393185 container start f38c474b3edba19cc96451ffef18fff014c31ffa8f1a2f2234aaf3a23bf9a4fa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_varahamihira, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, GIT_BRANCH=main, summary=Provides the latest Red Hat 
Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, maintainer=Guillaume Abrioux , ceph=True, io.buildah.version=1.33.12, io.openshift.expose-services=, name=rhceph, vendor=Red Hat, Inc., version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, vcs-type=git, RELEASE=main) Nov 28 03:49:11 localhost podman[92985]: 2025-11-28 08:49:11.315129059 +0000 UTC m=+0.178762996 container attach f38c474b3edba19cc96451ffef18fff014c31ffa8f1a2f2234aaf3a23bf9a4fa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_varahamihira, ceph=True, RELEASE=main, version=7, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, name=rhceph, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, release=553, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True) Nov 28 03:49:11 localhost charming_varahamihira[93001]: 167 167 Nov 28 03:49:11 localhost systemd[1]: 
libpod-f38c474b3edba19cc96451ffef18fff014c31ffa8f1a2f2234aaf3a23bf9a4fa.scope: Deactivated successfully. Nov 28 03:49:11 localhost podman[92985]: 2025-11-28 08:49:11.319252308 +0000 UTC m=+0.182886225 container died f38c474b3edba19cc96451ffef18fff014c31ffa8f1a2f2234aaf3a23bf9a4fa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_varahamihira, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-type=git, io.openshift.tags=rhceph ceph, RELEASE=main, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.openshift.expose-services=, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, version=7, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, ceph=True) Nov 28 03:49:11 localhost podman[93006]: 2025-11-28 08:49:11.427116008 +0000 UTC m=+0.092790450 container remove f38c474b3edba19cc96451ffef18fff014c31ffa8f1a2f2234aaf3a23bf9a4fa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_varahamihira, vcs-type=git, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., GIT_CLEAN=True, name=rhceph, 
io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, ceph=True, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, description=Red Hat Ceph Storage 7, RELEASE=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, release=553) Nov 28 03:49:11 localhost systemd[1]: libpod-conmon-f38c474b3edba19cc96451ffef18fff014c31ffa8f1a2f2234aaf3a23bf9a4fa.scope: Deactivated successfully. Nov 28 03:49:11 localhost podman[93027]: Nov 28 03:49:11 localhost podman[93027]: 2025-11-28 08:49:11.664692642 +0000 UTC m=+0.080386363 container create 7937f315b75cd4b852b2d3c86bae8d967bcc45ac6d1e9ee12db7e0cf1b89533a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_albattani, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, release=553, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, version=7, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, io.openshift.expose-services=, io.buildah.version=1.33.12, summary=Provides the latest Red Hat 
Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 28 03:49:11 localhost systemd[1]: Started libpod-conmon-7937f315b75cd4b852b2d3c86bae8d967bcc45ac6d1e9ee12db7e0cf1b89533a.scope. Nov 28 03:49:11 localhost systemd[1]: Started libcrun container. Nov 28 03:49:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9dc99256477a60c2659e433e2fdd6d92a53e343843f6c595e3ad75eb7c063e49/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 28 03:49:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9dc99256477a60c2659e433e2fdd6d92a53e343843f6c595e3ad75eb7c063e49/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 28 03:49:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9dc99256477a60c2659e433e2fdd6d92a53e343843f6c595e3ad75eb7c063e49/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 28 03:49:11 localhost podman[93027]: 2025-11-28 08:49:11.728185806 +0000 UTC m=+0.143879527 container init 7937f315b75cd4b852b2d3c86bae8d967bcc45ac6d1e9ee12db7e0cf1b89533a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_albattani, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., name=rhceph, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, version=7, description=Red Hat Ceph Storage 7, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.buildah.version=1.33.12, 
io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, release=553, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 28 03:49:11 localhost podman[93027]: 2025-11-28 08:49:11.632679742 +0000 UTC m=+0.048373503 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 03:49:11 localhost podman[93027]: 2025-11-28 08:49:11.738959953 +0000 UTC m=+0.154653684 container start 7937f315b75cd4b852b2d3c86bae8d967bcc45ac6d1e9ee12db7e0cf1b89533a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_albattani, GIT_BRANCH=main, vendor=Red Hat, Inc., GIT_CLEAN=True, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, architecture=x86_64, version=7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, build-date=2025-09-24T08:57:55, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, ceph=True) Nov 28 03:49:11 localhost podman[93027]: 2025-11-28 08:49:11.739320804 +0000 UTC m=+0.155014575 container attach 7937f315b75cd4b852b2d3c86bae8d967bcc45ac6d1e9ee12db7e0cf1b89533a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_albattani, GIT_BRANCH=main, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, name=rhceph, build-date=2025-09-24T08:57:55, RELEASE=main, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553) Nov 28 03:49:11 localhost systemd[1]: var-lib-containers-storage-overlay-7e41374e26fe1058b6fc5ed7dafc746a4518e7bdef00d193a7c5d6e3a115c4ea-merged.mount: Deactivated successfully. 
Nov 28 03:49:12 localhost sharp_albattani[93043]: [ Nov 28 03:49:12 localhost sharp_albattani[93043]: { Nov 28 03:49:12 localhost sharp_albattani[93043]: "available": false, Nov 28 03:49:12 localhost sharp_albattani[93043]: "ceph_device": false, Nov 28 03:49:12 localhost sharp_albattani[93043]: "device_id": "QEMU_DVD-ROM_QM00001", Nov 28 03:49:12 localhost sharp_albattani[93043]: "lsm_data": {}, Nov 28 03:49:12 localhost sharp_albattani[93043]: "lvs": [], Nov 28 03:49:12 localhost sharp_albattani[93043]: "path": "/dev/sr0", Nov 28 03:49:12 localhost sharp_albattani[93043]: "rejected_reasons": [ Nov 28 03:49:12 localhost sharp_albattani[93043]: "Insufficient space (<5GB)", Nov 28 03:49:12 localhost sharp_albattani[93043]: "Has a FileSystem" Nov 28 03:49:12 localhost sharp_albattani[93043]: ], Nov 28 03:49:12 localhost sharp_albattani[93043]: "sys_api": { Nov 28 03:49:12 localhost sharp_albattani[93043]: "actuators": null, Nov 28 03:49:12 localhost sharp_albattani[93043]: "device_nodes": "sr0", Nov 28 03:49:12 localhost sharp_albattani[93043]: "human_readable_size": "482.00 KB", Nov 28 03:49:12 localhost sharp_albattani[93043]: "id_bus": "ata", Nov 28 03:49:12 localhost sharp_albattani[93043]: "model": "QEMU DVD-ROM", Nov 28 03:49:12 localhost sharp_albattani[93043]: "nr_requests": "2", Nov 28 03:49:12 localhost sharp_albattani[93043]: "partitions": {}, Nov 28 03:49:12 localhost sharp_albattani[93043]: "path": "/dev/sr0", Nov 28 03:49:12 localhost sharp_albattani[93043]: "removable": "1", Nov 28 03:49:12 localhost sharp_albattani[93043]: "rev": "2.5+", Nov 28 03:49:12 localhost sharp_albattani[93043]: "ro": "0", Nov 28 03:49:12 localhost sharp_albattani[93043]: "rotational": "1", Nov 28 03:49:12 localhost sharp_albattani[93043]: "sas_address": "", Nov 28 03:49:12 localhost sharp_albattani[93043]: "sas_device_handle": "", Nov 28 03:49:12 localhost sharp_albattani[93043]: "scheduler_mode": "mq-deadline", Nov 28 03:49:12 localhost sharp_albattani[93043]: "sectors": 0, 
Nov 28 03:49:12 localhost sharp_albattani[93043]: "sectorsize": "2048", Nov 28 03:49:12 localhost sharp_albattani[93043]: "size": 493568.0, Nov 28 03:49:12 localhost sharp_albattani[93043]: "support_discard": "0", Nov 28 03:49:12 localhost sharp_albattani[93043]: "type": "disk", Nov 28 03:49:12 localhost sharp_albattani[93043]: "vendor": "QEMU" Nov 28 03:49:12 localhost sharp_albattani[93043]: } Nov 28 03:49:12 localhost sharp_albattani[93043]: } Nov 28 03:49:12 localhost sharp_albattani[93043]: ] Nov 28 03:49:12 localhost systemd[1]: libpod-7937f315b75cd4b852b2d3c86bae8d967bcc45ac6d1e9ee12db7e0cf1b89533a.scope: Deactivated successfully. Nov 28 03:49:12 localhost podman[93027]: 2025-11-28 08:49:12.697388051 +0000 UTC m=+1.113081822 container died 7937f315b75cd4b852b2d3c86bae8d967bcc45ac6d1e9ee12db7e0cf1b89533a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_albattani, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, GIT_CLEAN=True, architecture=x86_64, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, build-date=2025-09-24T08:57:55, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, release=553, vendor=Red Hat, Inc., ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, vcs-type=git, CEPH_POINT_RELEASE=, GIT_BRANCH=main, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 28 03:49:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:49:12 localhost systemd[1]: var-lib-containers-storage-overlay-9dc99256477a60c2659e433e2fdd6d92a53e343843f6c595e3ad75eb7c063e49-merged.mount: Deactivated successfully. Nov 28 03:49:12 localhost systemd[1]: tmp-crun.KNdFYF.mount: Deactivated successfully. Nov 28 03:49:12 localhost podman[95066]: 2025-11-28 08:49:12.831625986 +0000 UTC m=+0.105682164 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, vcs-type=git, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, container_name=nova_compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.) 
Nov 28 03:49:12 localhost podman[95066]: 2025-11-28 08:49:12.861585401 +0000 UTC m=+0.135641809 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-type=git, version=17.1.12, io.openshift.expose-services=, managed_by=tripleo_ansible) Nov 28 03:49:12 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. 
Nov 28 03:49:12 localhost podman[95060]: 2025-11-28 08:49:12.900718364 +0000 UTC m=+0.194517969 container remove 7937f315b75cd4b852b2d3c86bae8d967bcc45ac6d1e9ee12db7e0cf1b89533a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_albattani, ceph=True, io.openshift.expose-services=, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, architecture=x86_64, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, release=553, GIT_BRANCH=main, version=7, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc.) Nov 28 03:49:12 localhost systemd[1]: libpod-conmon-7937f315b75cd4b852b2d3c86bae8d967bcc45ac6d1e9ee12db7e0cf1b89533a.scope: Deactivated successfully. Nov 28 03:49:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:49:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:49:17 localhost systemd[1]: tmp-crun.ah4RYx.mount: Deactivated successfully. Nov 28 03:49:17 localhost systemd[1]: tmp-crun.Z6KIKD.mount: Deactivated successfully. 
Nov 28 03:49:17 localhost podman[95118]: 2025-11-28 08:49:17.905979405 +0000 UTC m=+0.139730098 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, 
release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:49:17 localhost podman[95117]: 2025-11-28 08:49:17.868093911 +0000 UTC m=+0.101838543 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12) Nov 28 03:49:17 localhost podman[95118]: 2025-11-28 08:49:17.92760039 +0000 UTC m=+0.161351133 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 
'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public) Nov 28 03:49:17 localhost podman[95118]: unhealthy Nov 28 03:49:17 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:49:17 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. 
Nov 28 03:49:17 localhost podman[95117]: 2025-11-28 08:49:17.952919072 +0000 UTC m=+0.186663694 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, 
description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Nov 28 03:49:17 localhost podman[95117]: unhealthy Nov 28 03:49:17 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:49:17 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 03:49:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:49:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:49:33 localhost systemd[1]: tmp-crun.0gzMWv.mount: Deactivated successfully. 
Nov 28 03:49:33 localhost podman[95156]: 2025-11-28 08:49:33.864586208 +0000 UTC m=+0.092826722 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, container_name=iscsid, version=17.1.12, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid) Nov 28 03:49:33 localhost podman[95157]: 2025-11-28 08:49:33.916997465 +0000 UTC m=+0.140673447 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vcs-type=git, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, 
config_id=tripleo_step3, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd) Nov 28 03:49:33 localhost podman[95157]: 2025-11-28 08:49:33.928590528 +0000 UTC m=+0.152266550 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 
collectd, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, release=1761123044, 
config_id=tripleo_step3, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:49:33 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:49:33 localhost podman[95156]: 2025-11-28 08:49:33.984137953 +0000 UTC m=+0.212378467 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-iscsid, vcs-type=git, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, container_name=iscsid, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Nov 28 03:49:33 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:49:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:49:34 localhost podman[95195]: 2025-11-28 08:49:34.847589554 +0000 UTC m=+0.085297867 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4) Nov 28 03:49:35 localhost podman[95195]: 2025-11-28 08:49:35.04560145 +0000 UTC m=+0.283309783 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, release=1761123044, container_name=metrics_qdr, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red 
Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Nov 28 03:49:35 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:49:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:49:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:49:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:49:40 localhost systemd[1]: tmp-crun.mCqIwu.mount: Deactivated successfully. 
Nov 28 03:49:40 localhost podman[95226]: 2025-11-28 08:49:40.87652989 +0000 UTC m=+0.109847924 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat 
OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, distribution-scope=public, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:49:40 localhost podman[95226]: 2025-11-28 08:49:40.889392452 +0000 UTC m=+0.122710446 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step4, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20251118.1) Nov 28 03:49:40 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:49:40 localhost podman[95225]: 2025-11-28 08:49:40.966182572 +0000 UTC m=+0.202921883 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, architecture=x86_64, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 28 03:49:40 localhost podman[95227]: 2025-11-28 08:49:40.839621907 +0000 UTC m=+0.074671004 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, architecture=x86_64, url=https://www.redhat.com) Nov 28 03:49:41 localhost podman[95227]: 2025-11-28 08:49:41.025499815 +0000 UTC m=+0.260548932 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, config_id=tripleo_step4, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, 
container_name=ceilometer_agent_ipmi) Nov 28 03:49:41 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:49:41 localhost podman[95225]: 2025-11-28 08:49:41.075985722 +0000 UTC m=+0.312724983 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 28 03:49:41 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:49:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. 
Nov 28 03:49:41 localhost podman[95296]: 2025-11-28 08:49:41.19400158 +0000 UTC m=+0.079196765 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 03:49:41 localhost podman[95296]: 2025-11-28 08:49:41.557463587 +0000 UTC m=+0.442658762 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, version=17.1.12, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:49:41 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:49:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. 
Nov 28 03:49:43 localhost podman[95317]: 2025-11-28 08:49:43.844134859 +0000 UTC m=+0.085997808 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1761123044, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:49:43 localhost podman[95317]: 2025-11-28 08:49:43.903588997 +0000 UTC m=+0.145451896 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, summary=Red Hat OpenStack 
Platform 17.1 nova-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1761123044, build-date=2025-11-19T00:36:58Z, 
com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=nova_compute, config_id=tripleo_step5, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 03:49:43 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:49:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:49:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:49:48 localhost podman[95342]: 2025-11-28 08:49:48.847171009 +0000 UTC m=+0.081955501 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 28 03:49:48 localhost podman[95341]: 2025-11-28 08:49:48.894203439 +0000 UTC m=+0.133275636 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 
17.1_20251118.1, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 28 03:49:48 localhost podman[95342]: 2025-11-28 08:49:48.912408978 +0000 UTC m=+0.147193460 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team) Nov 28 03:49:48 localhost podman[95342]: unhealthy Nov 28 03:49:48 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:49:48 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. Nov 28 03:49:48 localhost podman[95341]: 2025-11-28 08:49:48.937407749 +0000 UTC m=+0.176480156 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, distribution-scope=public, tcib_managed=true, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:49:48 localhost podman[95341]: unhealthy Nov 28 03:49:48 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, 
code=exited, status=1/FAILURE Nov 28 03:49:48 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 03:50:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:50:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:50:04 localhost systemd[1]: tmp-crun.KSwXz8.mount: Deactivated successfully. Nov 28 03:50:04 localhost podman[95382]: 2025-11-28 08:50:04.85184813 +0000 UTC m=+0.090339183 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, version=17.1.12, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
architecture=x86_64, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Nov 28 03:50:04 localhost systemd[1]: tmp-crun.WEEUr0.mount: Deactivated successfully. 
Nov 28 03:50:04 localhost podman[95381]: 2025-11-28 08:50:04.894138602 +0000 UTC m=+0.134343559 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, name=rhosp17/openstack-iscsid, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid) Nov 28 03:50:04 localhost podman[95382]: 2025-11-28 08:50:04.914811138 +0000 UTC m=+0.153302231 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step3, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, managed_by=tripleo_ansible, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd) Nov 28 03:50:04 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. 
Nov 28 03:50:04 localhost podman[95381]: 2025-11-28 08:50:04.931424166 +0000 UTC m=+0.171629113 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Nov 28 03:50:04 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:50:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:50:06 localhost podman[95420]: 2025-11-28 08:50:06.842299197 +0000 UTC m=+0.077779172 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, managed_by=tripleo_ansible, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container) Nov 28 03:50:07 localhost podman[95420]: 2025-11-28 08:50:07.031425446 +0000 UTC m=+0.266905381 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Nov 28 03:50:07 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:50:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:50:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:50:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:50:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:50:11 localhost podman[95450]: 2025-11-28 08:50:11.856289908 +0000 UTC m=+0.085693648 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., container_name=nova_migration_target, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, version=17.1.12) Nov 28 03:50:11 localhost systemd[1]: tmp-crun.6HC5JW.mount: Deactivated successfully. 
Nov 28 03:50:11 localhost podman[95451]: 2025-11-28 08:50:11.913012872 +0000 UTC m=+0.139731228 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, 
version=17.1.12, distribution-scope=public, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, config_id=tripleo_step4) Nov 28 03:50:11 localhost podman[95449]: 2025-11-28 08:50:11.952276698 +0000 UTC m=+0.184426384 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1761123044, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, tcib_managed=true, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Nov 28 03:50:11 localhost podman[95451]: 2025-11-28 08:50:11.975627738 +0000 UTC m=+0.202346054 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 
'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, release=1761123044, name=rhosp17/openstack-cron, version=17.1.12, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:50:11 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:50:11 localhost podman[95449]: 2025-11-28 08:50:11.987692564 +0000 UTC m=+0.219842310 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4) Nov 28 03:50:12 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:50:12 localhost podman[95456]: 2025-11-28 08:50:12.06276084 +0000 UTC m=+0.285361048 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, release=1761123044, io.openshift.expose-services=, version=17.1.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com) Nov 28 03:50:12 localhost podman[95456]: 2025-11-28 08:50:12.11651236 +0000 UTC m=+0.339112568 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi) Nov 28 03:50:12 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:50:12 localhost podman[95450]: 2025-11-28 08:50:12.327544294 +0000 UTC m=+0.556948034 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, config_id=tripleo_step4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, container_name=nova_migration_target, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true) Nov 28 03:50:12 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:50:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. 
Nov 28 03:50:14 localhost podman[95587]: 2025-11-28 08:50:14.232489129 +0000 UTC m=+0.076724979 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git) Nov 28 03:50:14 localhost podman[95587]: 2025-11-28 08:50:14.287628711 +0000 UTC m=+0.131864511 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1761123044, 
summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, container_name=nova_compute, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.) Nov 28 03:50:14 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:50:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:50:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:50:19 localhost podman[95697]: 2025-11-28 08:50:19.845932101 +0000 UTC m=+0.079108932 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, managed_by=tripleo_ansible, batch=17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-ovn-controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team) Nov 28 03:50:19 localhost podman[95696]: 2025-11-28 08:50:19.893871 +0000 UTC m=+0.130636243 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 28 03:50:19 localhost podman[95696]: 2025-11-28 08:50:19.911350136 +0000 UTC m=+0.148115409 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, version=17.1.12, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z) Nov 28 03:50:19 localhost podman[95696]: unhealthy Nov 28 03:50:19 localhost podman[95697]: 2025-11-28 08:50:19.919529202 +0000 UTC m=+0.152705983 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, 
build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible) Nov 28 03:50:19 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:50:19 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. 
Nov 28 03:50:19 localhost podman[95697]: unhealthy Nov 28 03:50:19 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:50:19 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. Nov 28 03:50:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:50:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:50:35 localhost podman[95732]: 2025-11-28 08:50:35.847960711 +0000 UTC m=+0.084338207 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, 
container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Nov 28 03:50:35 localhost podman[95732]: 2025-11-28 08:50:35.858877321 +0000 UTC m=+0.095254827 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git, 
maintainer=OpenStack TripleO Team, release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 28 03:50:35 localhost podman[95733]: 2025-11-28 08:50:35.908208023 +0000 UTC m=+0.137766666 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c 
(image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, tcib_managed=true, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, batch=17.1_20251118.1, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, version=17.1.12) Nov 28 03:50:35 localhost podman[95733]: 2025-11-28 08:50:35.920343072 +0000 UTC m=+0.149901725 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.buildah.version=1.41.4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.expose-services=, 
vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:50:35 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:50:35 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:50:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:50:37 localhost podman[95772]: 2025-11-28 08:50:37.852554198 +0000 UTC m=+0.080673972 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=) Nov 28 03:50:38 localhost podman[95772]: 2025-11-28 08:50:38.040402437 +0000 UTC m=+0.268522221 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:50:38 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:50:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:50:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:50:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:50:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:50:42 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... 
Nov 28 03:50:42 localhost recover_tripleo_nova_virtqemud[95822]: 61397 Nov 28 03:50:42 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 28 03:50:42 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 28 03:50:42 localhost podman[95802]: 2025-11-28 08:50:42.853792532 +0000 UTC m=+0.092859742 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64) Nov 28 03:50:42 localhost systemd[1]: tmp-crun.dKmcoC.mount: Deactivated successfully. 
Nov 28 03:50:42 localhost podman[95804]: 2025-11-28 08:50:42.900984067 +0000 UTC m=+0.134399601 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public) Nov 28 03:50:42 localhost podman[95804]: 2025-11-28 08:50:42.909229214 +0000 UTC m=+0.142644758 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.12, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Nov 28 03:50:42 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:50:42 localhost systemd[1]: tmp-crun.fOUNwD.mount: Deactivated successfully. 
Nov 28 03:50:42 localhost podman[95802]: 2025-11-28 08:50:42.979436228 +0000 UTC m=+0.218503438 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 28 03:50:42 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:50:43 localhost podman[95803]: 2025-11-28 08:50:42.998715591 +0000 UTC m=+0.234819879 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, distribution-scope=public, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:50:43 localhost podman[95805]: 2025-11-28 08:50:42.972504682 +0000 UTC m=+0.201615641 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1) Nov 28 03:50:43 localhost podman[95805]: 2025-11-28 08:50:43.055318089 +0000 UTC m=+0.284429058 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:50:43 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:50:43 localhost podman[95803]: 2025-11-28 08:50:43.358683609 +0000 UTC m=+0.594787957 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, 
io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:50:43 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:50:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. 
Nov 28 03:50:44 localhost podman[95895]: 2025-11-28 08:50:44.846157969 +0000 UTC m=+0.081267711 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_id=tripleo_step5, managed_by=tripleo_ansible, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 03:50:44 localhost podman[95895]: 2025-11-28 08:50:44.875363551 +0000 UTC m=+0.110473293 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., 
config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, 
vcs-type=git, container_name=nova_compute, batch=17.1_20251118.1, distribution-scope=public, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 28 03:50:44 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:50:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:50:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:50:50 localhost podman[95923]: 2025-11-28 08:50:50.856540675 +0000 UTC m=+0.087089602 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, version=17.1.12, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc., container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true) Nov 28 03:50:50 localhost podman[95923]: 2025-11-28 08:50:50.897527556 +0000 UTC m=+0.128076483 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, name=rhosp17/openstack-ovn-controller, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible) Nov 28 03:50:50 localhost podman[95923]: unhealthy Nov 28 03:50:50 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:50:50 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. 
Nov 28 03:50:50 localhost podman[95922]: 2025-11-28 08:50:50.907560749 +0000 UTC m=+0.141501952 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, container_name=ovn_metadata_agent, distribution-scope=public, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:50:50 localhost podman[95922]: 2025-11-28 08:50:50.989595343 +0000 UTC m=+0.223536506 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.12, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044) Nov 28 03:50:50 localhost podman[95922]: unhealthy Nov 28 03:50:51 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:50:51 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 03:51:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:51:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:51:06 localhost podman[95961]: 2025-11-28 08:51:06.846304708 +0000 UTC m=+0.083578623 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, vcs-type=git, config_id=tripleo_step3, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, com.redhat.component=openstack-iscsid-container) Nov 28 03:51:06 localhost podman[95961]: 2025-11-28 08:51:06.859670326 +0000 UTC m=+0.096944261 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, 
maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, architecture=x86_64, io.openshift.expose-services=) Nov 28 03:51:06 localhost systemd[1]: 
08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:51:06 localhost podman[95962]: 2025-11-28 08:51:06.95390143 +0000 UTC m=+0.188225453 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, 
io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step3, batch=17.1_20251118.1, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, release=1761123044, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Nov 28 03:51:06 localhost podman[95962]: 2025-11-28 08:51:06.99006368 +0000 UTC m=+0.224387733 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:51:07 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:51:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:51:08 localhost systemd[1]: tmp-crun.91Q8MG.mount: Deactivated successfully. Nov 28 03:51:08 localhost podman[95998]: 2025-11-28 08:51:08.86670773 +0000 UTC m=+0.103375131 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, architecture=x86_64, 
konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_id=tripleo_step1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, managed_by=tripleo_ansible) Nov 28 03:51:09 localhost podman[95998]: 2025-11-28 08:51:09.0641673 +0000 UTC m=+0.300834741 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, distribution-scope=public, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd) Nov 28 03:51:09 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:51:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:51:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:51:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. 
Nov 28 03:51:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:51:13 localhost podman[96028]: 2025-11-28 08:51:13.854756172 +0000 UTC m=+0.083751838 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, distribution-scope=public, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.openshift.expose-services=, version=17.1.12) Nov 28 03:51:13 localhost podman[96025]: 2025-11-28 08:51:13.909496152 +0000 UTC m=+0.142001747 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team) Nov 28 03:51:13 localhost podman[96026]: 2025-11-28 08:51:13.962243981 +0000 UTC m=+0.192882098 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:51:14 localhost podman[96028]: 2025-11-28 08:51:14.012974856 +0000 UTC m=+0.241970472 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team) Nov 28 03:51:14 localhost podman[96027]: 2025-11-28 08:51:14.021861723 +0000 UTC m=+0.250449706 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, managed_by=tripleo_ansible, release=1761123044, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_id=tripleo_step4, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Nov 28 03:51:14 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 03:51:14 localhost podman[96027]: 2025-11-28 08:51:14.035312564 +0000 UTC m=+0.263900567 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=logrotate_crond, release=1761123044, 
com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git) Nov 28 03:51:14 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:51:14 localhost podman[96025]: 2025-11-28 08:51:14.088659701 +0000 UTC m=+0.321165326 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, config_id=tripleo_step4) Nov 28 03:51:14 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 03:51:14 localhost podman[96026]: 2025-11-28 08:51:14.327659318 +0000 UTC m=+0.558297455 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, container_name=nova_migration_target, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, distribution-scope=public, version=17.1.12, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=) Nov 28 03:51:14 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:51:14 localhost systemd[1]: tmp-crun.zkZEwR.mount: Deactivated successfully. Nov 28 03:51:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:51:15 localhost podman[96124]: 2025-11-28 08:51:15.84909784 +0000 UTC m=+0.087109814 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 28 03:51:15 localhost podman[96124]: 2025-11-28 08:51:15.882515494 +0000 UTC m=+0.120527488 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 
'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, container_name=nova_compute, release=1761123044, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:51:15 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:51:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:51:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:51:21 localhost podman[96229]: 2025-11-28 08:51:21.855011478 +0000 UTC m=+0.088477146 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, vcs-type=git, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, 
konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.expose-services=) Nov 28 03:51:21 localhost podman[96229]: 2025-11-28 08:51:21.896533875 +0000 UTC m=+0.129999533 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 28 03:51:21 localhost podman[96229]: unhealthy Nov 28 03:51:21 localhost podman[96228]: 2025-11-28 08:51:21.908807779 +0000 UTC m=+0.144645401 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, 
managed_by=tripleo_ansible, batch=17.1_20251118.1, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12) Nov 28 03:51:21 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:51:21 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. 
Nov 28 03:51:21 localhost podman[96228]: 2025-11-28 08:51:21.950521223 +0000 UTC m=+0.186358845 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, vcs-type=git, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, config_id=tripleo_step4) Nov 28 03:51:21 localhost podman[96228]: unhealthy Nov 28 03:51:21 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:51:21 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 03:51:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:51:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. 
Nov 28 03:51:37 localhost podman[96271]: 2025-11-28 08:51:37.837872607 +0000 UTC m=+0.070388190 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, url=https://www.redhat.com, name=rhosp17/openstack-collectd, vcs-type=git, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step3) Nov 28 03:51:37 localhost podman[96271]: 2025-11-28 08:51:37.874537782 +0000 UTC m=+0.107053365 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, config_id=tripleo_step3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc.) 
Nov 28 03:51:37 localhost podman[96270]: 2025-11-28 08:51:37.903094615 +0000 UTC m=+0.136043662 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, name=rhosp17/openstack-iscsid, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, version=17.1.12, io.openshift.expose-services=, container_name=iscsid, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, io.buildah.version=1.41.4) Nov 28 03:51:37 localhost podman[96270]: 2025-11-28 08:51:37.91733869 +0000 UTC m=+0.150287727 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, version=17.1.12, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, architecture=x86_64, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z) Nov 28 03:51:37 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:51:37 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:51:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:51:39 localhost podman[96309]: 2025-11-28 08:51:39.833503074 +0000 UTC m=+0.073218449 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=metrics_qdr, architecture=x86_64, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:51:40 localhost podman[96309]: 2025-11-28 08:51:40.058551386 +0000 UTC m=+0.298266701 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 
'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, version=17.1.12) Nov 28 03:51:40 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:51:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:51:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:51:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:51:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. 
Nov 28 03:51:44 localhost podman[96338]: 2025-11-28 08:51:44.856114787 +0000 UTC m=+0.093769151 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4) Nov 28 03:51:44 localhost podman[96339]: 2025-11-28 08:51:44.893273018 +0000 UTC m=+0.128720053 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, vcs-type=git, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:51:44 localhost podman[96338]: 2025-11-28 08:51:44.912478058 +0000 UTC m=+0.150132342 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, batch=17.1_20251118.1, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, version=17.1.12) Nov 28 03:51:44 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:51:44 localhost podman[96340]: 2025-11-28 08:51:44.997719702 +0000 UTC m=+0.231289348 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, tcib_managed=true, release=1761123044, io.buildah.version=1.41.4, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=) Nov 28 03:51:45 localhost podman[96340]: 2025-11-28 08:51:45.009319454 +0000 UTC m=+0.242889130 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-cron, version=17.1.12, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:51:45 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:51:45 localhost podman[96341]: 2025-11-28 08:51:45.048055225 +0000 UTC m=+0.278699600 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, release=1761123044, vcs-type=git, version=17.1.12, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4) Nov 28 03:51:45 localhost podman[96341]: 2025-11-28 08:51:45.101799054 +0000 UTC m=+0.332443449 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-type=git, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 28 03:51:45 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 03:51:45 localhost podman[96339]: 2025-11-28 08:51:45.296509108 +0000 UTC m=+0.531956163 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack 
Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., container_name=nova_migration_target, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, version=17.1.12, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:51:45 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:51:45 localhost systemd[1]: tmp-crun.AhAcAJ.mount: Deactivated successfully. Nov 28 03:51:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:51:46 localhost podman[96434]: 2025-11-28 08:51:46.841216676 +0000 UTC m=+0.079331561 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, 
batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, container_name=nova_compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, version=17.1.12, io.openshift.expose-services=, tcib_managed=true) Nov 28 03:51:46 localhost podman[96434]: 2025-11-28 08:51:46.875474496 +0000 UTC m=+0.113589381 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute) Nov 28 03:51:46 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:51:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:51:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:51:52 localhost systemd[1]: tmp-crun.NG1Ihm.mount: Deactivated successfully. Nov 28 03:51:52 localhost podman[96461]: 2025-11-28 08:51:52.83899344 +0000 UTC m=+0.076738179 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, url=https://www.redhat.com, vcs-type=git, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, tcib_managed=true, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Nov 28 03:51:52 localhost podman[96461]: 2025-11-28 08:51:52.849362683 +0000 UTC m=+0.087107412 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, release=1761123044, build-date=2025-11-19T00:14:25Z, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64) Nov 28 03:51:52 localhost podman[96461]: unhealthy Nov 28 03:51:52 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:51:52 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 03:51:52 localhost podman[96462]: 2025-11-28 08:51:52.947405277 +0000 UTC m=+0.179284973 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.expose-services=, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, vcs-type=git, version=17.1.12, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, release=1761123044, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Nov 28 03:51:52 localhost podman[96462]: 2025-11-28 08:51:52.991424053 +0000 UTC m=+0.223303709 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:51:52 localhost podman[96462]: unhealthy Nov 28 03:51:53 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:51:53 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. Nov 28 03:52:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:52:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:52:08 localhost systemd[1]: tmp-crun.CbeLqK.mount: Deactivated successfully. Nov 28 03:52:08 localhost systemd[1]: tmp-crun.gK4XKv.mount: Deactivated successfully. 
Nov 28 03:52:08 localhost podman[96502]: 2025-11-28 08:52:08.910276052 +0000 UTC m=+0.145705944 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, tcib_managed=true, config_id=tripleo_step3, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, distribution-scope=public, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, managed_by=tripleo_ansible) Nov 28 03:52:08 localhost podman[96503]: 2025-11-28 08:52:08.86767753 +0000 UTC m=+0.102113671 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, container_name=collectd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team) Nov 28 03:52:08 localhost podman[96502]: 2025-11-28 08:52:08.946526344 +0000 UTC m=+0.181956306 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, release=1761123044, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, io.k8s.description=Red Hat 
OpenStack Platform 17.1 iscsid) Nov 28 03:52:08 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:52:09 localhost podman[96503]: 2025-11-28 08:52:09.002110731 +0000 UTC m=+0.236546882 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, version=17.1.12, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, container_name=collectd, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public) Nov 28 03:52:09 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:52:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:52:10 localhost systemd[1]: tmp-crun.8S88RS.mount: Deactivated successfully. 
Nov 28 03:52:10 localhost podman[96543]: 2025-11-28 08:52:10.852443009 +0000 UTC m=+0.090526049 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, 
konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, release=1761123044, distribution-scope=public, config_id=tripleo_step1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, version=17.1.12) Nov 28 03:52:11 localhost podman[96543]: 2025-11-28 08:52:11.070435871 +0000 UTC m=+0.308518901 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.expose-services=, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, tcib_managed=true, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, release=1761123044, vcs-type=git) Nov 28 03:52:11 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:52:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:52:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:52:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:52:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:52:15 localhost systemd[1]: tmp-crun.gCdxnr.mount: Deactivated successfully. 
Nov 28 03:52:15 localhost podman[96574]: 2025-11-28 08:52:15.860665493 +0000 UTC m=+0.092440230 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, architecture=x86_64, release=1761123044, maintainer=OpenStack TripleO 
Team, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-nova-compute-container) Nov 28 03:52:15 localhost systemd[1]: tmp-crun.V1heXz.mount: Deactivated successfully. Nov 28 03:52:15 localhost podman[96573]: 2025-11-28 08:52:15.914738542 +0000 UTC m=+0.145242970 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, 
vcs-type=git, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step4, release=1761123044, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible) Nov 28 03:52:15 localhost podman[96575]: 2025-11-28 08:52:15.877122736 +0000 UTC m=+0.101137651 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 
03:52:15 localhost podman[96575]: 2025-11-28 08:52:15.960285515 +0000 UTC m=+0.184300420 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, version=17.1.12, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible) Nov 28 03:52:15 localhost podman[96573]: 2025-11-28 08:52:15.974424567 +0000 UTC m=+0.204928965 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, architecture=x86_64, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, 
distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:52:15 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:52:15 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 03:52:16 localhost podman[96583]: 2025-11-28 08:52:15.975377537 +0000 UTC m=+0.196373418 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible) Nov 28 03:52:16 localhost podman[96583]: 2025-11-28 08:52:16.054332654 +0000 UTC m=+0.275328495 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, version=17.1.12, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container) Nov 28 03:52:16 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 03:52:16 localhost podman[96574]: 2025-11-28 08:52:16.232515041 +0000 UTC m=+0.464289668 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-type=git, 
com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:52:16 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:52:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:52:17 localhost systemd[1]: tmp-crun.1qztnj.mount: Deactivated successfully. 
Nov 28 03:52:17 localhost podman[96681]: 2025-11-28 08:52:17.576808807 +0000 UTC m=+0.082685774 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1) Nov 28 03:52:17 localhost podman[96681]: 2025-11-28 08:52:17.609494229 +0000 UTC m=+0.115371216 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, description=Red Hat 
OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute) Nov 28 03:52:17 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:52:18 localhost systemd[1]: tmp-crun.vCGtOl.mount: Deactivated successfully. Nov 28 03:52:18 localhost podman[96793]: 2025-11-28 08:52:18.447268837 +0000 UTC m=+0.102979509 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, com.redhat.component=rhceph-container, GIT_BRANCH=main, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, GIT_CLEAN=True, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, distribution-scope=public, name=rhceph, io.buildah.version=1.33.12, 
CEPH_POINT_RELEASE=, RELEASE=main, release=553, ceph=True) Nov 28 03:52:18 localhost podman[96793]: 2025-11-28 08:52:18.545554308 +0000 UTC m=+0.201264990 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, RELEASE=main, com.redhat.component=rhceph-container, vcs-type=git, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, name=rhceph, release=553, ceph=True, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 28 03:52:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:52:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:52:23 localhost systemd[1]: tmp-crun.issTeR.mount: Deactivated successfully. 
Nov 28 03:52:23 localhost podman[96938]: 2025-11-28 08:52:23.894104555 +0000 UTC m=+0.131948644 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, version=17.1.12, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, config_id=tripleo_step4, vcs-type=git, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Nov 28 03:52:23 localhost podman[96939]: 2025-11-28 08:52:23.862404435 +0000 UTC m=+0.100265224 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, version=17.1.12, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com) Nov 28 03:52:23 localhost podman[96938]: 2025-11-28 08:52:23.937464231 +0000 UTC m=+0.175308310 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, architecture=x86_64, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, 
url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 28 03:52:23 localhost podman[96938]: unhealthy Nov 28 03:52:23 localhost podman[96939]: 2025-11-28 08:52:23.94639371 +0000 UTC m=+0.184254459 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 
ovn-controller, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team) Nov 28 03:52:23 localhost podman[96939]: unhealthy Nov 28 03:52:23 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:52:23 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 03:52:23 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:52:23 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. Nov 28 03:52:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:52:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:52:39 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 28 03:52:39 localhost recover_tripleo_nova_virtqemud[96991]: 61397 Nov 28 03:52:39 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 28 03:52:39 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Nov 28 03:52:39 localhost podman[96979]: 2025-11-28 08:52:39.844576443 +0000 UTC m=+0.083399588 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, architecture=x86_64, vcs-type=git, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., batch=17.1_20251118.1, maintainer=OpenStack TripleO Team) Nov 28 03:52:39 localhost systemd[1]: tmp-crun.6vg2WE.mount: Deactivated successfully. Nov 28 03:52:39 localhost podman[96978]: 2025-11-28 08:52:39.900138199 +0000 UTC m=+0.140239133 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-11-18T23:44:13Z, vcs-type=git, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team) Nov 28 03:52:39 localhost podman[96979]: 2025-11-28 08:52:39.908626834 +0000 UTC m=+0.147450009 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO 
Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-collectd-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, architecture=x86_64, config_id=tripleo_step3, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z) Nov 28 03:52:39 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:52:39 localhost podman[96978]: 2025-11-28 08:52:39.932122258 +0000 UTC m=+0.172223162 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 28 03:52:39 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:52:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:52:41 localhost systemd[1]: tmp-crun.6IM8ng.mount: Deactivated successfully. 
Nov 28 03:52:41 localhost podman[97019]: 2025-11-28 08:52:41.852500935 +0000 UTC m=+0.089783826 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, version=17.1.12, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:52:42 localhost podman[97019]: 2025-11-28 08:52:42.043463982 +0000 UTC m=+0.280746923 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, 
com.redhat.component=openstack-qdrouterd-container, release=1761123044, container_name=metrics_qdr, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc.) Nov 28 03:52:42 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:52:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:52:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:52:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:52:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. 
Nov 28 03:52:46 localhost podman[97051]: 2025-11-28 08:52:46.861206762 +0000 UTC m=+0.091756158 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, tcib_managed=true, io.buildah.version=1.41.4, release=1761123044, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Nov 28 03:52:46 localhost podman[97049]: 2025-11-28 08:52:46.907611662 +0000 UTC m=+0.143783964 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, url=https://www.redhat.com, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-type=git, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, release=1761123044) Nov 28 03:52:46 localhost systemd[1]: tmp-crun.KfXC3N.mount: Deactivated successfully. 
Nov 28 03:52:46 localhost podman[97049]: 2025-11-28 08:52:46.961506656 +0000 UTC m=+0.197678928 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, release=1761123044, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com) Nov 28 03:52:46 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:52:47 localhost podman[97050]: 2025-11-28 08:52:46.96289888 +0000 UTC m=+0.196015416 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:52:47 localhost podman[97052]: 2025-11-28 08:52:47.024073401 +0000 UTC m=+0.251380216 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, vcs-type=git, build-date=2025-11-19T00:12:45Z, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1761123044, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 28 03:52:47 localhost podman[97051]: 2025-11-28 08:52:47.03109383 +0000 UTC m=+0.261643236 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Nov 28 03:52:47 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:52:47 localhost podman[97052]: 2025-11-28 08:52:47.057488116 +0000 UTC m=+0.284794911 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12) Nov 28 03:52:47 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 03:52:47 localhost podman[97050]: 2025-11-28 08:52:47.337719732 +0000 UTC m=+0.570836258 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com) Nov 28 03:52:47 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:52:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:52:47 localhost podman[97143]: 2025-11-28 08:52:47.836760845 +0000 UTC m=+0.075647525 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vcs-type=git, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, vendor=Red Hat, Inc., 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, batch=17.1_20251118.1) Nov 28 03:52:47 localhost podman[97143]: 2025-11-28 08:52:47.86765105 +0000 UTC m=+0.106537760 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, name=rhosp17/openstack-nova-compute, release=1761123044, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, tcib_managed=true) Nov 28 03:52:47 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:52:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:52:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:52:54 localhost podman[97170]: 2025-11-28 08:52:54.849304868 +0000 UTC m=+0.084392898 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, architecture=x86_64) Nov 28 03:52:54 localhost podman[97170]: 2025-11-28 08:52:54.89769634 +0000 UTC m=+0.132784440 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-11-19T00:14:25Z) Nov 28 03:52:54 localhost podman[97170]: unhealthy Nov 28 03:52:54 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:52:54 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 03:52:54 localhost podman[97171]: 2025-11-28 08:52:54.898876537 +0000 UTC m=+0.131466319 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=ovn_controller, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 
'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:52:54 localhost podman[97171]: 2025-11-28 08:52:54.983399018 +0000 UTC m=+0.215988730 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, container_name=ovn_controller, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1) Nov 28 03:52:54 localhost podman[97171]: unhealthy Nov 28 03:52:54 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:52:54 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. Nov 28 03:53:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:53:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. 
Nov 28 03:53:10 localhost podman[97210]: 2025-11-28 08:53:10.845221145 +0000 UTC m=+0.081678153 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, managed_by=tripleo_ansible, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, name=rhosp17/openstack-iscsid, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.) Nov 28 03:53:10 localhost podman[97210]: 2025-11-28 08:53:10.858309985 +0000 UTC m=+0.094767023 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., container_name=iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, version=17.1.12, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:53:10 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 03:53:10 localhost podman[97211]: 2025-11-28 08:53:10.945248041 +0000 UTC m=+0.179609903 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public) Nov 28 03:53:10 localhost podman[97211]: 2025-11-28 08:53:10.957421511 +0000 UTC m=+0.191783373 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, architecture=x86_64, name=rhosp17/openstack-collectd) Nov 28 03:53:10 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:53:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:53:12 localhost systemd[1]: tmp-crun.HAU6EO.mount: Deactivated successfully. Nov 28 03:53:12 localhost podman[97251]: 2025-11-28 08:53:12.861644473 +0000 UTC m=+0.095126244 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, architecture=x86_64, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red 
Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, container_name=metrics_qdr, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com) Nov 28 03:53:13 localhost podman[97251]: 2025-11-28 08:53:13.05648147 +0000 UTC m=+0.289963261 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, tcib_managed=true, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.buildah.version=1.41.4) Nov 28 03:53:13 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:53:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:53:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:53:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:53:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. 
Nov 28 03:53:17 localhost podman[97280]: 2025-11-28 08:53:17.843057909 +0000 UTC m=+0.080854818 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute) Nov 28 03:53:17 localhost podman[97283]: 2025-11-28 08:53:17.893223176 +0000 UTC m=+0.130277921 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, release=1761123044, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, config_id=tripleo_step4, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12) Nov 28 03:53:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. 
Nov 28 03:53:17 localhost podman[97283]: 2025-11-28 08:53:17.915537414 +0000 UTC m=+0.152592159 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-type=git, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, tcib_managed=true, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 28 03:53:17 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:53:18 localhost podman[97352]: 2025-11-28 08:53:18.000971793 +0000 UTC m=+0.081393464 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, tcib_managed=true, io.openshift.expose-services=, 
io.buildah.version=1.41.4) Nov 28 03:53:18 localhost podman[97280]: 2025-11-28 08:53:18.021262688 +0000 UTC m=+0.259059587 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, version=17.1.12, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, architecture=x86_64, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 28 03:53:18 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:53:18 localhost podman[97352]: 2025-11-28 08:53:18.054974541 +0000 UTC m=+0.135396182 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, container_name=nova_compute, batch=17.1_20251118.1, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 28 03:53:18 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:53:18 localhost podman[97281]: 2025-11-28 08:53:18.103299531 +0000 UTC m=+0.339569202 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat 
OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, architecture=x86_64, release=1761123044, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com) Nov 28 03:53:18 localhost podman[97282]: 2025-11-28 08:53:18.155046137 +0000 UTC m=+0.390686358 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team) Nov 28 03:53:18 localhost podman[97282]: 2025-11-28 08:53:18.164660908 +0000 UTC m=+0.400301129 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-cron, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-cron, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step4, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Nov 28 03:53:18 localhost systemd[1]: 
bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:53:18 localhost podman[97281]: 2025-11-28 08:53:18.470965079 +0000 UTC m=+0.707234710 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, config_id=tripleo_step4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:53:18 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:53:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:53:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:53:25 localhost systemd[1]: tmp-crun.gdA5E1.mount: Deactivated successfully. 
Nov 28 03:53:25 localhost podman[97473]: 2025-11-28 08:53:25.850594213 +0000 UTC m=+0.086980790 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, batch=17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044) Nov 28 03:53:25 localhost systemd[1]: tmp-crun.pcQOD5.mount: Deactivated successfully. Nov 28 03:53:25 localhost podman[97472]: 2025-11-28 08:53:25.883198161 +0000 UTC m=+0.118926996 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 
'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Nov 28 03:53:25 localhost podman[97472]: 2025-11-28 08:53:25.900463881 +0000 UTC m=+0.136192726 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, release=1761123044, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git) Nov 28 03:53:25 localhost podman[97472]: unhealthy Nov 28 03:53:25 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:53:25 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 03:53:25 localhost podman[97473]: 2025-11-28 08:53:25.916942836 +0000 UTC m=+0.153329393 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, tcib_managed=true, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller) Nov 28 03:53:25 localhost podman[97473]: unhealthy Nov 28 03:53:25 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:53:25 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. Nov 28 03:53:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:53:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:53:41 localhost systemd[1]: tmp-crun.Ha93bu.mount: Deactivated successfully. 
Nov 28 03:53:41 localhost podman[97511]: 2025-11-28 08:53:41.858087763 +0000 UTC m=+0.091330965 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=iscsid, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-iscsid-container) Nov 28 03:53:41 localhost podman[97512]: 2025-11-28 08:53:41.890217896 +0000 UTC m=+0.122616712 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, container_name=collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git) Nov 28 03:53:41 localhost podman[97511]: 2025-11-28 08:53:41.89387024 +0000 UTC m=+0.127113482 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, 
container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step3, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, version=17.1.12, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 
iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1) Nov 28 03:53:41 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:53:41 localhost podman[97512]: 2025-11-28 08:53:41.925094727 +0000 UTC m=+0.157493603 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=collectd, name=rhosp17/openstack-collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=) Nov 28 03:53:41 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:53:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:53:43 localhost systemd[1]: Starting dnf makecache... 
Nov 28 03:53:43 localhost podman[97548]: 2025-11-28 08:53:43.840226819 +0000 UTC m=+0.080303660 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, name=rhosp17/openstack-qdrouterd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, config_id=tripleo_step1, container_name=metrics_qdr, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container) Nov 28 03:53:44 localhost dnf[97549]: Updating Subscription Management repositories. Nov 28 03:53:44 localhost podman[97548]: 2025-11-28 08:53:44.042635273 +0000 UTC m=+0.282712144 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, architecture=x86_64, release=1761123044, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 28 03:53:44 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:53:45 localhost dnf[97549]: Metadata cache refreshed recently. Nov 28 03:53:45 localhost systemd[1]: dnf-makecache.service: Deactivated successfully. Nov 28 03:53:45 localhost systemd[1]: Finished dnf makecache. Nov 28 03:53:45 localhost systemd[1]: dnf-makecache.service: Consumed 1.910s CPU time. Nov 28 03:53:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. 
Nov 28 03:53:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:53:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:53:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:53:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:53:48 localhost systemd[1]: tmp-crun.yXdwRY.mount: Deactivated successfully. Nov 28 03:53:48 localhost podman[97578]: 2025-11-28 08:53:48.87400195 +0000 UTC m=+0.111686281 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.) 
Nov 28 03:53:48 localhost podman[97578]: 2025-11-28 08:53:48.908300391 +0000 UTC m=+0.145984702 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., version=17.1.12, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, container_name=ceilometer_agent_compute) Nov 28 03:53:48 localhost systemd[1]: tmp-crun.HeJXjq.mount: Deactivated successfully. Nov 28 03:53:48 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:53:48 localhost podman[97579]: 2025-11-28 08:53:48.914290628 +0000 UTC m=+0.151104881 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., 
version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Nov 28 03:53:48 localhost podman[97581]: 2025-11-28 08:53:48.968605066 +0000 UTC m=+0.198815223 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 03:53:49 localhost podman[97580]: 2025-11-28 08:53:49.022709256 +0000 UTC m=+0.255730861 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-cron, container_name=logrotate_crond, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:53:49 localhost podman[97581]: 2025-11-28 08:53:49.027559748 +0000 UTC m=+0.257769885 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, description=Red 
Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public) Nov 28 03:53:49 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:53:49 localhost podman[97580]: 2025-11-28 08:53:49.080603506 +0000 UTC m=+0.313625131 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, batch=17.1_20251118.1, name=rhosp17/openstack-cron, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:53:49 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:53:49 localhost podman[97587]: 2025-11-28 08:53:49.161284467 +0000 UTC m=+0.387529901 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.12, io.buildah.version=1.41.4, config_id=tripleo_step4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:53:49 localhost podman[97587]: 2025-11-28 08:53:49.219557677 +0000 UTC m=+0.445803171 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 
'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 28 03:53:49 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 03:53:49 localhost podman[97579]: 2025-11-28 08:53:49.314171584 +0000 UTC m=+0.550985917 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, release=1761123044, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, build-date=2025-11-19T00:36:58Z, version=17.1.12, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:53:49 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:53:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:53:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 03:53:56 localhost systemd[1]: tmp-crun.QuTMbu.mount: Deactivated successfully. 
Nov 28 03:53:56 localhost podman[97694]: 2025-11-28 08:53:56.855974847 +0000 UTC m=+0.092175088 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, release=1761123044, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, batch=17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:53:56 localhost podman[97694]: 2025-11-28 08:53:56.896646971 +0000 UTC m=+0.132847202 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, 
build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 28 03:53:56 localhost podman[97694]: unhealthy Nov 28 03:53:56 localhost podman[97695]: 2025-11-28 08:53:56.914084857 +0000 UTC m=+0.143302088 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, version=17.1.12, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, architecture=x86_64, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, release=1761123044, description=Red Hat 
OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller) Nov 28 03:53:56 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:53:56 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 03:53:56 localhost podman[97695]: 2025-11-28 08:53:56.932499094 +0000 UTC m=+0.161716385 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, version=17.1.12) Nov 28 03:53:56 localhost podman[97695]: unhealthy Nov 28 03:53:56 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:53:56 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. Nov 28 03:54:05 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 28 03:54:05 localhost recover_tripleo_nova_virtqemud[97734]: 61397 Nov 28 03:54:05 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 28 03:54:05 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 28 03:54:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:54:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:54:12 localhost systemd[1]: tmp-crun.GQjtmg.mount: Deactivated successfully. 
Nov 28 03:54:12 localhost podman[97735]: 2025-11-28 08:54:12.861563326 +0000 UTC m=+0.097356390 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., container_name=iscsid, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, batch=17.1_20251118.1, tcib_managed=true, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.buildah.version=1.41.4) Nov 28 03:54:12 localhost podman[97735]: 2025-11-28 08:54:12.898986988 +0000 UTC m=+0.134780042 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, io.buildah.version=1.41.4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, container_name=iscsid, com.redhat.component=openstack-iscsid-container, architecture=x86_64, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:54:12 localhost systemd[1]: tmp-crun.vPAWTJ.mount: Deactivated successfully. Nov 28 03:54:12 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 03:54:12 localhost podman[97736]: 2025-11-28 08:54:12.917214689 +0000 UTC m=+0.152761375 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, name=rhosp17/openstack-collectd, version=17.1.12, config_id=tripleo_step3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team) Nov 28 03:54:12 localhost podman[97736]: 2025-11-28 08:54:12.952511694 +0000 UTC m=+0.188058370 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, io.buildah.version=1.41.4, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step3, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:54:12 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:54:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:54:14 localhost podman[97771]: 2025-11-28 08:54:14.847268719 +0000 UTC m=+0.083371352 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, vcs-type=git) Nov 28 03:54:15 localhost podman[97771]: 2025-11-28 08:54:15.037559338 +0000 UTC m=+0.273661981 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., container_name=metrics_qdr, vcs-type=git, release=1761123044, distribution-scope=public, batch=17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, architecture=x86_64) Nov 28 03:54:15 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:54:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:54:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:54:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:54:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. 
Nov 28 03:54:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:54:19 localhost podman[97801]: 2025-11-28 08:54:19.863419183 +0000 UTC m=+0.093699194 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, architecture=x86_64, version=17.1.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044) Nov 28 03:54:19 localhost systemd[1]: tmp-crun.7lx01n.mount: Deactivated successfully. Nov 28 03:54:19 localhost podman[97800]: 2025-11-28 08:54:19.922662989 +0000 UTC m=+0.156674468 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-19T00:11:48Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute) Nov 28 03:54:19 localhost podman[97800]: 2025-11-28 08:54:19.950978555 +0000 UTC m=+0.184990044 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container) Nov 28 03:54:19 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:54:19 localhost podman[97814]: 2025-11-28 08:54:19.969139694 +0000 UTC m=+0.187173172 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi) Nov 28 03:54:20 localhost podman[97808]: 2025-11-28 08:54:20.019297145 +0000 UTC m=+0.241098951 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, distribution-scope=public, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, url=https://www.redhat.com) Nov 28 03:54:20 localhost podman[97802]: 2025-11-28 08:54:20.066293377 +0000 UTC m=+0.293470571 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, io.openshift.expose-services=, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Nov 28 03:54:20 localhost podman[97814]: 2025-11-28 08:54:20.071414747 +0000 UTC m=+0.289448235 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, distribution-scope=public, tcib_managed=true, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:54:20 localhost podman[97808]: 2025-11-28 08:54:20.071852701 +0000 UTC m=+0.293654567 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20251118.1, distribution-scope=public, release=1761123044, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, url=https://www.redhat.com, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z) Nov 28 03:54:20 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. 
Nov 28 03:54:20 localhost podman[97802]: 2025-11-28 08:54:20.12226836 +0000 UTC m=+0.349445554 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.12, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, io.openshift.expose-services=, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, release=1761123044, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron) Nov 28 03:54:20 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:54:20 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:54:20 localhost podman[97801]: 2025-11-28 08:54:20.227541906 +0000 UTC m=+0.457821967 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20251118.1, 
konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Nov 28 03:54:20 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:54:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:54:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:54:27 localhost podman[97992]: 2025-11-28 08:54:27.855732399 +0000 UTC m=+0.089487423 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true, version=17.1.12, distribution-scope=public, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent) Nov 28 03:54:27 localhost podman[97993]: 2025-11-28 08:54:27.904528968 +0000 UTC m=+0.138207310 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, build-date=2025-11-18T23:34:05Z, release=1761123044, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc.) 
Nov 28 03:54:27 localhost podman[97993]: 2025-11-28 08:54:27.920332992 +0000 UTC m=+0.154011294 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, version=17.1.12, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, config_id=tripleo_step4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., 
com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1761123044, vcs-type=git) Nov 28 03:54:27 localhost podman[97993]: unhealthy Nov 28 03:54:27 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:54:27 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. Nov 28 03:54:27 localhost podman[97992]: 2025-11-28 08:54:27.932225464 +0000 UTC m=+0.165980528 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Nov 28 03:54:27 localhost podman[97992]: unhealthy Nov 28 03:54:27 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:54:27 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. 
Nov 28 03:54:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:54:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:54:43 localhost podman[98032]: 2025-11-28 08:54:43.854785572 +0000 UTC m=+0.090732322 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_id=tripleo_step3, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd) Nov 28 03:54:43 localhost podman[98032]: 2025-11-28 08:54:43.89237954 +0000 UTC m=+0.128326230 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, batch=17.1_20251118.1, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vcs-type=git, config_id=tripleo_step3, name=rhosp17/openstack-collectd, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1761123044) Nov 28 03:54:43 localhost systemd[1]: 
2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:54:43 localhost podman[98031]: 2025-11-28 08:54:43.900267437 +0000 UTC m=+0.139023325 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., container_name=iscsid, name=rhosp17/openstack-iscsid, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 28 03:54:43 localhost podman[98031]: 2025-11-28 08:54:43.983439351 +0000 UTC m=+0.222195209 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, distribution-scope=public, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid) Nov 28 03:54:43 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:54:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:54:45 localhost systemd[1]: tmp-crun.fozlxL.mount: Deactivated successfully. 
Nov 28 03:54:45 localhost podman[98070]: 2025-11-28 08:54:45.85153231 +0000 UTC m=+0.090082322 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack 
Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step1, batch=17.1_20251118.1, container_name=metrics_qdr, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, version=17.1.12) Nov 28 03:54:46 localhost podman[98070]: 2025-11-28 08:54:46.053324499 +0000 UTC m=+0.291874471 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, architecture=x86_64, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, release=1761123044, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1) Nov 28 03:54:46 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:54:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:54:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:54:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:54:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. 
Nov 28 03:54:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:54:50 localhost podman[98100]: 2025-11-28 08:54:50.862606606 +0000 UTC m=+0.100832268 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.expose-services=) Nov 28 03:54:50 localhost podman[98101]: 2025-11-28 08:54:50.918372013 +0000 UTC m=+0.152954962 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, batch=17.1_20251118.1, config_id=tripleo_step4, container_name=nova_migration_target, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.buildah.version=1.41.4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:54:50 localhost podman[98100]: 2025-11-28 08:54:50.94766737 +0000 UTC m=+0.185893012 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, 
vendor=Red Hat, Inc., vcs-type=git, build-date=2025-11-19T00:11:48Z, release=1761123044, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:54:51 localhost podman[98103]: 2025-11-28 08:54:51.024687892 +0000 UTC m=+0.252279961 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step5, io.buildah.version=1.41.4, batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-nova-compute) Nov 28 03:54:51 localhost podman[98102]: 2025-11-28 08:54:50.991672648 +0000 UTC m=+0.223527691 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:54:51 localhost podman[98102]: 2025-11-28 08:54:51.073286544 +0000 UTC m=+0.305141597 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, name=rhosp17/openstack-cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
maintainer=OpenStack TripleO Team, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:54:51 localhost podman[98103]: 2025-11-28 08:54:51.083560026 +0000 UTC m=+0.311152085 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, container_name=nova_compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
nova-compute, release=1761123044) Nov 28 03:54:51 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:54:51 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:54:51 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:54:51 localhost podman[98109]: 2025-11-28 08:54:51.122193155 +0000 UTC m=+0.346800511 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container) Nov 28 03:54:51 localhost podman[98109]: 2025-11-28 08:54:51.152506914 +0000 UTC m=+0.377114330 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-type=git, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:54:51 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 03:54:51 localhost podman[98101]: 2025-11-28 08:54:51.281474473 +0000 UTC m=+0.516057482 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, release=1761123044, vcs-type=git) Nov 28 03:54:51 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:54:51 localhost systemd[1]: tmp-crun.BQL1OG.mount: Deactivated successfully. Nov 28 03:54:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:54:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:54:58 localhost podman[98220]: 2025-11-28 08:54:58.839728036 +0000 UTC m=+0.081247755 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, build-date=2025-11-19T00:14:25Z, tcib_managed=true, 
vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, architecture=x86_64, container_name=ovn_metadata_agent, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044) Nov 28 03:54:58 localhost podman[98220]: 2025-11-28 08:54:58.881419762 +0000 UTC m=+0.122939481 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, 
distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, 
tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 28 03:54:58 localhost podman[98220]: unhealthy Nov 28 03:54:58 localhost systemd[1]: tmp-crun.tskW4s.mount: Deactivated successfully. Nov 28 03:54:58 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:54:58 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 03:54:58 localhost podman[98221]: 2025-11-28 08:54:58.903080421 +0000 UTC m=+0.140930084 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1) Nov 28 03:54:58 localhost podman[98221]: 2025-11-28 08:54:58.946484069 +0000 UTC m=+0.184333762 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, config_id=tripleo_step4, io.buildah.version=1.41.4, release=1761123044) Nov 28 03:54:58 localhost podman[98221]: unhealthy Nov 28 03:54:58 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:54:58 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. Nov 28 03:55:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:55:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:55:14 localhost systemd[1]: tmp-crun.rN6qi5.mount: Deactivated successfully. 
Nov 28 03:55:14 localhost podman[98258]: 2025-11-28 08:55:14.859711553 +0000 UTC m=+0.088704479 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
distribution-scope=public, architecture=x86_64, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:55:14 localhost podman[98258]: 2025-11-28 08:55:14.893560973 +0000 UTC m=+0.122553959 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, vcs-type=git, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, distribution-scope=public, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, tcib_managed=true, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z) Nov 28 03:55:14 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 03:55:14 localhost podman[98259]: 2025-11-28 08:55:14.908934425 +0000 UTC m=+0.135754232 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, version=17.1.12, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044) Nov 28 03:55:14 localhost podman[98259]: 2025-11-28 08:55:14.988758985 +0000 UTC m=+0.215578772 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, vcs-type=git, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container) Nov 28 03:55:15 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:55:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:55:16 localhost systemd[1]: tmp-crun.StDRY6.mount: Deactivated successfully. Nov 28 03:55:16 localhost podman[98297]: 2025-11-28 08:55:16.849721761 +0000 UTC m=+0.082776172 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., container_name=metrics_qdr, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044) Nov 28 03:55:17 localhost podman[98297]: 2025-11-28 08:55:17.040545177 +0000 UTC m=+0.273599578 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Nov 28 03:55:17 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:55:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:55:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:55:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. 
Nov 28 03:55:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:55:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:55:21 localhost systemd[1]: tmp-crun.JitIoh.mount: Deactivated successfully. Nov 28 03:55:21 localhost systemd[1]: tmp-crun.YRB9sN.mount: Deactivated successfully. Nov 28 03:55:21 localhost podman[98328]: 2025-11-28 08:55:21.885071608 +0000 UTC m=+0.111565145 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, version=17.1.12, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron) Nov 28 03:55:21 localhost podman[98328]: 2025-11-28 08:55:21.913510289 +0000 UTC m=+0.140003796 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, io.buildah.version=1.41.4, vcs-type=git, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:55:21 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:55:21 localhost podman[98334]: 2025-11-28 08:55:21.963729161 +0000 UTC m=+0.183291061 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 28 03:55:21 localhost podman[98327]: 2025-11-28 08:55:21.918634329 +0000 UTC m=+0.147085967 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, container_name=nova_migration_target, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-type=git) Nov 28 03:55:21 localhost podman[98334]: 2025-11-28 08:55:21.992357558 +0000 UTC m=+0.211919538 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true) Nov 28 03:55:22 localhost podman[98333]: 2025-11-28 08:55:21.850773853 +0000 UTC m=+0.078188558 container health_status 
c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1761123044, vendor=Red Hat, Inc., container_name=nova_compute, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true) Nov 28 03:55:22 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 03:55:22 localhost podman[98333]: 2025-11-28 08:55:22.034287091 +0000 UTC m=+0.261701836 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 03:55:22 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. 
Nov 28 03:55:22 localhost podman[98326]: 2025-11-28 08:55:21.895616079 +0000 UTC m=+0.131145428 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, batch=17.1_20251118.1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4) Nov 28 03:55:22 localhost podman[98326]: 2025-11-28 08:55:22.078201646 +0000 UTC m=+0.313730985 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, config_id=tripleo_step4, vcs-type=git, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 28 03:55:22 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 03:55:22 localhost podman[98327]: 2025-11-28 08:55:22.27537204 +0000 UTC m=+0.503823738 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, release=1761123044, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:55:22 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:55:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:55:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:55:29 localhost podman[98520]: 2025-11-28 08:55:29.85975239 +0000 UTC m=+0.091063532 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, version=17.1.12, architecture=x86_64, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, 
io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 28 03:55:29 localhost systemd[1]: tmp-crun.VCpchZ.mount: Deactivated successfully. 
Nov 28 03:55:29 localhost podman[98521]: 2025-11-28 08:55:29.914294348 +0000 UTC m=+0.142364989 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, io.buildah.version=1.41.4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, 
description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 28 03:55:29 localhost podman[98520]: 2025-11-28 08:55:29.927343517 +0000 UTC m=+0.158654649 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, config_id=tripleo_step4, architecture=x86_64, version=17.1.12, release=1761123044) Nov 28 03:55:29 localhost podman[98520]: unhealthy Nov 28 03:55:29 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:55:29 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. 
Nov 28 03:55:29 localhost podman[98521]: 2025-11-28 08:55:29.957495351 +0000 UTC m=+0.185565992 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, version=17.1.12, tcib_managed=true, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container)
Nov 28 03:55:29 localhost podman[98521]: unhealthy
Nov 28 03:55:29 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 03:55:29 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'.
Nov 28 03:55:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.
Nov 28 03:55:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.
Nov 28 03:55:45 localhost podman[98560]: 2025-11-28 08:55:45.863983682 +0000 UTC m=+0.090688321 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, vcs-type=git, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true)
Nov 28 03:55:45 localhost podman[98559]: 2025-11-28 08:55:45.910107257 +0000 UTC m=+0.140932484 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, vcs-type=git, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com)
Nov 28 03:55:45 localhost podman[98559]: 2025-11-28 08:55:45.919417428 +0000 UTC m=+0.150242645 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Nov 28 03:55:45 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully.
Nov 28 03:55:45 localhost podman[98560]: 2025-11-28 08:55:45.977505197 +0000 UTC m=+0.204209826 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, distribution-scope=public, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.12, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a)
Nov 28 03:55:45 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully.
Nov 28 03:55:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.
Nov 28 03:55:47 localhost podman[98598]: 2025-11-28 08:55:47.844589895 +0000 UTC m=+0.078614443 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Nov 28 03:55:48 localhost podman[98598]: 2025-11-28 08:55:48.033406738 +0000 UTC m=+0.267431306 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1)
Nov 28 03:55:48 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully.
Nov 28 03:55:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.
Nov 28 03:55:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.
Nov 28 03:55:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.
Nov 28 03:55:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.
Nov 28 03:55:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.
Nov 28 03:55:52 localhost podman[98628]: 2025-11-28 08:55:52.836612304 +0000 UTC m=+0.077536529 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, version=17.1.12, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676)
Nov 28 03:55:52 localhost podman[98629]: 2025-11-28 08:55:52.852522742 +0000 UTC m=+0.088021757 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, batch=17.1_20251118.1, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d)
Nov 28 03:55:52 localhost podman[98637]: 2025-11-28 08:55:52.888076496 +0000 UTC m=+0.117367117 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Nov 28 03:55:52 localhost podman[98628]: 2025-11-28 08:55:52.896366006 +0000 UTC m=+0.137290251 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible)
Nov 28 03:55:52 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully.
Nov 28 03:55:52 localhost podman[98636]: 2025-11-28 08:55:52.936866023 +0000 UTC m=+0.169117436 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, release=1761123044, distribution-scope=public, config_id=tripleo_step5, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Nov 28 03:55:52 localhost podman[98637]: 2025-11-28 08:55:52.941288972 +0000 UTC m=+0.170579553 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, batch=17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, version=17.1.12, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi)
Nov 28 03:55:52 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully.
Nov 28 03:55:52 localhost podman[98636]: 2025-11-28 08:55:52.986530259 +0000 UTC m=+0.218781702 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro',
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, version=17.1.12) Nov 28 03:55:52 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. 
Nov 28 03:55:53 localhost podman[98630]: 2025-11-28 08:55:53.056545262 +0000 UTC m=+0.288979391 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, container_name=logrotate_crond) Nov 28 03:55:53 localhost podman[98630]: 2025-11-28 08:55:53.068523576 +0000 UTC m=+0.300957745 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, distribution-scope=public, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, version=17.1.12, url=https://www.redhat.com, batch=17.1_20251118.1) Nov 28 03:55:53 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:55:53 localhost podman[98629]: 2025-11-28 08:55:53.19028507 +0000 UTC m=+0.425784115 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12) Nov 28 03:55:53 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:55:53 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 28 03:55:53 localhost recover_tripleo_nova_virtqemud[98752]: 61397 Nov 28 03:55:53 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 28 03:55:53 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 28 03:56:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:56:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:56:00 localhost podman[98754]: 2025-11-28 08:56:00.845894159 +0000 UTC m=+0.080369958 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-type=git, container_name=ovn_controller, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, 
description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 28 03:56:00 localhost podman[98754]: 2025-11-28 08:56:00.860683432 +0000 UTC m=+0.095159241 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, io.buildah.version=1.41.4, distribution-scope=public, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-ovn-controller, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 28 03:56:00 localhost podman[98753]: 2025-11-28 08:56:00.89987293 +0000 UTC m=+0.134919807 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 28 03:56:00 localhost podman[98754]: unhealthy Nov 28 03:56:00 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:56:00 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. 
Nov 28 03:56:00 localhost podman[98753]: 2025-11-28 08:56:00.966607369 +0000 UTC m=+0.201654246 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, tcib_managed=true, container_name=ovn_metadata_agent, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z) Nov 28 03:56:00 localhost podman[98753]: unhealthy Nov 28 03:56:00 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:56:00 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 03:56:14 localhost systemd[1]: session-28.scope: Deactivated successfully. Nov 28 03:56:14 localhost systemd[1]: session-28.scope: Consumed 7min 6.514s CPU time. Nov 28 03:56:14 localhost systemd-logind[764]: Session 28 logged out. Waiting for processes to exit. Nov 28 03:56:14 localhost systemd-logind[764]: Removed session 28. Nov 28 03:56:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. 
Nov 28 03:56:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:56:16 localhost systemd[1]: tmp-crun.NIguys.mount: Deactivated successfully. Nov 28 03:56:16 localhost podman[98796]: 2025-11-28 08:56:16.903245988 +0000 UTC m=+0.134323738 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, tcib_managed=true, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=) Nov 28 03:56:16 localhost podman[98797]: 2025-11-28 08:56:16.868784099 +0000 UTC m=+0.098351261 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, release=1761123044, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, container_name=collectd) Nov 28 03:56:16 localhost podman[98796]: 2025-11-28 08:56:16.941433074 +0000 UTC m=+0.172510814 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, 
distribution-scope=public, architecture=x86_64, tcib_managed=true, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, container_name=iscsid, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Nov 28 03:56:16 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:56:16 localhost podman[98797]: 2025-11-28 08:56:16.954361379 +0000 UTC m=+0.183928481 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, version=17.1.12, vcs-type=git, name=rhosp17/openstack-collectd, tcib_managed=true, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, container_name=collectd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:56:16 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:56:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:56:18 localhost systemd[1]: tmp-crun.gTGWVQ.mount: Deactivated successfully. 
Nov 28 03:56:18 localhost podman[98835]: 2025-11-28 08:56:18.851885631 +0000 UTC m=+0.089612597 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.buildah.version=1.41.4, container_name=metrics_qdr, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat 
OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, architecture=x86_64, vcs-type=git) Nov 28 03:56:19 localhost podman[98835]: 2025-11-28 08:56:19.069462305 +0000 UTC m=+0.307189261 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, vcs-type=git, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, 
build-date=2025-11-18T22:49:46Z, distribution-scope=public, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Nov 28 03:56:19 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:56:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:56:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:56:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:56:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. 
Nov 28 03:56:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:56:23 localhost podman[98865]: 2025-11-28 08:56:23.849090705 +0000 UTC m=+0.084217598 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.component=openstack-cron-container) Nov 28 03:56:23 localhost systemd[1]: tmp-crun.BiH09W.mount: Deactivated successfully. Nov 28 03:56:23 localhost podman[98863]: 2025-11-28 08:56:23.91760077 +0000 UTC m=+0.155683576 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., release=1761123044, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-type=git, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 28 03:56:23 localhost podman[98863]: 2025-11-28 08:56:23.945335049 +0000 UTC m=+0.183417825 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, release=1761123044, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, distribution-scope=public) Nov 28 03:56:23 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:56:23 localhost podman[98872]: 2025-11-28 08:56:23.961929609 +0000 UTC m=+0.184941063 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team) Nov 28 03:56:24 localhost podman[98864]: 2025-11-28 08:56:24.012992957 +0000 UTC m=+0.248440790 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, version=17.1.12, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:56:24 localhost podman[98865]: 2025-11-28 08:56:24.06894952 +0000 UTC m=+0.304076393 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, release=1761123044, tcib_managed=true, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, config_id=tripleo_step4, name=rhosp17/openstack-cron, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, 
konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:56:24 localhost podman[98866]: 2025-11-28 08:56:23.884309898 +0000 UTC m=+0.111322547 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, architecture=x86_64, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, io.openshift.expose-services=, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:56:24 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:56:24 localhost podman[98872]: 2025-11-28 08:56:24.119348358 +0000 UTC m=+0.342359812 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, distribution-scope=public, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 28 03:56:24 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:56:24 localhost podman[98866]: 2025-11-28 08:56:24.205637391 +0000 UTC m=+0.432650110 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, container_name=nova_compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, version=17.1.12) Nov 28 03:56:24 
localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:56:24 localhost podman[98864]: 2025-11-28 08:56:24.406360657 +0000 UTC m=+0.641808470 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, url=https://www.redhat.com, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target) Nov 28 03:56:24 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:56:24 localhost systemd[1]: Stopping User Manager for UID 1003... Nov 28 03:56:24 localhost systemd[35694]: Activating special unit Exit the Session... Nov 28 03:56:24 localhost systemd[35694]: Removed slice User Background Tasks Slice. Nov 28 03:56:24 localhost systemd[35694]: Stopped target Main User Target. Nov 28 03:56:24 localhost systemd[35694]: Stopped target Basic System. Nov 28 03:56:24 localhost systemd[35694]: Stopped target Paths. Nov 28 03:56:24 localhost systemd[35694]: Stopped target Sockets. Nov 28 03:56:24 localhost systemd[35694]: Stopped target Timers. Nov 28 03:56:24 localhost systemd[35694]: Stopped Mark boot as successful after the user session has run 2 minutes. Nov 28 03:56:24 localhost systemd[35694]: Stopped Daily Cleanup of User's Temporary Directories. Nov 28 03:56:24 localhost systemd[35694]: Closed D-Bus User Message Bus Socket. Nov 28 03:56:24 localhost systemd[35694]: Stopped Create User's Volatile Files and Directories. Nov 28 03:56:24 localhost systemd[35694]: Removed slice User Application Slice. Nov 28 03:56:24 localhost systemd[35694]: Reached target Shutdown. 
Nov 28 03:56:24 localhost systemd[35694]: Finished Exit the Session. Nov 28 03:56:24 localhost systemd[35694]: Reached target Exit the Session. Nov 28 03:56:24 localhost systemd[1]: user@1003.service: Deactivated successfully. Nov 28 03:56:24 localhost systemd[1]: Stopped User Manager for UID 1003. Nov 28 03:56:24 localhost systemd[1]: user@1003.service: Consumed 5.344s CPU time, read 0B from disk, written 7.0K to disk. Nov 28 03:56:24 localhost systemd[1]: Stopping User Runtime Directory /run/user/1003... Nov 28 03:56:24 localhost systemd[1]: user-runtime-dir@1003.service: Deactivated successfully. Nov 28 03:56:24 localhost systemd[1]: Stopped User Runtime Directory /run/user/1003. Nov 28 03:56:24 localhost systemd[1]: Removed slice User Slice of UID 1003. Nov 28 03:56:24 localhost systemd[1]: user-1003.slice: Consumed 7min 11.886s CPU time. Nov 28 03:56:24 localhost systemd[1]: run-user-1003.mount: Deactivated successfully. Nov 28 03:56:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:56:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:56:31 localhost podman[99057]: 2025-11-28 08:56:31.847546193 +0000 UTC m=+0.086070947 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, release=1761123044, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=ovn_controller, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, architecture=x86_64) Nov 28 03:56:31 localhost podman[99057]: 2025-11-28 08:56:31.869346646 +0000 UTC m=+0.107871370 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, release=1761123044, io.buildah.version=1.41.4, version=17.1.12, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:56:31 localhost podman[99057]: unhealthy Nov 28 03:56:31 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:56:31 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. Nov 28 03:56:31 localhost podman[99056]: 2025-11-28 08:56:31.953987386 +0000 UTC m=+0.193767229 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, version=17.1.12, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 28 03:56:31 localhost podman[99056]: 2025-11-28 08:56:31.997472607 +0000 UTC 
m=+0.237252370 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20251118.1, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 28 03:56:32 localhost podman[99056]: unhealthy Nov 28 03:56:32 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:56:32 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 03:56:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:56:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. 
Nov 28 03:56:47 localhost podman[99097]: 2025-11-28 08:56:47.846362677 +0000 UTC m=+0.082984620 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, distribution-scope=public, 
config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc.) Nov 28 03:56:47 localhost podman[99097]: 2025-11-28 08:56:47.86052928 +0000 UTC m=+0.097151223 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, container_name=iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_id=tripleo_step3, url=https://www.redhat.com, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid) Nov 28 03:56:47 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 03:56:47 localhost podman[99098]: 2025-11-28 08:56:47.947807693 +0000 UTC m=+0.181970659 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, build-date=2025-11-18T22:51:28Z, architecture=x86_64, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true) Nov 28 03:56:47 localhost podman[99098]: 2025-11-28 08:56:47.960646836 +0000 UTC m=+0.194809792 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, url=https://www.redhat.com, release=1761123044) Nov 28 03:56:47 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:56:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:56:49 localhost podman[99134]: 2025-11-28 08:56:49.846777122 +0000 UTC m=+0.080585815 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:56:50 localhost podman[99134]: 2025-11-28 08:56:50.07245332 +0000 UTC m=+0.306261973 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=metrics_qdr, tcib_managed=true, release=1761123044, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container) Nov 28 03:56:50 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:56:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:56:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:56:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. 
Nov 28 03:56:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:56:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:56:54 localhost podman[99166]: 2025-11-28 08:56:54.858752336 +0000 UTC m=+0.092889160 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, config_id=tripleo_step4, release=1761123044, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.buildah.version=1.41.4, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team) Nov 28 03:56:54 localhost systemd[1]: tmp-crun.FSz5V7.mount: Deactivated successfully. Nov 28 03:56:54 localhost podman[99166]: 2025-11-28 08:56:54.922385359 +0000 UTC m=+0.156522223 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, container_name=logrotate_crond, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron) Nov 28 03:56:54 localhost podman[99168]: 2025-11-28 08:56:54.922583525 +0000 UTC m=+0.145688843 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, 
io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 28 03:56:54 localhost podman[99168]: 2025-11-28 08:56:54.953450692 +0000 UTC m=+0.176555970 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 28 03:56:54 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:56:55 localhost podman[99167]: 2025-11-28 08:56:55.013377708 +0000 UTC m=+0.243004950 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, batch=17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team) Nov 28 03:56:55 localhost podman[99165]: 2025-11-28 08:56:54.974790579 +0000 UTC m=+0.208590442 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.12, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public) Nov 28 03:56:55 localhost podman[99167]: 2025-11-28 08:56:55.070691613 +0000 UTC m=+0.300318845 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
distribution-scope=public, release=1761123044, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-type=git) Nov 28 03:56:55 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:56:55 localhost podman[99164]: 2025-11-28 08:56:55.073384388 +0000 UTC m=+0.308202183 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., release=1761123044, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20251118.1) Nov 28 03:56:55 localhost podman[99164]: 2025-11-28 08:56:55.157393729 +0000 UTC m=+0.392211544 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, release=1761123044, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, url=https://www.redhat.com, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:56:55 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 03:56:55 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:56:55 localhost podman[99165]: 2025-11-28 08:56:55.342098923 +0000 UTC m=+0.575898866 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, url=https://www.redhat.com, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 03:56:55 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:56:55 localhost systemd[1]: tmp-crun.P1RemA.mount: Deactivated successfully. Nov 28 03:57:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:57:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:57:02 localhost podman[99287]: 2025-11-28 08:57:02.852125782 +0000 UTC m=+0.084781686 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com) Nov 28 03:57:02 localhost podman[99287]: 2025-11-28 08:57:02.898370811 +0000 UTC m=+0.131026715 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=) Nov 28 03:57:02 localhost podman[99287]: unhealthy Nov 28 03:57:02 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:57:02 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 03:57:02 localhost podman[99288]: 2025-11-28 08:57:02.902546661 +0000 UTC m=+0.132088767 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, vcs-type=git, version=17.1.12, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Nov 28 03:57:02 localhost podman[99288]: 2025-11-28 08:57:02.986974246 +0000 UTC m=+0.216516362 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, batch=17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, architecture=x86_64) Nov 28 03:57:02 localhost podman[99288]: unhealthy Nov 28 03:57:02 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:57:02 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. Nov 28 03:57:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:57:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:57:18 localhost systemd[1]: tmp-crun.aPNwdz.mount: Deactivated successfully. 
Nov 28 03:57:18 localhost podman[99325]: 2025-11-28 08:57:18.843236906 +0000 UTC m=+0.080662957 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, release=1761123044, architecture=x86_64, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid) Nov 28 03:57:18 localhost podman[99325]: 2025-11-28 08:57:18.884552119 +0000 UTC m=+0.121978250 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=) Nov 28 03:57:18 localhost systemd[1]: tmp-crun.Q05erx.mount: Deactivated successfully. Nov 28 03:57:18 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 03:57:18 localhost podman[99326]: 2025-11-28 08:57:18.912098882 +0000 UTC m=+0.147866212 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, batch=17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, container_name=collectd, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible) Nov 28 03:57:18 localhost podman[99326]: 2025-11-28 08:57:18.926444861 +0000 UTC m=+0.162212261 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, container_name=collectd, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-type=git, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:57:18 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:57:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:57:20 localhost systemd[1]: tmp-crun.hHzywW.mount: Deactivated successfully. Nov 28 03:57:20 localhost podman[99363]: 2025-11-28 08:57:20.849734021 +0000 UTC m=+0.087975586 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, batch=17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.4, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Nov 28 03:57:21 localhost podman[99363]: 2025-11-28 08:57:21.04162928 +0000 UTC m=+0.279870865 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, vcs-type=git, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true) Nov 28 03:57:21 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:57:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:57:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:57:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. 
Nov 28 03:57:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:57:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:57:25 localhost systemd[1]: tmp-crun.uXWe86.mount: Deactivated successfully. Nov 28 03:57:25 localhost podman[99394]: 2025-11-28 08:57:25.866313191 +0000 UTC m=+0.097341150 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
container_name=nova_migration_target, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:57:25 localhost systemd[1]: tmp-crun.VPghA6.mount: Deactivated successfully. 
Nov 28 03:57:25 localhost podman[99393]: 2025-11-28 08:57:25.957885848 +0000 UTC m=+0.193934474 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.expose-services=, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-ceilometer-compute) Nov 28 03:57:25 localhost podman[99395]: 2025-11-28 08:57:25.93110319 +0000 UTC m=+0.160802918 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-cron, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, container_name=logrotate_crond) Nov 28 03:57:25 localhost podman[99393]: 2025-11-28 08:57:25.993498093 +0000 UTC m=+0.229546759 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 
17.1_20251118.1, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, vcs-type=git, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, tcib_managed=true, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 28 03:57:26 
localhost podman[99395]: 2025-11-28 08:57:26.010731093 +0000 UTC m=+0.240430781 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, release=1761123044, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., url=https://www.redhat.com) Nov 28 03:57:26 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:57:26 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:57:26 localhost podman[99402]: 2025-11-28 08:57:25.933459174 +0000 UTC m=+0.152748125 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, io.buildah.version=1.41.4, vcs-type=git) Nov 28 03:57:26 localhost podman[99402]: 2025-11-28 08:57:26.063503966 +0000 UTC m=+0.282792917 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, url=https://www.redhat.com, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 28 03:57:26 
localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:57:26 localhost podman[99401]: 2025-11-28 08:57:26.086233957 +0000 UTC m=+0.308040517 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, container_name=nova_compute, architecture=x86_64, batch=17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044) Nov 28 03:57:26 localhost podman[99401]: 2025-11-28 08:57:26.143554253 +0000 UTC m=+0.365360813 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, 
com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_id=tripleo_step5, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:57:26 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:57:26 localhost podman[99394]: 2025-11-28 08:57:26.230468165 +0000 UTC m=+0.461496164 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, vcs-type=git, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:57:26 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:57:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:57:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:57:33 localhost podman[99589]: 2025-11-28 08:57:33.847278773 +0000 UTC m=+0.082663450 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, container_name=ovn_metadata_agent, batch=17.1_20251118.1, io.buildah.version=1.41.4, 
config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:57:33 localhost systemd[1]: tmp-crun.SwpesI.mount: Deactivated successfully. 
Nov 28 03:57:33 localhost podman[99589]: 2025-11-28 08:57:33.891493127 +0000 UTC m=+0.126877784 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git) Nov 28 03:57:33 localhost podman[99589]: unhealthy Nov 28 03:57:33 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:57:33 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. 
Nov 28 03:57:33 localhost podman[99590]: 2025-11-28 08:57:33.906383894 +0000 UTC m=+0.141313527 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, container_name=ovn_controller, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 28 03:57:33 localhost podman[99590]: 2025-11-28 08:57:33.988087932 +0000 UTC m=+0.223017495 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team) Nov 28 03:57:33 localhost podman[99590]: unhealthy Nov 28 03:57:34 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:57:34 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. Nov 28 03:57:37 localhost sshd[99628]: main: sshd: ssh-rsa algorithm is disabled Nov 28 03:57:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:57:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:57:49 localhost systemd[1]: tmp-crun.T6NWc3.mount: Deactivated successfully. 
Nov 28 03:57:49 localhost podman[99630]: 2025-11-28 08:57:49.866827088 +0000 UTC m=+0.100043324 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid) Nov 28 03:57:49 localhost podman[99630]: 2025-11-28 08:57:49.906613404 +0000 UTC m=+0.139829700 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, batch=17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, distribution-scope=public, vcs-type=git, architecture=x86_64) Nov 28 03:57:49 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 03:57:49 localhost podman[99631]: 2025-11-28 08:57:49.951943973 +0000 UTC m=+0.184254610 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Nov 28 03:57:49 localhost podman[99631]: 2025-11-28 08:57:49.989461068 +0000 UTC m=+0.221771706 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, architecture=x86_64, config_id=tripleo_step3, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-collectd) Nov 28 03:57:50 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:57:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:57:51 localhost podman[99670]: 2025-11-28 08:57:51.858844329 +0000 UTC m=+0.087183491 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc.) Nov 28 03:57:52 localhost podman[99670]: 2025-11-28 08:57:52.084566898 +0000 UTC m=+0.312906040 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044) Nov 28 03:57:52 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:57:55 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 28 03:57:55 localhost recover_tripleo_nova_virtqemud[99700]: 61397 Nov 28 03:57:55 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 28 03:57:55 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 28 03:57:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. 
Nov 28 03:57:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:57:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:57:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:57:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:57:56 localhost podman[99702]: 2025-11-28 08:57:56.860220361 +0000 UTC m=+0.091826656 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12) Nov 28 03:57:56 localhost podman[99701]: 2025-11-28 08:57:56.925118724 +0000 UTC m=+0.156732029 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 28 03:57:56 localhost 
podman[99703]: 2025-11-28 08:57:56.986222187 +0000 UTC m=+0.215760048 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, tcib_managed=true, container_name=logrotate_crond, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:57:57 localhost podman[99704]: 2025-11-28 08:57:57.025850219 +0000 UTC m=+0.248294237 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step5, distribution-scope=public, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, container_name=nova_compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044) Nov 28 03:57:57 localhost podman[99708]: 2025-11-28 08:57:57.079559471 +0000 UTC m=+0.300179392 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, tcib_managed=true, release=1761123044, vcs-type=git) Nov 28 03:57:57 localhost podman[99704]: 2025-11-28 08:57:57.089545263 +0000 UTC m=+0.311989331 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, release=1761123044, container_name=nova_compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': 
'/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, name=rhosp17/openstack-nova-compute) Nov 28 03:57:57 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. 
Nov 28 03:57:57 localhost podman[99701]: 2025-11-28 08:57:57.108630761 +0000 UTC m=+0.340244076 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute) Nov 28 03:57:57 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:57:57 localhost podman[99703]: 2025-11-28 08:57:57.145705241 +0000 UTC m=+0.375243092 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1761123044, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, architecture=x86_64, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, version=17.1.12) Nov 28 03:57:57 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:57:57 localhost podman[99708]: 2025-11-28 08:57:57.162555469 +0000 UTC m=+0.383175390 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step4, version=17.1.12, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:57:57 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:57:57 localhost podman[99702]: 2025-11-28 08:57:57.274385631 +0000 UTC m=+0.505991906 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, release=1761123044, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:57:57 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:57:57 localhost systemd[1]: tmp-crun.Lq8Ha0.mount: Deactivated successfully. Nov 28 03:58:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:58:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:58:04 localhost podman[99818]: 2025-11-28 08:58:04.860936319 +0000 UTC m=+0.070044905 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=) Nov 28 03:58:04 localhost systemd[1]: tmp-crun.a3fAnb.mount: Deactivated successfully. 
Nov 28 03:58:04 localhost podman[99819]: 2025-11-28 08:58:04.936254148 +0000 UTC m=+0.140142420 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.12, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 28 03:58:04 localhost podman[99818]: 2025-11-28 08:58:04.953936192 +0000 UTC m=+0.163044818 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:58:04 localhost podman[99818]: unhealthy Nov 28 03:58:04 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:58:04 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. 
Nov 28 03:58:04 localhost podman[99819]: 2025-11-28 08:58:04.982796616 +0000 UTC m=+0.186684958 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
release=1761123044, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, version=17.1.12, batch=17.1_20251118.1) Nov 28 03:58:04 localhost podman[99819]: unhealthy Nov 28 03:58:04 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:58:04 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. Nov 28 03:58:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:58:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:58:20 localhost systemd[1]: tmp-crun.C0kunO.mount: Deactivated successfully. Nov 28 03:58:20 localhost podman[99859]: 2025-11-28 08:58:20.857949681 +0000 UTC m=+0.093779179 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, distribution-scope=public, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, release=1761123044, container_name=iscsid, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, version=17.1.12, vendor=Red Hat, Inc.) 
Nov 28 03:58:20 localhost podman[99860]: 2025-11-28 08:58:20.906232123 +0000 UTC m=+0.139208941 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_id=tripleo_step3, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-18T22:51:28Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Nov 28 03:58:20 localhost podman[99860]: 2025-11-28 08:58:20.917423253 +0000 UTC m=+0.150400121 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, config_id=tripleo_step3, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z) Nov 28 03:58:20 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. 
Nov 28 03:58:20 localhost podman[99859]: 2025-11-28 08:58:20.969117491 +0000 UTC m=+0.204947009 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, config_id=tripleo_step3, container_name=iscsid, vcs-type=git, maintainer=OpenStack TripleO Team) Nov 28 03:58:20 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:58:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:58:22 localhost systemd[1]: tmp-crun.bbyEaT.mount: Deactivated successfully. Nov 28 03:58:22 localhost podman[99899]: 2025-11-28 08:58:22.839596528 +0000 UTC m=+0.081402620 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, version=17.1.12, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-type=git, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Nov 28 03:58:23 localhost podman[99899]: 2025-11-28 08:58:23.043748591 +0000 UTC m=+0.285554643 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, distribution-scope=public, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-qdrouterd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, tcib_managed=true, container_name=metrics_qdr, io.buildah.version=1.41.4, config_id=tripleo_step1, architecture=x86_64, batch=17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:58:23 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:58:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:58:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:58:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:58:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 03:58:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:58:27 localhost systemd[1]: tmp-crun.tbk1DP.mount: Deactivated successfully. 
Nov 28 03:58:27 localhost podman[99931]: 2025-11-28 08:58:27.913321624 +0000 UTC m=+0.136974760 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team) Nov 28 03:58:27 localhost podman[99942]: 2025-11-28 08:58:27.923142262 +0000 UTC m=+0.139660145 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-type=git) Nov 28 03:58:27 localhost podman[99929]: 2025-11-28 
08:58:27.841899578 +0000 UTC m=+0.072658266 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, com.redhat.component=openstack-nova-compute-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:58:27 localhost podman[99930]: 2025-11-28 08:58:27.964114075 +0000 UTC m=+0.187962537 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, version=17.1.12, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron) Nov 28 03:58:27 localhost podman[99928]: 2025-11-28 08:58:27.873110795 +0000 UTC m=+0.104216354 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, batch=17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:58:27 localhost podman[99931]: 2025-11-28 
08:58:27.997863862 +0000 UTC m=+0.221516968 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, tcib_managed=true, version=17.1.12, container_name=nova_compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com) Nov 28 03:58:28 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. 
Nov 28 03:58:28 localhost podman[99930]: 2025-11-28 08:58:28.026836999 +0000 UTC m=+0.250685481 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, vendor=Red Hat, Inc., version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, architecture=x86_64) Nov 28 03:58:28 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:58:28 localhost podman[99942]: 2025-11-28 08:58:28.049785448 +0000 UTC m=+0.266303361 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, distribution-scope=public, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, container_name=ceilometer_agent_ipmi, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi) Nov 28 03:58:28 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 03:58:28 localhost podman[99928]: 2025-11-28 08:58:28.105534824 +0000 UTC m=+0.336640343 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, version=17.1.12, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true) Nov 28 03:58:28 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:58:28 localhost podman[99929]: 2025-11-28 08:58:28.214157826 +0000 UTC m=+0.444916544 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, container_name=nova_migration_target, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:58:28 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. 
Nov 28 03:58:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 28 03:58:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 4200.1 total, 600.0 interval
Cumulative writes: 4899 writes, 22K keys, 4899 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
Cumulative WAL: 4899 writes, 608 syncs, 8.06 writes per sync, written: 0.02 GB, 0.00 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 03:58:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:58:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:58:35 localhost podman[100117]: 2025-11-28 08:58:35.854275304 +0000 UTC m=+0.089398411 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, maintainer=OpenStack TripleO Team, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:58:35 localhost podman[100117]: 2025-11-28 08:58:35.898663074 +0000 UTC m=+0.133786221 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, version=17.1.12, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc.) Nov 28 03:58:35 localhost podman[100117]: unhealthy Nov 28 03:58:35 localhost podman[100118]: 2025-11-28 08:58:35.911489165 +0000 UTC m=+0.143559946 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_id=tripleo_step4, 
version=17.1.12, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Nov 28 03:58:35 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:58:35 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 03:58:35 localhost podman[100118]: 2025-11-28 08:58:35.956547016 +0000 UTC m=+0.188617767 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, version=17.1.12, distribution-scope=public, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team) Nov 28 03:58:35 localhost podman[100118]: unhealthy Nov 28 03:58:35 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:58:35 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. 
Nov 28 03:58:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 28 03:58:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 4200.3 total, 600.0 interval
Cumulative writes: 5616 writes, 25K keys, 5616 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
Cumulative WAL: 5616 writes, 758 syncs, 7.41 writes per sync, written: 0.02 GB, 0.00 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 03:58:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:58:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. 
Nov 28 03:58:51 localhost podman[100155]: 2025-11-28 08:58:51.852176664 +0000 UTC m=+0.080956837 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 
17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, container_name=iscsid, release=1761123044) Nov 28 03:58:51 localhost systemd[1]: tmp-crun.eEJEy5.mount: Deactivated successfully. Nov 28 03:58:51 localhost podman[100156]: 2025-11-28 08:58:51.915260599 +0000 UTC m=+0.141178802 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, url=https://www.redhat.com, container_name=collectd, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 
'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Nov 28 03:58:51 localhost podman[100155]: 2025-11-28 08:58:51.940138988 +0000 UTC m=+0.168919211 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, vendor=Red Hat, Inc., release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 
17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, version=17.1.12, build-date=2025-11-18T23:44:13Z, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, tcib_managed=true, container_name=iscsid, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 28 03:58:51 localhost podman[100156]: 2025-11-28 08:58:51.94946301 +0000 UTC m=+0.175381223 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, 
vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, tcib_managed=true, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 03:58:51 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:58:51 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:58:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 03:58:53 localhost systemd[1]: tmp-crun.CMv7WA.mount: Deactivated successfully. 
Nov 28 03:58:53 localhost podman[100192]: 2025-11-28 08:58:53.85489133 +0000 UTC m=+0.088929685 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 28 03:58:54 localhost podman[100192]: 2025-11-28 08:58:54.078559155 +0000 UTC m=+0.312597460 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, release=1761123044, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, build-date=2025-11-18T22:49:46Z, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:58:54 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:58:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:58:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:58:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:58:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. 
Nov 28 03:58:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:58:58 localhost systemd[1]: tmp-crun.Mg0VU7.mount: Deactivated successfully. Nov 28 03:58:58 localhost podman[100221]: 2025-11-28 08:58:58.868086584 +0000 UTC m=+0.102675457 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, batch=17.1_20251118.1) Nov 28 03:58:58 localhost podman[100231]: 2025-11-28 08:58:58.919604527 +0000 UTC m=+0.140823381 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, 
container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, vcs-type=git, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12) Nov 28 03:58:58 localhost podman[100221]: 2025-11-28 08:58:58.924429148 +0000 UTC m=+0.159018041 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
distribution-scope=public, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, 
name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 28 03:58:58 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 03:58:58 localhost podman[100231]: 2025-11-28 08:58:58.945109476 +0000 UTC m=+0.166328310 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 03:58:58 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 03:58:59 localhost podman[100222]: 2025-11-28 08:58:59.014170538 +0000 UTC m=+0.245363014 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, batch=17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible) Nov 28 03:58:59 localhost podman[100224]: 2025-11-28 08:58:59.070246274 +0000 UTC m=+0.295792223 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, version=17.1.12, config_id=tripleo_step5, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, name=rhosp17/openstack-nova-compute) Nov 28 03:58:59 localhost podman[100223]: 
2025-11-28 08:58:58.878441528 +0000 UTC m=+0.104692530 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, name=rhosp17/openstack-cron, architecture=x86_64, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Nov 28 03:58:59 localhost podman[100223]: 2025-11-28 08:58:59.118627259 +0000 UTC m=+0.344878231 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, container_name=logrotate_crond, release=1761123044, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:58:59 localhost podman[100224]: 2025-11-28 08:58:59.128004594 +0000 UTC m=+0.353550603 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, url=https://www.redhat.com, container_name=nova_compute, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, description=Red Hat OpenStack Platform 
17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, architecture=x86_64, io.buildah.version=1.41.4) Nov 28 03:58:59 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 03:58:59 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 03:58:59 localhost podman[100222]: 2025-11-28 08:58:59.388495211 +0000 UTC m=+0.619687727 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4) Nov 28 03:58:59 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:59:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:59:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:59:06 localhost podman[100339]: 2025-11-28 08:59:06.190112036 +0000 UTC m=+0.082475714 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, distribution-scope=public, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, version=17.1.12, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 28 03:59:06 localhost podman[100339]: 2025-11-28 08:59:06.204896169 +0000 UTC m=+0.097259897 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, 
distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-type=git) Nov 28 03:59:06 localhost podman[100339]: unhealthy Nov 28 03:59:06 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:59:06 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 03:59:06 localhost systemd[1]: tmp-crun.TLwlNh.mount: Deactivated successfully. Nov 28 03:59:06 localhost podman[100340]: 2025-11-28 08:59:06.302246848 +0000 UTC m=+0.192737687 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., architecture=x86_64, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git) Nov 28 03:59:06 localhost podman[100340]: 2025-11-28 08:59:06.319435756 +0000 UTC m=+0.209926595 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=ovn_controller, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, distribution-scope=public) Nov 28 03:59:06 localhost podman[100340]: unhealthy Nov 28 03:59:06 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:59:06 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. Nov 28 03:59:06 localhost sshd[100378]: main: sshd: ssh-rsa algorithm is disabled Nov 28 03:59:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:59:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. 
Nov 28 03:59:22 localhost podman[100381]: 2025-11-28 08:59:22.865448775 +0000 UTC m=+0.099148985 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc.) Nov 28 03:59:22 localhost systemd[1]: tmp-crun.i0xML3.mount: Deactivated successfully. Nov 28 03:59:22 localhost podman[100380]: 2025-11-28 08:59:22.933371002 +0000 UTC m=+0.169104296 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, tcib_managed=true, container_name=iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, version=17.1.12, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 03:59:22 localhost podman[100380]: 2025-11-28 08:59:22.943439278 +0000 UTC m=+0.179172562 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, 
konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4) Nov 28 03:59:22 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:59:22 localhost podman[100381]: 2025-11-28 08:59:22.9565832 +0000 UTC m=+0.190283440 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, url=https://www.redhat.com, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, version=17.1.12, vcs-type=git, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-collectd-container, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd) Nov 28 03:59:22 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:59:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:59:24 localhost podman[100417]: 2025-11-28 08:59:24.841400136 +0000 UTC m=+0.082938118 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:59:25 localhost podman[100417]: 2025-11-28 08:59:25.030360123 +0000 UTC m=+0.271898035 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, vcs-type=git, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 
'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 28 03:59:25 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 03:59:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 03:59:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 03:59:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 03:59:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. 
Nov 28 03:59:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 03:59:29 localhost podman[100456]: 2025-11-28 08:59:29.878070823 +0000 UTC m=+0.104574436 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, url=https://www.redhat.com, vcs-type=git, version=17.1.12, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi) Nov 28 03:59:29 localhost podman[100456]: 2025-11-28 08:59:29.89937468 +0000 UTC m=+0.125878293 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com) Nov 28 03:59:29 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 03:59:29 localhost systemd[1]: tmp-crun.mONzuW.mount: Deactivated successfully. 
Nov 28 03:59:29 localhost podman[100448]: 2025-11-28 08:59:29.966233254 +0000 UTC m=+0.200096667 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:59:30 localhost podman[100449]: 2025-11-28 08:59:30.014563877 +0000 UTC m=+0.246743878 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, build-date=2025-11-18T22:49:32Z, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, io.openshift.expose-services=, com.redhat.component=openstack-cron-container) Nov 28 03:59:30 localhost podman[100449]: 2025-11-28 08:59:30.021922058 +0000 UTC m=+0.254102069 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron) Nov 28 03:59:30 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 03:59:30 localhost podman[100450]: 2025-11-28 08:59:30.062149987 +0000 UTC m=+0.291419397 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, container_name=nova_compute, vcs-type=git, name=rhosp17/openstack-nova-compute, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.) 
Nov 28 03:59:30 localhost podman[100450]: 2025-11-28 08:59:30.094915964 +0000 UTC m=+0.324185384 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:59:30 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. 
Nov 28 03:59:30 localhost podman[100447]: 2025-11-28 08:59:30.116317084 +0000 UTC m=+0.354486042 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, build-date=2025-11-19T00:11:48Z) Nov 28 03:59:30 localhost podman[100447]: 2025-11-28 08:59:30.146481288 +0000 UTC m=+0.384650276 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, distribution-scope=public, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Nov 28 03:59:30 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 03:59:30 localhost podman[100448]: 2025-11-28 08:59:30.328596411 +0000 UTC m=+0.562459844 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, 
summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 03:59:30 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 03:59:30 localhost systemd[1]: tmp-crun.x6r8D3.mount: Deactivated successfully. Nov 28 03:59:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 03:59:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 03:59:36 localhost podman[100649]: 2025-11-28 08:59:36.846972859 +0000 UTC m=+0.085065964 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, version=17.1.12, release=1761123044, vcs-type=git, tcib_managed=true, container_name=ovn_controller, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=) Nov 28 03:59:36 localhost podman[100649]: 2025-11-28 08:59:36.892500025 +0000 UTC m=+0.130593150 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, release=1761123044, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, container_name=ovn_controller) Nov 28 03:59:36 localhost podman[100649]: unhealthy Nov 28 03:59:36 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:59:36 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. Nov 28 03:59:36 localhost podman[100648]: 2025-11-28 08:59:36.894993903 +0000 UTC m=+0.134425011 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.12, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, architecture=x86_64, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn) Nov 28 03:59:36 localhost podman[100648]: 2025-11-28 08:59:36.978415696 +0000 UTC m=+0.217846794 
container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 03:59:36 localhost podman[100648]: unhealthy Nov 28 03:59:36 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 03:59:36 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 03:59:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 03:59:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 03:59:53 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 28 03:59:53 localhost recover_tripleo_nova_virtqemud[100701]: 61397 Nov 28 03:59:53 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 28 03:59:53 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Nov 28 03:59:53 localhost systemd[1]: tmp-crun.ukCqXz.mount: Deactivated successfully. Nov 28 03:59:53 localhost podman[100688]: 2025-11-28 08:59:53.863588218 +0000 UTC m=+0.095676917 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step3, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, container_name=iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Nov 28 03:59:53 localhost systemd[1]: tmp-crun.kKjlcp.mount: Deactivated successfully. Nov 28 03:59:53 localhost podman[100689]: 2025-11-28 08:59:53.919083536 +0000 UTC m=+0.148849032 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, container_name=collectd, distribution-scope=public, release=1761123044, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, version=17.1.12) Nov 28 03:59:53 localhost podman[100688]: 2025-11-28 08:59:53.930164833 +0000 UTC m=+0.162253562 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, url=https://www.redhat.com, 
name=rhosp17/openstack-iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, architecture=x86_64, distribution-scope=public, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Nov 28 03:59:53 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 03:59:53 localhost podman[100689]: 2025-11-28 08:59:53.98529452 +0000 UTC m=+0.215060006 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, distribution-scope=public, container_name=collectd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 03:59:53 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 03:59:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 03:59:55 localhost podman[100729]: 2025-11-28 08:59:55.848337423 +0000 UTC m=+0.085743767 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, name=rhosp17/openstack-qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-qdrouterd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., tcib_managed=true, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step1, container_name=metrics_qdr, vcs-type=git) Nov 28 03:59:56 localhost podman[100729]: 2025-11-28 08:59:56.084099246 +0000 UTC m=+0.321505610 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git) Nov 28 03:59:56 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 04:00:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 04:00:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 04:00:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 04:00:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. 
Nov 28 04:00:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 04:00:00 localhost systemd[1]: tmp-crun.3c693F.mount: Deactivated successfully. Nov 28 04:00:00 localhost podman[100767]: 2025-11-28 09:00:00.927293483 +0000 UTC m=+0.142954927 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 
17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, vcs-type=git, release=1761123044, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 04:00:00 localhost podman[100759]: 2025-11-28 09:00:00.96710707 +0000 UTC m=+0.193256203 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vcs-type=git, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container) Nov 28 04:00:00 localhost podman[100767]: 2025-11-28 09:00:00.986469226 +0000 UTC m=+0.202130620 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, vcs-type=git, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 04:00:00 localhost podman[100758]: 2025-11-28 09:00:00.892644569 +0000 UTC m=+0.116861641 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, io.openshift.expose-services=, release=1761123044, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Nov 28 04:00:01 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 04:00:01 localhost podman[100758]: 2025-11-28 09:00:01.026533161 +0000 UTC m=+0.250750273 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, architecture=x86_64, tcib_managed=true, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, 
name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, version=17.1.12, vendor=Red Hat, Inc., io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 04:00:01 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 04:00:01 localhost podman[100761]: 2025-11-28 09:00:01.05332176 +0000 UTC m=+0.264092161 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., container_name=nova_compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step5, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public) Nov 28 04:00:01 localhost podman[100761]: 2025-11-28 09:00:01.082526044 +0000 UTC m=+0.293296385 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, container_name=nova_compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 04:00:01 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. 
Nov 28 04:00:01 localhost podman[100760]: 2025-11-28 09:00:00.987751797 +0000 UTC m=+0.210362430 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, io.openshift.expose-services=, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, name=rhosp17/openstack-cron, distribution-scope=public, container_name=logrotate_crond, release=1761123044, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, vcs-type=git, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team) Nov 28 04:00:01 localhost podman[100760]: 2025-11-28 09:00:01.171372247 +0000 UTC m=+0.393982830 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., tcib_managed=true) Nov 28 04:00:01 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 04:00:01 localhost podman[100759]: 2025-11-28 09:00:01.387416592 +0000 UTC m=+0.613565715 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute) Nov 28 04:00:01 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 04:00:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 04:00:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 04:00:07 localhost podman[100883]: 2025-11-28 09:00:07.837472902 +0000 UTC m=+0.078752727 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, container_name=ovn_metadata_agent, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, release=1761123044, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team) Nov 28 04:00:07 localhost podman[100884]: 2025-11-28 09:00:07.878495426 +0000 UTC m=+0.117199121 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': 
'/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4) Nov 28 04:00:07 localhost podman[100883]: 2025-11-28 09:00:07.882042618 +0000 UTC m=+0.123322443 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, config_id=tripleo_step4, architecture=x86_64, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4) Nov 28 04:00:07 localhost podman[100883]: unhealthy Nov 28 04:00:07 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:00:07 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 04:00:07 localhost podman[100884]: 2025-11-28 09:00:07.89615749 +0000 UTC m=+0.134861235 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1761123044, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vendor=Red Hat, Inc.) Nov 28 04:00:07 localhost podman[100884]: unhealthy Nov 28 04:00:07 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:00:07 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. Nov 28 04:00:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 04:00:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. 
Nov 28 04:00:24 localhost podman[100926]: 2025-11-28 09:00:24.829351964 +0000 UTC m=+0.073560344 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team) Nov 28 04:00:24 localhost podman[100926]: 2025-11-28 09:00:24.838515361 +0000 UTC m=+0.082723701 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, release=1761123044, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git) Nov 28 04:00:24 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 04:00:24 localhost podman[100927]: 2025-11-28 09:00:24.906719258 +0000 UTC m=+0.144494657 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, name=rhosp17/openstack-collectd, url=https://www.redhat.com, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, tcib_managed=true, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044) Nov 28 04:00:24 localhost podman[100927]: 2025-11-28 09:00:24.918659421 +0000 UTC m=+0.156434770 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, vcs-type=git, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, 
com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Nov 28 04:00:24 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 04:00:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 04:00:26 localhost podman[100964]: 2025-11-28 09:00:26.839665268 +0000 UTC m=+0.080846283 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 04:00:27 localhost podman[100964]: 2025-11-28 09:00:27.032454906 +0000 UTC m=+0.273635901 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.4, version=17.1.12) Nov 28 04:00:27 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 04:00:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 04:00:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 04:00:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 04:00:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. 
Nov 28 04:00:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 04:00:31 localhost systemd[1]: tmp-crun.4tmTOo.mount: Deactivated successfully. Nov 28 04:00:31 localhost systemd[1]: tmp-crun.JxaPRF.mount: Deactivated successfully. Nov 28 04:00:31 localhost podman[100992]: 2025-11-28 09:00:31.911894168 +0000 UTC m=+0.149325797 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, batch=17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64) Nov 28 04:00:31 localhost podman[100995]: 2025-11-28 09:00:31.876091207 +0000 UTC m=+0.104987689 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.12, container_name=nova_compute, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com) Nov 28 04:00:32 localhost podman[100994]: 2025-11-28 09:00:31.956446993 +0000 UTC m=+0.187070519 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, 
com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 04:00:32 localhost podman[101001]: 2025-11-28 09:00:32.011784956 +0000 UTC m=+0.236498007 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, version=17.1.12, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi) Nov 28 04:00:32 localhost podman[100995]: 2025-11-28 09:00:32.012393505 +0000 UTC m=+0.241289957 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, 
name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step5, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 04:00:32 localhost podman[100993]: 2025-11-28 09:00:31.979863127 +0000 UTC m=+0.214736316 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, summary=Red Hat 
OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 04:00:32 localhost podman[100994]: 2025-11-28 09:00:32.039316708 +0000 UTC m=+0.269940214 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, name=rhosp17/openstack-cron, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1) Nov 28 04:00:32 localhost podman[100992]: 2025-11-28 09:00:32.046937907 +0000 UTC m=+0.284369486 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, architecture=x86_64, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute) Nov 28 04:00:32 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 04:00:32 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 04:00:32 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. 
Nov 28 04:00:32 localhost podman[101001]: 2025-11-28 09:00:32.114603866 +0000 UTC m=+0.339316907 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, version=17.1.12, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., 
architecture=x86_64, managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team) Nov 28 04:00:32 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 04:00:32 localhost podman[100993]: 2025-11-28 09:00:32.341418669 +0000 UTC m=+0.576291848 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team) Nov 28 04:00:32 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 04:00:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 04:00:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 04:00:38 localhost systemd[1]: tmp-crun.RnYGOO.mount: Deactivated successfully. 
Nov 28 04:00:38 localhost podman[101233]: 2025-11-28 09:00:38.865518338 +0000 UTC m=+0.095083909 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, architecture=x86_64, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com) Nov 28 04:00:38 localhost podman[101234]: 2025-11-28 09:00:38.912767417 +0000 UTC m=+0.142006758 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.4, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, 
build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=) Nov 28 04:00:38 localhost podman[101233]: 2025-11-28 09:00:38.919166158 +0000 UTC m=+0.148731739 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, release=1761123044, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 28 04:00:38 localhost podman[101233]: unhealthy Nov 28 04:00:38 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:00:38 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 04:00:38 localhost podman[101234]: 2025-11-28 09:00:38.996472878 +0000 UTC m=+0.225712159 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, distribution-scope=public, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 28 04:00:39 localhost podman[101234]: unhealthy Nov 28 04:00:39 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:00:39 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. Nov 28 04:00:47 localhost sshd[101274]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:00:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 04:00:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. 
Nov 28 04:00:55 localhost podman[101276]: 2025-11-28 09:00:55.853872632 +0000 UTC m=+0.090955540 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid) Nov 28 04:00:55 localhost podman[101276]: 2025-11-28 09:00:55.893490192 +0000 UTC m=+0.130573070 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true) Nov 28 04:00:55 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 04:00:55 localhost podman[101277]: 2025-11-28 09:00:55.910475334 +0000 UTC m=+0.145736385 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, batch=17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git) Nov 28 04:00:55 localhost podman[101277]: 2025-11-28 09:00:55.925333179 +0000 UTC m=+0.160594210 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, version=17.1.12, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4) Nov 28 04:00:55 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 04:00:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 04:00:57 localhost podman[101315]: 2025-11-28 09:00:57.847148923 +0000 UTC m=+0.085908811 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, version=17.1.12, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, url=https://www.redhat.com, config_id=tripleo_step1, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, vendor=Red Hat, Inc., 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 28 04:00:58 localhost podman[101315]: 2025-11-28 09:00:58.038466505 +0000 UTC m=+0.277226373 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, managed_by=tripleo_ansible, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 
'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container) Nov 28 04:00:58 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 04:01:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 04:01:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 04:01:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 04:01:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. 
Nov 28 04:01:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 04:01:02 localhost podman[101370]: 2025-11-28 09:01:02.884808251 +0000 UTC m=+0.116637584 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, batch=17.1_20251118.1, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container) Nov 28 04:01:02 localhost podman[101383]: 2025-11-28 09:01:02.920512718 +0000 UTC m=+0.141618325 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.12, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, distribution-scope=public, build-date=2025-11-19T00:12:45Z, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Nov 28 04:01:02 localhost podman[101373]: 2025-11-28 09:01:02.966332724 +0000 UTC m=+0.189361281 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64) Nov 28 04:01:02 localhost podman[101370]: 2025-11-28 09:01:02.986896628 +0000 UTC m=+0.218726051 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, architecture=x86_64, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, batch=17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044) Nov 28 04:01:02 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 04:01:03 localhost podman[101383]: 2025-11-28 09:01:03.007154342 +0000 UTC m=+0.228259979 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, batch=17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi) Nov 28 04:01:03 localhost podman[101371]: 2025-11-28 09:01:02.852108997 +0000 UTC m=+0.083497496 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_id=tripleo_step4, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, io.buildah.version=1.41.4, url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, tcib_managed=true, io.openshift.expose-services=) Nov 28 04:01:03 localhost podman[101373]: 2025-11-28 09:01:03.019413976 +0000 UTC m=+0.242442503 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, maintainer=OpenStack TripleO Team, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, container_name=nova_compute) Nov 28 04:01:03 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 04:01:03 localhost podman[101372]: 2025-11-28 09:01:02.988090386 +0000 UTC m=+0.214123057 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, distribution-scope=public, release=1761123044) Nov 28 04:01:03 localhost podman[101372]: 2025-11-28 09:01:03.068275186 +0000 UTC m=+0.294307937 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, version=17.1.12, distribution-scope=public, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, container_name=logrotate_crond, tcib_managed=true, build-date=2025-11-18T22:49:32Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=) Nov 28 04:01:03 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. 
Nov 28 04:01:03 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 04:01:03 localhost podman[101371]: 2025-11-28 09:01:03.214417082 +0000 UTC m=+0.445805581 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, version=17.1.12, vcs-type=git, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 04:01:03 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 04:01:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 04:01:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 04:01:09 localhost systemd[1]: tmp-crun.FH8VwZ.mount: Deactivated successfully. 
Nov 28 04:01:09 localhost podman[101491]: 2025-11-28 09:01:09.887423032 +0000 UTC m=+0.116972203 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, url=https://www.redhat.com, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, 
batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, managed_by=tripleo_ansible) Nov 28 04:01:09 localhost podman[101490]: 2025-11-28 09:01:09.903777525 +0000 UTC m=+0.136042691 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Nov 28 04:01:09 localhost podman[101491]: 2025-11-28 09:01:09.931446891 +0000 UTC m=+0.160996062 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_data={'depends_on': 
['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Nov 28 04:01:09 localhost podman[101491]: unhealthy Nov 28 04:01:09 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:01:09 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. 
Nov 28 04:01:09 localhost podman[101490]: 2025-11-28 09:01:09.944512041 +0000 UTC m=+0.176777227 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn) Nov 28 04:01:09 localhost podman[101490]: unhealthy Nov 28 04:01:09 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:01:09 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 04:01:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 04:01:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. 
Nov 28 04:01:26 localhost podman[101531]: 2025-11-28 09:01:26.838083896 +0000 UTC m=+0.073537783 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhosp17/openstack-collectd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z) Nov 28 04:01:26 localhost podman[101531]: 2025-11-28 09:01:26.849446922 +0000 UTC m=+0.084900799 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, version=17.1.12, container_name=collectd, config_id=tripleo_step3, io.buildah.version=1.41.4, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1761123044, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, maintainer=OpenStack TripleO Team) Nov 28 04:01:26 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. 
Nov 28 04:01:26 localhost podman[101530]: 2025-11-28 09:01:26.902185874 +0000 UTC m=+0.140423969 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, release=1761123044, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, 
name=rhosp17/openstack-iscsid, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, distribution-scope=public, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z) Nov 28 04:01:26 localhost podman[101530]: 2025-11-28 09:01:26.935384734 +0000 UTC m=+0.173622849 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 04:01:26 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 04:01:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 04:01:28 localhost podman[101569]: 2025-11-28 09:01:28.840289168 +0000 UTC m=+0.079423549 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, distribution-scope=public) Nov 28 04:01:29 localhost podman[101569]: 2025-11-28 09:01:29.000491434 +0000 UTC m=+0.239625875 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, 
io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 28 04:01:29 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 04:01:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 04:01:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 04:01:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 04:01:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. 
Nov 28 04:01:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 04:01:34 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 28 04:01:34 localhost recover_tripleo_nova_virtqemud[101628]: 61397 Nov 28 04:01:34 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 28 04:01:34 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 28 04:01:34 localhost systemd[1]: tmp-crun.ZPme0I.mount: Deactivated successfully. Nov 28 04:01:34 localhost podman[101598]: 2025-11-28 09:01:34.212357557 +0000 UTC m=+0.106929329 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 04:01:34 localhost systemd[1]: tmp-crun.PKuXzr.mount: Deactivated successfully. 
Nov 28 04:01:34 localhost podman[101599]: 2025-11-28 09:01:34.255164778 +0000 UTC m=+0.145305881 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, container_name=nova_migration_target, config_id=tripleo_step4, vcs-type=git, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, version=17.1.12, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 04:01:34 localhost podman[101598]: 2025-11-28 09:01:34.262962383 +0000 UTC m=+0.157534135 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, tcib_managed=true, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, release=1761123044, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com) Nov 28 04:01:34 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. 
Nov 28 04:01:34 localhost podman[101600]: 2025-11-28 09:01:34.309994825 +0000 UTC m=+0.199407015 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1) Nov 28 04:01:34 localhost podman[101600]: 2025-11-28 09:01:34.322345812 +0000 UTC m=+0.211757992 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, batch=17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, config_id=tripleo_step4, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Nov 28 04:01:34 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 04:01:34 localhost podman[101604]: 2025-11-28 09:01:34.364403389 +0000 UTC m=+0.243178026 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, distribution-scope=public, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com) Nov 28 04:01:34 localhost podman[101604]: 2025-11-28 09:01:34.392414017 +0000 UTC m=+0.271188714 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, version=17.1.12, tcib_managed=true, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, io.openshift.expose-services=, 
com.redhat.component=openstack-nova-compute-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 04:01:34 localhost podman[101607]: 2025-11-28 09:01:34.429173058 +0000 UTC m=+0.308932906 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Nov 28 04:01:34 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. 
Nov 28 04:01:34 localhost podman[101607]: 2025-11-28 09:01:34.465397602 +0000 UTC m=+0.345157440 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 04:01:34 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 04:01:34 localhost podman[101599]: 2025-11-28 09:01:34.616336389 +0000 UTC m=+0.506477502 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, 
release=1761123044, tcib_managed=true, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible) Nov 28 04:01:34 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 04:01:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 04:01:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 04:01:40 localhost podman[101793]: 2025-11-28 09:01:40.850691602 +0000 UTC m=+0.089395990 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, build-date=2025-11-19T00:14:25Z, version=17.1.12, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4) Nov 28 04:01:40 localhost podman[101793]: 2025-11-28 09:01:40.890722586 +0000 UTC m=+0.129426924 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, release=1761123044, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 28 04:01:40 localhost podman[101793]: unhealthy Nov 28 04:01:40 localhost systemd[1]: tmp-crun.Hq0JBF.mount: Deactivated successfully. Nov 28 04:01:40 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:01:40 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 04:01:40 localhost podman[101794]: 2025-11-28 09:01:40.913346004 +0000 UTC m=+0.151059151 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, release=1761123044, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, tcib_managed=true) Nov 28 04:01:40 localhost podman[101794]: 2025-11-28 09:01:40.927322602 +0000 UTC m=+0.165035749 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, vendor=Red Hat, Inc., config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible) Nov 28 04:01:40 localhost podman[101794]: unhealthy Nov 28 04:01:40 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:01:40 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. Nov 28 04:01:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 04:01:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. 
Nov 28 04:01:57 localhost podman[101836]: 2025-11-28 09:01:57.84489035 +0000 UTC m=+0.079429398 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
config_id=tripleo_step3, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc.) Nov 28 04:01:57 localhost podman[101836]: 2025-11-28 09:01:57.857567948 +0000 UTC m=+0.092107006 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, tcib_managed=true, distribution-scope=public, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team) Nov 28 04:01:57 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. 
Nov 28 04:01:57 localhost podman[101835]: 2025-11-28 09:01:57.909489004 +0000 UTC m=+0.144596190 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, container_name=iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4) Nov 28 04:01:57 localhost podman[101835]: 2025-11-28 09:01:57.921476889 +0000 UTC m=+0.156584085 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, container_name=iscsid, batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4) Nov 28 04:01:57 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 04:01:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 04:01:59 localhost podman[101873]: 2025-11-28 09:01:59.846279006 +0000 UTC m=+0.080610126 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, version=17.1.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public) Nov 28 04:02:00 localhost podman[101873]: 2025-11-28 09:02:00.036282256 +0000 UTC m=+0.270613356 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, architecture=x86_64, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, version=17.1.12, io.openshift.expose-services=) Nov 28 04:02:00 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 04:02:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 04:02:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 04:02:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. 
Nov 28 04:02:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 04:02:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 04:02:04 localhost podman[101905]: 2025-11-28 09:02:04.873928681 +0000 UTC m=+0.106134975 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_id=tripleo_step4, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 04:02:04 localhost podman[101903]: 2025-11-28 09:02:04.853230642 +0000 UTC m=+0.091698913 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, architecture=x86_64, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://www.redhat.com, container_name=ceilometer_agent_compute, io.openshift.expose-services=, distribution-scope=public, version=17.1.12, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1) Nov 28 04:02:04 localhost podman[101911]: 2025-11-28 09:02:04.919199658 +0000 UTC m=+0.146008743 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 04:02:04 localhost systemd[1]: tmp-crun.rQLpyQ.mount: Deactivated successfully. 
Nov 28 04:02:04 localhost podman[101905]: 2025-11-28 09:02:04.939808003 +0000 UTC m=+0.172014307 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, distribution-scope=public, container_name=logrotate_crond, io.openshift.expose-services=, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container) Nov 28 04:02:04 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 04:02:04 localhost podman[101911]: 2025-11-28 09:02:04.979449545 +0000 UTC m=+0.206258650 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, container_name=nova_compute, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step5) Nov 28 04:02:04 
localhost podman[101903]: 2025-11-28 09:02:04.987801336 +0000 UTC m=+0.226269607 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 28 04:02:04 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 04:02:04 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Deactivated successfully. Nov 28 04:02:05 localhost podman[101913]: 2025-11-28 09:02:05.073308094 +0000 UTC m=+0.296726823 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi) Nov 28 04:02:05 localhost podman[101904]: 2025-11-28 09:02:05.119980575 +0000 UTC m=+0.352368665 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, version=17.1.12, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 04:02:05 localhost podman[101913]: 2025-11-28 09:02:05.128452391 +0000 UTC m=+0.351871080 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, release=1761123044, tcib_managed=true, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Nov 28 04:02:05 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Deactivated successfully. Nov 28 04:02:05 localhost podman[101904]: 2025-11-28 09:02:05.50138742 +0000 UTC m=+0.733775500 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, container_name=nova_migration_target, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, version=17.1.12, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com) Nov 28 04:02:05 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 04:02:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. 
Nov 28 04:02:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 04:02:11 localhost systemd[1]: tmp-crun.X7kq12.mount: Deactivated successfully. Nov 28 04:02:11 localhost podman[102021]: 2025-11-28 09:02:11.856847676 +0000 UTC m=+0.093954802 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, release=1761123044, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Nov 28 04:02:11 localhost podman[102022]: 2025-11-28 09:02:11.900876515 +0000 UTC m=+0.134705119 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': 
True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., version=17.1.12) Nov 28 04:02:11 localhost podman[102021]: 2025-11-28 09:02:11.904408226 +0000 UTC m=+0.141515332 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.buildah.version=1.41.4) Nov 28 04:02:11 localhost podman[102021]: unhealthy Nov 28 04:02:11 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:02:11 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 04:02:11 localhost podman[102022]: 2025-11-28 09:02:11.920428678 +0000 UTC m=+0.154257302 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 04:02:11 localhost podman[102022]: unhealthy Nov 28 04:02:11 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:02:11 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. Nov 28 04:02:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 04:02:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 04:02:28 localhost systemd[1]: tmp-crun.ympjBx.mount: Deactivated successfully. 
Nov 28 04:02:28 localhost podman[102062]: 2025-11-28 09:02:28.848921767 +0000 UTC m=+0.088801692 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 04:02:28 localhost podman[102062]: 2025-11-28 09:02:28.858396834 +0000 UTC m=+0.098276699 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, release=1761123044, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, build-date=2025-11-18T23:44:13Z, container_name=iscsid, version=17.1.12, managed_by=tripleo_ansible) Nov 28 04:02:28 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 04:02:28 localhost podman[102063]: 2025-11-28 09:02:28.925938759 +0000 UTC m=+0.167954731 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, container_name=collectd, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.expose-services=, vcs-type=git, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 04:02:28 localhost podman[102063]: 2025-11-28 09:02:28.933666151 +0000 UTC m=+0.175682193 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, tcib_managed=true, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=collectd, vcs-type=git, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z) Nov 28 04:02:28 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 04:02:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 04:02:30 localhost podman[102101]: 2025-11-28 09:02:30.84165449 +0000 UTC m=+0.078895668 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, config_id=tripleo_step1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, container_name=metrics_qdr, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, managed_by=tripleo_ansible) Nov 28 04:02:31 localhost podman[102101]: 2025-11-28 09:02:31.049840685 +0000 UTC m=+0.287081793 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-type=git) Nov 28 04:02:31 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 04:02:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 04:02:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 04:02:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 04:02:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. 
Nov 28 04:02:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 04:02:35 localhost podman[102132]: 2025-11-28 09:02:35.852841714 +0000 UTC m=+0.085571975 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Nov 28 04:02:35 localhost podman[102132]: 2025-11-28 09:02:35.890510922 +0000 UTC m=+0.123241183 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, batch=17.1_20251118.1, config_id=tripleo_step4, tcib_managed=true, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=) Nov 28 04:02:35 localhost systemd[1]: tmp-crun.96rFJ1.mount: Deactivated successfully. Nov 28 04:02:35 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 04:02:35 localhost podman[102131]: 2025-11-28 09:02:35.909664896 +0000 UTC m=+0.145741140 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, container_name=nova_migration_target, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 04:02:35 localhost podman[102134]: 2025-11-28 09:02:35.972966759 +0000 UTC m=+0.201962073 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 04:02:36 localhost 
podman[102142]: 2025-11-28 09:02:36.022634929 +0000 UTC m=+0.248365392 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Nov 28 04:02:36 localhost podman[102134]: 2025-11-28 09:02:36.031232286 +0000 UTC m=+0.260227570 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, container_name=nova_compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container) Nov 28 04:02:36 localhost systemd[1]: 
c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Deactivated successfully. Nov 28 04:02:36 localhost podman[102142]: 2025-11-28 09:02:36.062924438 +0000 UTC m=+0.288654871 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc.) Nov 28 04:02:36 localhost podman[102142]: unhealthy Nov 28 04:02:36 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:02:36 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Failed with result 'exit-code'. 
Nov 28 04:02:36 localhost podman[102130]: 2025-11-28 09:02:36.117694596 +0000 UTC m=+0.357351621 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 04:02:36 localhost podman[102130]: 2025-11-28 09:02:36.132901518 +0000 UTC m=+0.372558583 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 04:02:36 localhost podman[102130]: unhealthy Nov 28 04:02:36 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:02:36 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Failed with result 'exit-code'. 
Nov 28 04:02:36 localhost podman[102131]: 2025-11-28 09:02:36.288255345 +0000 UTC m=+0.524331549 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, name=rhosp17/openstack-nova-compute, vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, distribution-scope=public, com.redhat.component=openstack-nova-compute-container) Nov 28 04:02:36 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 04:02:39 localhost podman[102333]: 2025-11-28 09:02:39.934775964 +0000 UTC m=+0.093559642 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-type=git, release=553, io.buildah.version=1.33.12, GIT_BRANCH=main, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, ceph=True, CEPH_POINT_RELEASE=, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, 
architecture=x86_64, RELEASE=main, com.redhat.component=rhceph-container) Nov 28 04:02:40 localhost podman[102333]: 2025-11-28 09:02:40.028263782 +0000 UTC m=+0.187047450 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, GIT_BRANCH=main, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, distribution-scope=public, version=7, release=553, vcs-type=git, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, name=rhceph, ceph=True, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc.) Nov 28 04:02:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 04:02:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 04:02:42 localhost podman[102477]: 2025-11-28 09:02:42.853788055 +0000 UTC m=+0.090275780 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, vcs-type=git, name=rhosp17/openstack-ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, 
vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true) Nov 28 04:02:42 localhost podman[102476]: 2025-11-28 09:02:42.902118563 +0000 UTC m=+0.138713182 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, release=1761123044, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, version=17.1.12, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, url=https://www.redhat.com) Nov 28 04:02:42 localhost podman[102477]: 2025-11-28 09:02:42.919676778 +0000 UTC m=+0.156164493 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 
'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=ovn_controller, io.buildah.version=1.41.4) Nov 28 04:02:42 localhost podman[102477]: unhealthy Nov 28 04:02:42 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:02:42 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. 
Nov 28 04:02:42 localhost podman[102476]: 2025-11-28 09:02:42.944575399 +0000 UTC m=+0.181169998 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, release=1761123044, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, batch=17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent) Nov 28 04:02:42 localhost podman[102476]: unhealthy Nov 28 04:02:42 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:02:42 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 04:02:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 04:02:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 04:02:59 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 28 04:02:59 localhost recover_tripleo_nova_virtqemud[102526]: 61397 Nov 28 04:02:59 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. 
Nov 28 04:02:59 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 28 04:02:59 localhost podman[102516]: 2025-11-28 09:02:59.864511111 +0000 UTC m=+0.100748055 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, maintainer=OpenStack 
TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, container_name=iscsid, version=17.1.12) Nov 28 04:02:59 localhost podman[102516]: 2025-11-28 09:02:59.8783707 +0000 UTC m=+0.114607684 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, container_name=iscsid, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4) Nov 28 04:02:59 localhost systemd[1]: tmp-crun.mkdMCz.mount: Deactivated successfully. 
Nov 28 04:02:59 localhost podman[102517]: 2025-11-28 09:02:59.922153768 +0000 UTC m=+0.154986606 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
io.openshift.expose-services=, vcs-type=git, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3) Nov 28 04:02:59 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 04:02:59 localhost podman[102517]: 2025-11-28 09:02:59.987390401 +0000 UTC m=+0.220223249 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, config_id=tripleo_step3, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.) Nov 28 04:02:59 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. 
Nov 28 04:03:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 04:03:01 localhost systemd[1]: tmp-crun.zGFk5F.mount: Deactivated successfully. Nov 28 04:03:01 localhost podman[102557]: 2025-11-28 09:03:01.856899419 +0000 UTC m=+0.095786741 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=) Nov 28 04:03:02 localhost podman[102557]: 2025-11-28 09:03:02.087737367 +0000 UTC m=+0.326624649 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, url=https://www.redhat.com, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12) Nov 28 04:03:02 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 04:03:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 04:03:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. 
Nov 28 04:03:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 04:03:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 04:03:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 04:03:06 localhost podman[102587]: 2025-11-28 09:03:06.869173807 +0000 UTC m=+0.105364858 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute) Nov 28 04:03:06 localhost systemd[1]: tmp-crun.YHXcrU.mount: Deactivated successfully. 
Nov 28 04:03:06 localhost podman[102588]: 2025-11-28 09:03:06.967791845 +0000 UTC m=+0.199583650 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, version=17.1.12, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Nov 28 04:03:06 localhost podman[102588]: 2025-11-28 09:03:06.973558364 +0000 UTC m=+0.205350149 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, config_id=tripleo_step4, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container) Nov 28 04:03:06 localhost podman[102586]: 2025-11-28 09:03:06.930906821 +0000 UTC m=+0.161656964 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true) Nov 28 04:03:06 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: 
Deactivated successfully. Nov 28 04:03:07 localhost podman[102586]: 2025-11-28 09:03:07.010832939 +0000 UTC m=+0.241583042 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, version=17.1.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git) Nov 28 04:03:07 localhost podman[102586]: unhealthy Nov 28 04:03:07 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:03:07 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Failed with result 'exit-code'. 
Nov 28 04:03:07 localhost podman[102589]: 2025-11-28 09:03:07.079489528 +0000 UTC m=+0.307808325 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 04:03:07 localhost podman[102589]: 2025-11-28 09:03:07.101485221 +0000 UTC m=+0.329803998 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, config_id=tripleo_step5, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 04:03:07 localhost podman[102589]: unhealthy Nov 28 04:03:07 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:03:07 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed with result 'exit-code'. Nov 28 04:03:07 localhost podman[102598]: 2025-11-28 09:03:07.168583981 +0000 UTC m=+0.392171821 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step4, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team) Nov 28 04:03:07 localhost podman[102598]: 2025-11-28 09:03:07.211602945 +0000 UTC m=+0.435190815 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=) Nov 28 04:03:07 localhost podman[102598]: unhealthy Nov 28 04:03:07 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:03:07 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Failed with result 'exit-code'. Nov 28 04:03:07 localhost podman[102587]: 2025-11-28 09:03:07.223636638 +0000 UTC m=+0.459827709 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 04:03:07 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 04:03:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 04:03:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 04:03:13 localhost podman[102686]: 2025-11-28 09:03:13.85312249 +0000 UTC m=+0.088916398 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, distribution-scope=public, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=) Nov 28 04:03:13 localhost podman[102686]: 2025-11-28 09:03:13.896275288 +0000 UTC m=+0.132069166 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible) Nov 28 04:03:13 localhost podman[102686]: unhealthy Nov 28 04:03:13 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:03:13 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 04:03:13 localhost podman[102687]: 2025-11-28 09:03:13.903570054 +0000 UTC m=+0.136275977 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=ovn_controller, 
name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container) Nov 28 04:03:13 localhost podman[102687]: 2025-11-28 09:03:13.984650839 +0000 UTC m=+0.217356702 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git) Nov 28 04:03:13 localhost podman[102687]: unhealthy Nov 28 04:03:13 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:03:13 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. 
Nov 28 04:03:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26986 DF PROTO=TCP SPT=39260 DPT=9105 SEQ=1654414081 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0285EA0000000001030307) Nov 28 04:03:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26987 DF PROTO=TCP SPT=39260 DPT=9105 SEQ=1654414081 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB028A020000000001030307) Nov 28 04:03:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26988 DF PROTO=TCP SPT=39260 DPT=9105 SEQ=1654414081 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0292020000000001030307) Nov 28 04:03:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26989 DF PROTO=TCP SPT=39260 DPT=9105 SEQ=1654414081 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02A1C20000000001030307) Nov 28 04:03:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60509 DF PROTO=TCP SPT=49524 DPT=9101 SEQ=3725412370 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02ACD90000000001030307) Nov 28 04:03:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60510 DF PROTO=TCP SPT=49524 DPT=9101 SEQ=3725412370 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080ACB02B0C20000000001030307) Nov 28 04:03:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 04:03:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 04:03:30 localhost systemd[1]: tmp-crun.j44F37.mount: Deactivated successfully. Nov 28 04:03:30 localhost podman[102725]: 2025-11-28 09:03:30.848754729 +0000 UTC m=+0.086445662 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1, distribution-scope=public, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, vcs-type=git) Nov 28 04:03:30 localhost podman[102725]: 2025-11-28 09:03:30.85783106 +0000 UTC m=+0.095521943 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step3, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid) Nov 28 04:03:30 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 04:03:30 localhost podman[102726]: 2025-11-28 09:03:30.9497336 +0000 UTC m=+0.183342746 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_id=tripleo_step3, batch=17.1_20251118.1) Nov 28 04:03:30 localhost podman[102726]: 2025-11-28 09:03:30.988477531 +0000 UTC m=+0.222086627 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1) Nov 28 04:03:31 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. 
Nov 28 04:03:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60511 DF PROTO=TCP SPT=49524 DPT=9101 SEQ=3725412370 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02B8C20000000001030307) Nov 28 04:03:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32505 DF PROTO=TCP SPT=47858 DPT=9102 SEQ=1229046501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02BDC40000000001030307) Nov 28 04:03:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 04:03:32 localhost podman[102765]: 2025-11-28 09:03:32.839793145 +0000 UTC m=+0.078438834 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, vcs-type=git) Nov 28 04:03:33 localhost podman[102765]: 2025-11-28 09:03:33.045833254 +0000 UTC m=+0.284478953 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, 
com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 04:03:33 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 04:03:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26990 DF PROTO=TCP SPT=39260 DPT=9105 SEQ=1654414081 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02C1820000000001030307) Nov 28 04:03:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32506 DF PROTO=TCP SPT=47858 DPT=9102 SEQ=1229046501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02C1C30000000001030307) Nov 28 04:03:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60512 DF PROTO=TCP SPT=49524 DPT=9101 SEQ=3725412370 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02C8830000000001030307) Nov 28 04:03:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29953 DF PROTO=TCP SPT=49902 DPT=9100 SEQ=4105540862 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02C9500000000001030307) Nov 28 04:03:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32507 DF PROTO=TCP SPT=47858 DPT=9102 SEQ=1229046501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02C9C30000000001030307) Nov 28 04:03:36 localhost kernel: DROPPING: IN=br-ex OUT= 
MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29954 DF PROTO=TCP SPT=49902 DPT=9100 SEQ=4105540862 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02CD430000000001030307) Nov 28 04:03:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 04:03:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 04:03:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 04:03:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 04:03:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 04:03:37 localhost systemd[1]: tmp-crun.qXqCU2.mount: Deactivated successfully. 
Nov 28 04:03:37 localhost podman[102795]: 2025-11-28 09:03:37.851135133 +0000 UTC m=+0.082204870 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 04:03:37 localhost systemd[1]: tmp-crun.ztu0md.mount: Deactivated successfully. Nov 28 04:03:37 localhost podman[102794]: 2025-11-28 09:03:37.882534486 +0000 UTC m=+0.114704488 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, config_id=tripleo_step4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc.) 
Nov 28 04:03:37 localhost podman[102794]: 2025-11-28 09:03:37.921571837 +0000 UTC m=+0.153741869 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible) Nov 28 04:03:37 localhost podman[102794]: unhealthy Nov 28 04:03:37 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:03:37 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Failed with result 'exit-code'. 
Nov 28 04:03:37 localhost podman[102809]: 2025-11-28 09:03:37.89457738 +0000 UTC m=+0.110372144 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044) Nov 28 04:03:37 localhost podman[102796]: 2025-11-28 09:03:37.922569558 +0000 UTC m=+0.148271208 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=logrotate_crond, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc.) 
Nov 28 04:03:38 localhost podman[102796]: 2025-11-28 09:03:38.000886567 +0000 UTC m=+0.226588247 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, distribution-scope=public, batch=17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_step4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, summary=Red Hat 
OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, build-date=2025-11-18T22:49:32Z) Nov 28 04:03:38 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 04:03:38 localhost podman[102809]: 2025-11-28 09:03:38.023628052 +0000 UTC m=+0.239422876 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 
'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi) Nov 28 04:03:38 localhost podman[102809]: unhealthy Nov 28 04:03:38 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:03:38 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Failed with result 'exit-code'. 
Nov 28 04:03:38 localhost podman[102797]: 2025-11-28 09:03:38.080510285 +0000 UTC m=+0.302012736 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 04:03:38 localhost podman[102797]: 2025-11-28 09:03:38.098515513 +0000 UTC m=+0.320017974 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 
'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_id=tripleo_step5, version=17.1.12, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-nova-compute-container) Nov 28 04:03:38 localhost podman[102797]: unhealthy Nov 28 04:03:38 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:03:38 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed with result 'exit-code'. Nov 28 04:03:38 localhost podman[102795]: 2025-11-28 09:03:38.228114322 +0000 UTC m=+0.459184029 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64) Nov 28 04:03:38 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. 
Nov 28 04:03:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29955 DF PROTO=TCP SPT=49902 DPT=9100 SEQ=4105540862 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02D5420000000001030307) Nov 28 04:03:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32508 DF PROTO=TCP SPT=47858 DPT=9102 SEQ=1229046501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02D9820000000001030307) Nov 28 04:03:39 localhost sshd[102893]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:03:40 localhost systemd-logind[764]: New session 36 of user zuul. Nov 28 04:03:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5554 DF PROTO=TCP SPT=57248 DPT=9882 SEQ=2974427963 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02DBD60000000001030307) Nov 28 04:03:40 localhost systemd[1]: Started Session 36 of User zuul. 
Nov 28 04:03:40 localhost python3.9[102988]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:03:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5555 DF PROTO=TCP SPT=57248 DPT=9882 SEQ=2974427963 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02DFC20000000001030307) Nov 28 04:03:41 localhost python3.9[103082]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf'); print(p['DEFAULT']['host'])"#012 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:03:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29956 DF PROTO=TCP SPT=49902 DPT=9100 SEQ=4105540862 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02E5030000000001030307) Nov 28 04:03:42 localhost python3.9[103205]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:03:43 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5556 DF PROTO=TCP SPT=57248 DPT=9882 SEQ=2974427963 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080ACB02E7C20000000001030307) Nov 28 04:03:43 localhost python3.9[103330]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf'); print(p['DEFAULT']['host'])"#012 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:03:43 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60513 DF PROTO=TCP SPT=49524 DPT=9101 SEQ=3725412370 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02E9820000000001030307) Nov 28 04:03:43 localhost python3.9[103438]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:03:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 04:03:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 04:03:44 localhost python3.9[103529]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline Nov 28 04:03:44 localhost podman[103531]: 2025-11-28 09:03:44.839621287 +0000 UTC m=+0.072280322 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 28 04:03:44 localhost systemd[1]: tmp-crun.ZnnGSl.mount: Deactivated successfully. Nov 28 04:03:44 localhost podman[103530]: 2025-11-28 09:03:44.858569495 +0000 UTC m=+0.090062105 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, maintainer=OpenStack TripleO Team, distribution-scope=public) Nov 28 04:03:44 localhost podman[103531]: 2025-11-28 09:03:44.8887471 +0000 UTC m=+0.121406165 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=ovn_controller, 
architecture=x86_64, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1) Nov 28 04:03:44 localhost podman[103531]: unhealthy Nov 28 04:03:44 localhost podman[103530]: 2025-11-28 09:03:44.899685609 +0000 UTC m=+0.131178239 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn) Nov 28 04:03:44 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:03:44 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. Nov 28 04:03:44 localhost podman[103530]: unhealthy Nov 28 04:03:44 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:03:44 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. 
Nov 28 04:03:46 localhost python3.9[103658]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:03:46 localhost python3.9[103750]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile Nov 28 04:03:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5557 DF PROTO=TCP SPT=57248 DPT=9882 SEQ=2974427963 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02F7820000000001030307) Nov 28 04:03:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32509 DF PROTO=TCP SPT=47858 DPT=9102 SEQ=1229046501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02F9820000000001030307) Nov 28 04:03:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5398 DF PROTO=TCP SPT=47412 DPT=9105 SEQ=2833108021 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02FB1A0000000001030307) Nov 28 04:03:48 localhost python3.9[103840]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 28 04:03:48 localhost python3.9[103888]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False 
validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 28 04:03:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5399 DF PROTO=TCP SPT=47412 DPT=9105 SEQ=2833108021 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB02FF430000000001030307) Nov 28 04:03:49 localhost systemd-logind[764]: Session 36 logged out. Waiting for processes to exit. Nov 28 04:03:49 localhost systemd[1]: session-36.scope: Deactivated successfully. Nov 28 04:03:49 localhost systemd[1]: session-36.scope: Consumed 5.017s CPU time. Nov 28 04:03:49 localhost systemd-logind[764]: Removed session 36. Nov 28 04:03:51 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5400 DF PROTO=TCP SPT=47412 DPT=9105 SEQ=2833108021 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0307420000000001030307) Nov 28 04:03:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5401 DF PROTO=TCP SPT=47412 DPT=9105 SEQ=2833108021 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0317020000000001030307) Nov 28 04:03:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40253 DF PROTO=TCP SPT=38946 DPT=9101 SEQ=3847402935 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0322090000000001030307) Nov 28 04:04:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40255 DF PROTO=TCP SPT=38946 DPT=9101 SEQ=3847402935 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB032E030000000001030307) Nov 28 04:04:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 04:04:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 04:04:01 localhost systemd[1]: tmp-crun.Ho0RdK.mount: Deactivated successfully. Nov 28 04:04:01 localhost podman[103905]: 2025-11-28 09:04:01.867468475 +0000 UTC m=+0.096903755 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, distribution-scope=public, 
vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12) Nov 28 04:04:01 localhost podman[103905]: 2025-11-28 09:04:01.881311124 +0000 UTC m=+0.110746424 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, distribution-scope=public, build-date=2025-11-18T22:51:28Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, release=1761123044, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 04:04:01 localhost podman[103904]: 2025-11-28 09:04:01.910134687 +0000 UTC m=+0.140046333 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=iscsid, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, build-date=2025-11-18T23:44:13Z, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Nov 28 04:04:01 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 04:04:01 localhost podman[103904]: 2025-11-28 09:04:01.948365243 +0000 UTC m=+0.178276849 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, version=17.1.12, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, container_name=iscsid, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible) Nov 28 04:04:01 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 04:04:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7653 DF PROTO=TCP SPT=51962 DPT=9102 SEQ=3559992270 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0337020000000001030307) Nov 28 04:04:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 04:04:03 localhost podman[103942]: 2025-11-28 09:04:03.830094031 +0000 UTC m=+0.071613502 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, container_name=metrics_qdr, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, build-date=2025-11-18T22:49:46Z) Nov 28 04:04:04 localhost podman[103942]: 2025-11-28 09:04:04.034310843 +0000 UTC m=+0.275830224 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, container_name=metrics_qdr, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., 
version=17.1.12, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true) Nov 28 04:04:04 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 04:04:04 localhost sshd[103971]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:04:04 localhost systemd-logind[764]: New session 37 of user zuul. 
Nov 28 04:04:04 localhost systemd[1]: Started Session 37 of User zuul. Nov 28 04:04:05 localhost python3.9[104066]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 28 04:04:05 localhost systemd[1]: Reloading. Nov 28 04:04:05 localhost systemd-rc-local-generator[104092]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:04:05 localhost systemd-sysv-generator[104097]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:04:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:04:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1354 DF PROTO=TCP SPT=60910 DPT=9100 SEQ=1665886645 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0342820000000001030307) Nov 28 04:04:07 localhost python3.9[104193]: ansible-ansible.builtin.service_facts Invoked Nov 28 04:04:07 localhost network[104210]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 28 04:04:07 localhost network[104211]: 'network-scripts' will be removed from distribution in near future. Nov 28 04:04:07 localhost network[104212]: It is advised to switch to 'NetworkManager' instead for network management. Nov 28 04:04:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. 
Nov 28 04:04:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 04:04:08 localhost podman[104234]: 2025-11-28 09:04:08.059241965 +0000 UTC m=+0.092530390 container health_status 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Nov 28 04:04:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. 
Nov 28 04:04:08 localhost podman[104234]: 2025-11-28 09:04:08.103033463 +0000 UTC m=+0.136321908 container exec_died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, io.buildah.version=1.41.4, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4) Nov 28 04:04:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 04:04:08 localhost systemd[1]: tmp-crun.t1xnbp.mount: Deactivated successfully. Nov 28 04:04:08 localhost podman[104234]: unhealthy Nov 28 04:04:08 localhost podman[104251]: 2025-11-28 09:04:08.171369862 +0000 UTC m=+0.121457907 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, distribution-scope=public, config_id=tripleo_step4, io.buildah.version=1.41.4, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=) Nov 28 04:04:08 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:04:08 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Failed with result 'exit-code'. 
Nov 28 04:04:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 04:04:08 localhost podman[104286]: 2025-11-28 09:04:08.264087678 +0000 UTC m=+0.124146651 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, version=17.1.12, architecture=x86_64, vcs-type=git, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z) Nov 28 04:04:08 localhost podman[104269]: 2025-11-28 09:04:08.225064638 +0000 UTC m=+0.143406169 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, 
tcib_managed=true, url=https://www.redhat.com) Nov 28 04:04:08 localhost podman[104251]: 2025-11-28 09:04:08.283069416 +0000 UTC m=+0.233157451 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, release=1761123044, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, batch=17.1_20251118.1, name=rhosp17/openstack-cron, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
config_id=tripleo_step4, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 04:04:08 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 04:04:08 localhost podman[104269]: 2025-11-28 09:04:08.308155484 +0000 UTC m=+0.226496955 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team) Nov 28 04:04:08 localhost podman[104269]: unhealthy Nov 28 04:04:08 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:04:08 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Failed with result 'exit-code'. 
Nov 28 04:04:08 localhost podman[104286]: 2025-11-28 09:04:08.359539257 +0000 UTC m=+0.219598260 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, com.redhat.component=openstack-nova-compute-container) Nov 28 04:04:08 localhost podman[104286]: unhealthy Nov 28 04:04:08 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:04:08 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:04:08 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed with result 'exit-code'. 
Nov 28 04:04:08 localhost podman[104320]: 2025-11-28 09:04:08.454719059 +0000 UTC m=+0.185265967 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, release=1761123044, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, 
container_name=nova_migration_target, io.openshift.expose-services=, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64) Nov 28 04:04:08 localhost podman[104320]: 2025-11-28 09:04:08.849407736 +0000 UTC m=+0.579954604 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, release=1761123044, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 04:04:08 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. 
Nov 28 04:04:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7655 DF PROTO=TCP SPT=51962 DPT=9102 SEQ=3559992270 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB034EC20000000001030307) Nov 28 04:04:11 localhost python3.9[104510]: ansible-ansible.builtin.service_facts Invoked Nov 28 04:04:11 localhost network[104527]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 28 04:04:11 localhost network[104528]: 'network-scripts' will be removed from distribution in near future. Nov 28 04:04:11 localhost network[104529]: It is advised to switch to 'NetworkManager' instead for network management. Nov 28 04:04:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1356 DF PROTO=TCP SPT=60910 DPT=9100 SEQ=1665886645 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB035A420000000001030307) Nov 28 04:04:13 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:04:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 04:04:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 04:04:15 localhost systemd[1]: tmp-crun.vnfl5d.mount: Deactivated successfully. 
Nov 28 04:04:15 localhost podman[104728]: 2025-11-28 09:04:15.237099152 +0000 UTC m=+0.101224010 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, tcib_managed=true, batch=17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64) Nov 28 04:04:15 localhost podman[104728]: 2025-11-28 09:04:15.252364045 +0000 UTC m=+0.116488913 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, architecture=x86_64, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, vcs-type=git, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible) Nov 28 04:04:15 localhost podman[104728]: unhealthy Nov 28 04:04:15 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:04:15 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 04:04:15 localhost podman[104729]: 2025-11-28 09:04:15.325238454 +0000 UTC m=+0.182115928 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64) Nov 28 04:04:15 localhost podman[104729]: 2025-11-28 09:04:15.343613294 +0000 UTC m=+0.200490768 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, 
vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044) Nov 28 04:04:15 localhost podman[104729]: unhealthy Nov 28 04:04:15 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:04:15 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. Nov 28 04:04:15 localhost python3.9[104730]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:04:15 localhost systemd[1]: Reloading. Nov 28 04:04:15 localhost systemd-sysv-generator[104799]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:04:15 localhost systemd-rc-local-generator[104795]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 28 04:04:15 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:04:15 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 28 04:04:15 localhost systemd[1]: Stopping ceilometer_agent_compute container... Nov 28 04:04:15 localhost recover_tripleo_nova_virtqemud[104812]: 61397 Nov 28 04:04:15 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 28 04:04:15 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 28 04:04:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4591 DF PROTO=TCP SPT=36590 DPT=9882 SEQ=2705474875 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB036CC20000000001030307) Nov 28 04:04:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17534 DF PROTO=TCP SPT=51352 DPT=9105 SEQ=270335686 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0374420000000001030307) Nov 28 04:04:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17535 DF PROTO=TCP SPT=51352 DPT=9105 SEQ=270335686 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB037C420000000001030307) Nov 28 04:04:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17536 DF PROTO=TCP SPT=51352 DPT=9105 SEQ=270335686 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB038C020000000001030307) Nov 28 04:04:28 
localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20513 DF PROTO=TCP SPT=37358 DPT=9101 SEQ=38800111 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0397390000000001030307) Nov 28 04:04:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20515 DF PROTO=TCP SPT=37358 DPT=9101 SEQ=38800111 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB03A3420000000001030307) Nov 28 04:04:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 04:04:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 04:04:32 localhost podman[104828]: 2025-11-28 09:04:32.108793136 +0000 UTC m=+0.090414944 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_id=tripleo_step3, container_name=collectd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true, io.buildah.version=1.41.4) Nov 28 04:04:32 localhost podman[104827]: 2025-11-28 09:04:32.15307166 +0000 UTC m=+0.134687838 container health_status 
08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., release=1761123044, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container) Nov 28 04:04:32 localhost podman[104827]: 2025-11-28 09:04:32.161592974 +0000 UTC m=+0.143209102 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.12, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 04:04:32 localhost podman[104828]: 2025-11-28 09:04:32.175532166 +0000 UTC m=+0.157154034 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, com.redhat.component=openstack-collectd-container, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, 
architecture=x86_64, version=17.1.12, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, batch=17.1_20251118.1) Nov 28 04:04:32 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. 
Nov 28 04:04:32 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 04:04:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17537 DF PROTO=TCP SPT=51352 DPT=9105 SEQ=270335686 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB03AB820000000001030307) Nov 28 04:04:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 04:04:34 localhost systemd[1]: tmp-crun.VB3gP4.mount: Deactivated successfully. Nov 28 04:04:34 localhost podman[104867]: 2025-11-28 09:04:34.459523396 +0000 UTC m=+0.092306503 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr) Nov 28 04:04:34 localhost podman[104867]: 2025-11-28 09:04:34.654296186 +0000 UTC m=+0.287079213 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, vcs-type=git, batch=17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, 
build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, release=1761123044, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public) Nov 28 
04:04:34 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 04:04:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32511 DF PROTO=TCP SPT=47858 DPT=9102 SEQ=1229046501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB03B7830000000001030307) Nov 28 04:04:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 04:04:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 04:04:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 04:04:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. 
Nov 28 04:04:38 localhost podman[104898]: 2025-11-28 09:04:38.603612414 +0000 UTC m=+0.085331647 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, version=17.1.12, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., vcs-type=git, release=1761123044, name=rhosp17/openstack-nova-compute) Nov 28 04:04:38 localhost podman[104896]: Error: container 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 is not running Nov 28 04:04:38 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Main process exited, code=exited, status=125/n/a Nov 28 04:04:38 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Failed with result 'exit-code'. Nov 28 04:04:38 localhost systemd[1]: tmp-crun.lQDgVk.mount: Deactivated successfully. 
Nov 28 04:04:38 localhost podman[104897]: 2025-11-28 09:04:38.716801314 +0000 UTC m=+0.202505760 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, vcs-type=git, version=17.1.12, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-cron-container) Nov 28 04:04:38 localhost podman[104898]: 2025-11-28 09:04:38.728770184 +0000 UTC m=+0.210489397 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, container_name=nova_compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 04:04:38 localhost podman[104898]: unhealthy Nov 28 04:04:38 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Main 
process exited, code=exited, status=1/FAILURE Nov 28 04:04:38 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed with result 'exit-code'. Nov 28 04:04:38 localhost podman[104897]: 2025-11-28 09:04:38.749312982 +0000 UTC m=+0.235017388 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, io.openshift.expose-services=, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, tcib_managed=true) Nov 28 04:04:38 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 04:04:38 localhost podman[104904]: 2025-11-28 09:04:38.814242655 +0000 UTC m=+0.291956073 container health_status f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, batch=17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, config_id=tripleo_step4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi) Nov 28 04:04:38 localhost podman[104904]: 2025-11-28 09:04:38.853477942 +0000 UTC m=+0.331191380 container exec_died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
managed_by=tripleo_ansible, release=1761123044, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, 
vcs-type=git, architecture=x86_64, config_id=tripleo_step4) Nov 28 04:04:38 localhost podman[104904]: unhealthy Nov 28 04:04:38 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:04:38 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Failed with result 'exit-code'. Nov 28 04:04:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 04:04:38 localhost podman[104968]: 2025-11-28 09:04:38.973316268 +0000 UTC m=+0.083333125 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.41.4, 
release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Nov 28 04:04:39 localhost podman[104968]: 2025-11-28 09:04:39.335819588 +0000 UTC m=+0.445836435 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, version=17.1.12, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Nov 28 04:04:39 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. 
Nov 28 04:04:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29959 DF PROTO=TCP SPT=49902 DPT=9100 SEQ=4105540862 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB03C3820000000001030307) Nov 28 04:04:39 localhost systemd[1]: tmp-crun.jGx6ku.mount: Deactivated successfully. Nov 28 04:04:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17093 DF PROTO=TCP SPT=47480 DPT=9100 SEQ=2077246492 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB03CF820000000001030307) Nov 28 04:04:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 04:04:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 04:04:45 localhost systemd[1]: tmp-crun.gB2nSj.mount: Deactivated successfully. 
Nov 28 04:04:45 localhost podman[105068]: 2025-11-28 09:04:45.463446879 +0000 UTC m=+0.086983368 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
batch=17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.) Nov 28 04:04:45 localhost podman[105067]: 2025-11-28 09:04:45.483522452 +0000 UTC m=+0.106387590 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public) Nov 28 04:04:45 localhost podman[105068]: 2025-11-28 09:04:45.512596813 +0000 UTC m=+0.136133282 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, name=rhosp17/openstack-ovn-controller, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., tcib_managed=true, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 04:04:45 localhost podman[105068]: unhealthy Nov 28 04:04:45 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:04:45 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. 
Nov 28 04:04:45 localhost podman[105067]: 2025-11-28 09:04:45.526529985 +0000 UTC m=+0.149395103 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true) Nov 28 04:04:45 localhost podman[105067]: unhealthy Nov 28 04:04:45 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:04:45 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. 
Nov 28 04:04:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3368 DF PROTO=TCP SPT=57916 DPT=9882 SEQ=1245516862 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB03E2020000000001030307) Nov 28 04:04:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44877 DF PROTO=TCP SPT=35524 DPT=9105 SEQ=2507938630 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB03E9820000000001030307) Nov 28 04:04:51 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44878 DF PROTO=TCP SPT=35524 DPT=9105 SEQ=2507938630 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB03F1830000000001030307) Nov 28 04:04:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44879 DF PROTO=TCP SPT=35524 DPT=9105 SEQ=2507938630 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0401420000000001030307) Nov 28 04:04:58 localhost podman[104814]: time="2025-11-28T09:04:58Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_compute in 42 seconds, resorting to SIGKILL" Nov 28 04:04:58 localhost systemd[1]: libpod-4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.scope: Deactivated successfully. Nov 28 04:04:58 localhost systemd[1]: libpod-4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.scope: Consumed 6.001s CPU time. 
Nov 28 04:04:58 localhost podman[104814]: 2025-11-28 09:04:58.027835346 +0000 UTC m=+42.083970291 container stop 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com) Nov 28 04:04:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31348 DF PROTO=TCP SPT=41632 DPT=9101 SEQ=2572632617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB040C6A0000000001030307) Nov 28 04:04:58 localhost podman[104814]: 2025-11-28 09:04:58.064284946 +0000 UTC m=+42.120419891 container died 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=) Nov 28 04:04:58 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.timer: Deactivated successfully. 
Nov 28 04:04:58 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5. Nov 28 04:04:58 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Failed to open /run/systemd/transient/4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: No such file or directory Nov 28 04:04:58 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5-userdata-shm.mount: Deactivated successfully. Nov 28 04:04:58 localhost podman[104814]: 2025-11-28 09:04:58.124273727 +0000 UTC m=+42.180408632 container cleanup 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, batch=17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 04:04:58 localhost podman[104814]: ceilometer_agent_compute Nov 28 04:04:58 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.timer: Failed to open /run/systemd/transient/4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.timer: No such file or directory Nov 28 04:04:58 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Failed to open /run/systemd/transient/4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: No such file or directory Nov 28 04:04:58 localhost podman[105110]: 2025-11-28 09:04:58.177056904 +0000 UTC m=+0.126910307 container cleanup 
4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, version=17.1.12) Nov 28 04:04:58 localhost systemd[1]: libpod-conmon-4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.scope: Deactivated successfully. Nov 28 04:04:58 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.timer: Failed to open /run/systemd/transient/4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.timer: No such file or directory Nov 28 04:04:58 localhost systemd[1]: 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: Failed to open /run/systemd/transient/4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5.service: No such file or directory Nov 28 04:04:58 localhost podman[105128]: 2025-11-28 09:04:58.287824318 +0000 UTC m=+0.070391154 container cleanup 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step4, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ceilometer-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, version=17.1.12, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute) Nov 28 04:04:58 localhost podman[105128]: 
ceilometer_agent_compute Nov 28 04:04:58 localhost systemd[1]: tripleo_ceilometer_agent_compute.service: Deactivated successfully. Nov 28 04:04:58 localhost systemd[1]: Stopped ceilometer_agent_compute container. Nov 28 04:04:58 localhost systemd[1]: tripleo_ceilometer_agent_compute.service: Consumed 1.098s CPU time, no IO. Nov 28 04:04:59 localhost systemd[1]: var-lib-containers-storage-overlay-9f3450f8dbd8f52854977f74bd961373e3aeac1471ae57db291ae89b64fa40dd-merged.mount: Deactivated successfully. Nov 28 04:04:59 localhost python3.9[105232]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_ipmi.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:04:59 localhost systemd[1]: Reloading. Nov 28 04:04:59 localhost systemd-rc-local-generator[105255]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:04:59 localhost systemd-sysv-generator[105262]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:04:59 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:04:59 localhost systemd[1]: Stopping ceilometer_agent_ipmi container... Nov 28 04:05:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31350 DF PROTO=TCP SPT=41632 DPT=9101 SEQ=2572632617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0418820000000001030307) Nov 28 04:05:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. 
Nov 28 04:05:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 04:05:02 localhost systemd[1]: tmp-crun.Ro8dQo.mount: Deactivated successfully. Nov 28 04:05:02 localhost podman[105289]: 2025-11-28 09:05:02.356190896 +0000 UTC m=+0.090388673 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, version=17.1.12, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Nov 28 04:05:02 localhost podman[105288]: 2025-11-28 09:05:02.40630592 +0000 UTC m=+0.142452937 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1) Nov 28 04:05:02 localhost podman[105288]: 2025-11-28 09:05:02.414857655 +0000 UTC m=+0.151004723 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, tcib_managed=true, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.expose-services=, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, 
managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, maintainer=OpenStack TripleO Team) Nov 28 04:05:02 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 04:05:02 localhost podman[105289]: 2025-11-28 09:05:02.468279283 +0000 UTC m=+0.202477030 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, container_name=collectd, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vcs-type=git, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, distribution-scope=public) Nov 28 04:05:02 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. Nov 28 04:05:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48331 DF PROTO=TCP SPT=33600 DPT=9102 SEQ=2680091707 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0421420000000001030307) Nov 28 04:05:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 04:05:04 localhost systemd[1]: tmp-crun.fdLolF.mount: Deactivated successfully. 
Nov 28 04:05:04 localhost podman[105325]: 2025-11-28 09:05:04.862035846 +0000 UTC m=+0.095198103 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, vcs-type=git) Nov 28 04:05:05 localhost podman[105325]: 2025-11-28 09:05:05.081553942 +0000 UTC m=+0.314716209 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, release=1761123044) Nov 28 04:05:05 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. Nov 28 04:05:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21990 DF PROTO=TCP SPT=56088 DPT=9100 SEQ=1367658504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB042D020000000001030307) Nov 28 04:05:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. 
Nov 28 04:05:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 04:05:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 04:05:09 localhost systemd[1]: tmp-crun.IPytoW.mount: Deactivated successfully. Nov 28 04:05:09 localhost podman[105355]: 2025-11-28 09:05:09.109871641 +0000 UTC m=+0.088378491 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, build-date=2025-11-19T00:36:58Z, architecture=x86_64, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 04:05:09 localhost podman[105355]: 2025-11-28 09:05:09.134379161 +0000 UTC m=+0.112886051 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1761123044, version=17.1.12, 
config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, batch=17.1_20251118.1, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, vcs-type=git, tcib_managed=true, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 04:05:09 localhost systemd[1]: tmp-crun.Dk2P6H.mount: Deactivated successfully. 
Nov 28 04:05:09 localhost podman[105354]: 2025-11-28 09:05:09.164487955 +0000 UTC m=+0.144970497 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, release=1761123044, architecture=x86_64, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron) Nov 28 04:05:09 localhost podman[105356]: Error: container f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b is not running Nov 28 04:05:09 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Main process exited, code=exited, status=125/n/a Nov 28 04:05:09 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Failed with result 'exit-code'. Nov 28 04:05:09 localhost podman[105354]: 2025-11-28 09:05:09.200870433 +0000 UTC m=+0.181352945 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 
'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-type=git, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, tcib_managed=true) Nov 28 04:05:09 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. 
Nov 28 04:05:09 localhost podman[105355]: unhealthy Nov 28 04:05:09 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:05:09 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed with result 'exit-code'. Nov 28 04:05:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48333 DF PROTO=TCP SPT=33600 DPT=9102 SEQ=2680091707 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0439020000000001030307) Nov 28 04:05:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 04:05:09 localhost podman[105410]: 2025-11-28 09:05:09.840900778 +0000 UTC m=+0.076545584 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.12, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, vendor=Red Hat, Inc.) 
Nov 28 04:05:10 localhost podman[105410]: 2025-11-28 09:05:10.252383567 +0000 UTC m=+0.488028373 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, release=1761123044, container_name=nova_migration_target, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 04:05:10 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. Nov 28 04:05:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21992 DF PROTO=TCP SPT=56088 DPT=9100 SEQ=1367658504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0444C20000000001030307) Nov 28 04:05:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 04:05:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 04:05:15 localhost podman[105433]: 2025-11-28 09:05:15.845629879 +0000 UTC m=+0.082724547 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 28 04:05:15 localhost podman[105433]: 2025-11-28 09:05:15.889530369 +0000 UTC m=+0.126624977 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.12, config_id=tripleo_step4, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 04:05:15 localhost podman[105433]: unhealthy Nov 28 04:05:15 localhost systemd[1]: tmp-crun.A9Xm4C.mount: Deactivated successfully. Nov 28 04:05:15 localhost podman[105434]: 2025-11-28 09:05:15.900353505 +0000 UTC m=+0.133926984 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller) Nov 28 04:05:15 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:05:15 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 04:05:15 localhost podman[105434]: 2025-11-28 09:05:15.944387211 +0000 UTC m=+0.177960700 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-ovn-controller, version=17.1.12, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 28 04:05:15 localhost podman[105434]: unhealthy Nov 28 04:05:15 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:05:15 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. 
Nov 28 04:05:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20651 DF PROTO=TCP SPT=44250 DPT=9882 SEQ=949298037 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0457420000000001030307) Nov 28 04:05:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40133 DF PROTO=TCP SPT=49662 DPT=9105 SEQ=1209768136 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB045AAB0000000001030307) Nov 28 04:05:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40135 DF PROTO=TCP SPT=49662 DPT=9105 SEQ=1209768136 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0466C20000000001030307) Nov 28 04:05:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40136 DF PROTO=TCP SPT=49662 DPT=9105 SEQ=1209768136 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0476820000000001030307) Nov 28 04:05:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53947 DF PROTO=TCP SPT=44546 DPT=9101 SEQ=2264730479 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB04819C0000000001030307) Nov 28 04:05:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53949 DF PROTO=TCP SPT=44546 DPT=9101 SEQ=2264730479 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080ACB048DC30000000001030307) Nov 28 04:05:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 04:05:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. Nov 28 04:05:32 localhost systemd[1]: tmp-crun.vW94hE.mount: Deactivated successfully. Nov 28 04:05:32 localhost podman[105472]: 2025-11-28 09:05:32.600291064 +0000 UTC m=+0.081983123 container health_status 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, tcib_managed=true) Nov 28 04:05:32 localhost podman[105472]: 2025-11-28 09:05:32.637798687 +0000 UTC m=+0.119490736 container exec_died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 
17.1 iscsid, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Nov 28 04:05:32 localhost podman[105473]: 2025-11-28 09:05:32.652256985 +0000 UTC m=+0.129841327 container health_status 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, 
name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Nov 28 04:05:32 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Deactivated successfully. Nov 28 04:05:32 localhost podman[105473]: 2025-11-28 09:05:32.688353575 +0000 UTC m=+0.165937877 container exec_died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Nov 28 04:05:32 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Deactivated successfully. 
Nov 28 04:05:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54987 DF PROTO=TCP SPT=42326 DPT=9102 SEQ=2920543412 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0496820000000001030307) Nov 28 04:05:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. Nov 28 04:05:35 localhost podman[105511]: 2025-11-28 09:05:35.343741131 +0000 UTC m=+0.085018487 container health_status 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, container_name=metrics_qdr, io.buildah.version=1.41.4, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd) Nov 28 04:05:35 localhost podman[105511]: 2025-11-28 09:05:35.559499361 +0000 UTC m=+0.300776737 container exec_died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, vcs-type=git, release=1761123044, 
distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Nov 28 04:05:35 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Deactivated successfully. 
Nov 28 04:05:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19474 DF PROTO=TCP SPT=47204 DPT=9102 SEQ=200258602 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB04A1830000000001030307) Nov 28 04:05:37 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 28 04:05:37 localhost recover_tripleo_nova_virtqemud[105541]: 61397 Nov 28 04:05:37 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 28 04:05:37 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 28 04:05:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17096 DF PROTO=TCP SPT=47480 DPT=9100 SEQ=2077246492 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB04AD820000000001030307) Nov 28 04:05:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 04:05:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 04:05:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 04:05:39 localhost systemd[1]: tmp-crun.UNz2Wb.mount: Deactivated successfully. Nov 28 04:05:39 localhost podman[105544]: Error: container f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b is not running Nov 28 04:05:39 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Main process exited, code=exited, status=125/n/a Nov 28 04:05:39 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Failed with result 'exit-code'. 
Nov 28 04:05:39 localhost podman[105542]: 2025-11-28 09:05:39.853114824 +0000 UTC m=+0.084173621 container health_status bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, 
com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044) Nov 28 04:05:39 localhost podman[105543]: 2025-11-28 09:05:39.918694487 +0000 UTC m=+0.147096543 container health_status c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, batch=17.1_20251118.1, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 
'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, url=https://www.redhat.com) Nov 28 04:05:39 localhost podman[105542]: 2025-11-28 09:05:39.936340514 +0000 UTC m=+0.167399311 container exec_died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 
(image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, name=rhosp17/openstack-cron, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, container_name=logrotate_crond, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Nov 28 04:05:39 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Deactivated successfully. Nov 28 04:05:39 localhost podman[105543]: 2025-11-28 09:05:39.967423887 +0000 UTC m=+0.195825933 container exec_died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, tcib_managed=true, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, vcs-type=git, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, version=17.1.12) Nov 28 04:05:39 localhost podman[105543]: unhealthy Nov 28 04:05:39 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:05:39 localhost 
systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed with result 'exit-code'. Nov 28 04:05:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 04:05:40 localhost podman[105596]: 2025-11-28 09:05:40.848218069 +0000 UTC m=+0.086485683 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 04:05:41 localhost podman[105596]: 2025-11-28 09:05:41.24845927 +0000 UTC m=+0.486726834 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute) Nov 28 04:05:41 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. 
Nov 28 04:05:41 localhost podman[105273]: time="2025-11-28T09:05:41Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_ipmi in 42 seconds, resorting to SIGKILL" Nov 28 04:05:41 localhost systemd[1]: libpod-f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.scope: Deactivated successfully. Nov 28 04:05:41 localhost systemd[1]: libpod-f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.scope: Consumed 6.086s CPU time. Nov 28 04:05:41 localhost podman[105273]: 2025-11-28 09:05:41.621836307 +0000 UTC m=+42.092958188 container died f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.openshift.expose-services=) Nov 28 04:05:41 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.timer: Deactivated successfully. Nov 28 04:05:41 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b. Nov 28 04:05:41 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Failed to open /run/systemd/transient/f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: No such file or directory Nov 28 04:05:41 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b-userdata-shm.mount: Deactivated successfully. 
Nov 28 04:05:41 localhost podman[105273]: 2025-11-28 09:05:41.671712454 +0000 UTC m=+42.142834315 container cleanup f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, url=https://www.redhat.com, 
version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z) Nov 28 04:05:41 localhost podman[105273]: ceilometer_agent_ipmi Nov 28 04:05:41 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.timer: Failed to open /run/systemd/transient/f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.timer: No such file or directory Nov 28 04:05:41 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Failed to open /run/systemd/transient/f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: No such file or directory Nov 28 04:05:41 localhost podman[105620]: 2025-11-28 09:05:41.709629909 +0000 UTC m=+0.078262478 container cleanup f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12) Nov 28 04:05:41 localhost systemd[1]: libpod-conmon-f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.scope: Deactivated successfully. 
Nov 28 04:05:41 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.timer: Failed to open /run/systemd/transient/f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.timer: No such file or directory Nov 28 04:05:41 localhost systemd[1]: f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: Failed to open /run/systemd/transient/f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b.service: No such file or directory Nov 28 04:05:41 localhost podman[105633]: 2025-11-28 09:05:41.810461535 +0000 UTC m=+0.068831194 container cleanup f53eaaed929807ccb3f0046ad451b4ad9a1c76a10b36ac3ed6be8fc58311ac8b (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, vcs-type=git, build-date=2025-11-19T00:12:45Z, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Nov 28 04:05:41 localhost podman[105633]: ceilometer_agent_ipmi Nov 28 04:05:41 localhost systemd[1]: tripleo_ceilometer_agent_ipmi.service: Deactivated successfully. Nov 28 04:05:41 localhost systemd[1]: Stopped ceilometer_agent_ipmi container. Nov 28 04:05:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54619 DF PROTO=TCP SPT=49072 DPT=9100 SEQ=2495149367 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB04B9C20000000001030307) Nov 28 04:05:42 localhost systemd[1]: var-lib-containers-storage-overlay-b9124600c137d24812fa12ae9a3723386ce2301b24a72f8d26d11f6206571e1d-merged.mount: Deactivated successfully. 
Nov 28 04:05:42 localhost python3.9[105737]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_collectd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:05:42 localhost systemd[1]: Reloading. Nov 28 04:05:42 localhost systemd-rc-local-generator[105766]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:05:42 localhost systemd-sysv-generator[105770]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:05:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:05:43 localhost systemd[1]: Stopping collectd container... Nov 28 04:05:43 localhost systemd[1]: tmp-crun.UVRp2z.mount: Deactivated successfully. Nov 28 04:05:43 localhost systemd[1]: libpod-2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.scope: Deactivated successfully. Nov 28 04:05:43 localhost systemd[1]: libpod-2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.scope: Consumed 2.099s CPU time. 
Nov 28 04:05:43 localhost podman[105778]: 2025-11-28 09:05:43.20446602 +0000 UTC m=+0.162030055 container died 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Nov 28 04:05:43 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.timer: Deactivated successfully. Nov 28 04:05:43 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c. 
Nov 28 04:05:43 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Failed to open /run/systemd/transient/2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: No such file or directory Nov 28 04:05:43 localhost podman[105778]: 2025-11-28 09:05:43.292946733 +0000 UTC m=+0.250510718 container cleanup 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, architecture=x86_64, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, batch=17.1_20251118.1, config_id=tripleo_step3, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, container_name=collectd, release=1761123044, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 
'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Nov 28 04:05:43 localhost podman[105778]: collectd Nov 28 04:05:43 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.timer: Failed to open /run/systemd/transient/2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.timer: No such file or directory Nov 28 04:05:43 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Failed to open /run/systemd/transient/2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: No such file or directory Nov 28 04:05:43 localhost podman[105792]: 2025-11-28 09:05:43.314394118 +0000 UTC m=+0.096141951 container cleanup 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, container_name=collectd, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, config_id=tripleo_step3) Nov 28 04:05:43 localhost systemd[1]: tripleo_collectd.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:05:43 localhost systemd[1]: libpod-conmon-2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.scope: Deactivated successfully. Nov 28 04:05:43 localhost podman[105818]: error opening file `/run/crun/2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c/status`: No such file or directory Nov 28 04:05:43 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.timer: Failed to open /run/systemd/transient/2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.timer: No such file or directory Nov 28 04:05:43 localhost systemd[1]: 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: Failed to open /run/systemd/transient/2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c.service: No such file or directory Nov 28 04:05:43 localhost podman[105807]: 2025-11-28 09:05:43.419193068 +0000 UTC m=+0.073293124 container cleanup 2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, 
build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 04:05:43 localhost podman[105807]: collectd Nov 28 04:05:43 localhost systemd[1]: tripleo_collectd.service: Failed with result 'exit-code'. Nov 28 04:05:43 localhost systemd[1]: Stopped collectd container. Nov 28 04:05:43 localhost systemd[1]: var-lib-containers-storage-overlay-483382bf68693c05a65aded8bd4df683c0e0d8870bcc7441be08a396c5476266-merged.mount: Deactivated successfully. Nov 28 04:05:43 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2dbe5b5de57669990516043a7fcb4a0ff163cb9b2d11c0870314ef6b58988e2c-userdata-shm.mount: Deactivated successfully. Nov 28 04:05:44 localhost python3.9[105914]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_iscsid.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:05:44 localhost systemd[1]: Reloading. Nov 28 04:05:44 localhost systemd-rc-local-generator[105944]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:05:44 localhost systemd-sysv-generator[105947]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:05:44 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:05:44 localhost systemd[1]: Stopping iscsid container... Nov 28 04:05:44 localhost systemd[1]: tmp-crun.SDRykm.mount: Deactivated successfully. Nov 28 04:05:44 localhost systemd[1]: libpod-08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.scope: Deactivated successfully. 
Nov 28 04:05:44 localhost systemd[1]: libpod-08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.scope: Consumed 1.103s CPU time. Nov 28 04:05:44 localhost podman[105955]: 2025-11-28 09:05:44.72798901 +0000 UTC m=+0.075358297 container died 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, tcib_managed=true, architecture=x86_64, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, url=https://www.redhat.com) Nov 28 04:05:44 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.timer: Deactivated successfully. Nov 28 04:05:44 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93. Nov 28 04:05:44 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Failed to open /run/systemd/transient/08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: No such file or directory Nov 28 04:05:44 localhost podman[105955]: 2025-11-28 09:05:44.821074247 +0000 UTC m=+0.168443524 container cleanup 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, release=1761123044, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible) Nov 28 04:05:44 localhost podman[105955]: iscsid Nov 28 04:05:44 localhost systemd[1]: 
08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.timer: Failed to open /run/systemd/transient/08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.timer: No such file or directory Nov 28 04:05:44 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Failed to open /run/systemd/transient/08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: No such file or directory Nov 28 04:05:44 localhost podman[105967]: 2025-11-28 09:05:44.831876982 +0000 UTC m=+0.103354166 container cleanup 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, config_id=tripleo_step3, url=https://www.redhat.com, distribution-scope=public, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Nov 28 04:05:44 localhost systemd[1]: libpod-conmon-08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.scope: Deactivated successfully. 
Nov 28 04:05:44 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.timer: Failed to open /run/systemd/transient/08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.timer: No such file or directory Nov 28 04:05:44 localhost systemd[1]: 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: Failed to open /run/systemd/transient/08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93.service: No such file or directory Nov 28 04:05:44 localhost podman[105984]: 2025-11-28 09:05:44.931870942 +0000 UTC m=+0.066742440 container cleanup 08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, container_name=iscsid, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible) Nov 28 04:05:44 localhost podman[105984]: iscsid Nov 28 04:05:44 localhost systemd[1]: tripleo_iscsid.service: Deactivated successfully. Nov 28 04:05:44 localhost systemd[1]: Stopped iscsid container. Nov 28 04:05:45 localhost python3.9[106088]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_logrotate_crond.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:05:45 localhost systemd[1]: var-lib-containers-storage-overlay-7b21fd5920b03309569c61b370f896baf7b2149eb0e6261b2fb5b94e6a082fed-merged.mount: Deactivated successfully. Nov 28 04:05:45 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-08d3d8f84cf3120489744ea84891ec1bd5a8ea86cc3e0b69cac4aca8eb820a93-userdata-shm.mount: Deactivated successfully. Nov 28 04:05:45 localhost systemd[1]: Reloading. 
Nov 28 04:05:45 localhost systemd-rc-local-generator[106144]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:05:45 localhost systemd-sysv-generator[106149]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:05:45 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:05:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 04:05:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 04:05:46 localhost systemd[1]: Stopping logrotate_crond container... Nov 28 04:05:46 localhost podman[106173]: 2025-11-28 09:05:46.168421774 +0000 UTC m=+0.087552365 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, release=1761123044, build-date=2025-11-18T23:34:05Z) Nov 28 04:05:46 localhost podman[106172]: 2025-11-28 09:05:46.223403199 +0000 UTC m=+0.140526549 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, version=17.1.12, config_id=tripleo_step4, url=https://www.redhat.com, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Nov 28 04:05:46 localhost podman[106173]: 2025-11-28 09:05:46.236714111 +0000 UTC m=+0.155844732 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, 
distribution-scope=public, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, batch=17.1_20251118.1) Nov 28 04:05:46 localhost podman[106173]: unhealthy Nov 28 04:05:46 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:05:46 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. Nov 28 04:05:46 localhost systemd[1]: libpod-bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.scope: Deactivated successfully. Nov 28 04:05:46 localhost systemd[1]: libpod-bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.scope: Consumed 1.023s CPU time. 
Nov 28 04:05:46 localhost podman[106175]: 2025-11-28 09:05:46.319780187 +0000 UTC m=+0.233280494 container died bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 04:05:46 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.timer: Deactivated successfully. Nov 28 04:05:46 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3. Nov 28 04:05:46 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Failed to open /run/systemd/transient/bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: No such file or directory Nov 28 04:05:46 localhost podman[106172]: 2025-11-28 09:05:46.344292828 +0000 UTC m=+0.261416108 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044) Nov 28 04:05:46 localhost podman[106172]: unhealthy Nov 28 04:05:46 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:05:46 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 04:05:46 localhost podman[106175]: 2025-11-28 09:05:46.41241244 +0000 UTC m=+0.325912747 container cleanup bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, 
summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, container_name=logrotate_crond, distribution-scope=public, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, version=17.1.12, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64) Nov 28 04:05:46 localhost podman[106175]: logrotate_crond Nov 28 04:05:46 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.timer: Failed to open /run/systemd/transient/bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.timer: No such file or directory Nov 28 04:05:46 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Failed to open /run/systemd/transient/bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: No such file or directory Nov 28 04:05:46 localhost podman[106243]: 2025-11-28 09:05:46.433285766 +0000 UTC m=+0.107830904 container cleanup bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vcs-type=git, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, 
konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron) Nov 28 04:05:46 localhost systemd[1]: libpod-conmon-bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.scope: Deactivated successfully. Nov 28 04:05:46 localhost podman[106269]: error opening file `/run/crun/bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3/status`: No such file or directory Nov 28 04:05:46 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.timer: Failed to open /run/systemd/transient/bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.timer: No such file or directory Nov 28 04:05:46 localhost systemd[1]: bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: Failed to open /run/systemd/transient/bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3.service: No such file or directory Nov 28 04:05:46 localhost podman[106258]: 2025-11-28 09:05:46.540478781 +0000 UTC m=+0.072638074 container cleanup bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, url=https://www.redhat.com, version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, architecture=x86_64) Nov 28 04:05:46 localhost podman[106258]: logrotate_crond Nov 28 04:05:46 localhost systemd[1]: tripleo_logrotate_crond.service: Deactivated successfully. Nov 28 04:05:46 localhost systemd[1]: Stopped logrotate_crond container. Nov 28 04:05:46 localhost systemd[1]: var-lib-containers-storage-overlay-926487bff9cf92c9a8f31e53ba63d52e32adb5e89e0ae8702a4e60968ca6f3f1-merged.mount: Deactivated successfully. 
Nov 28 04:05:46 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bb08a8853b4f4840b97cd0dbcf6dc1fc9e23090b97a807298bd2bd2fab919fe3-userdata-shm.mount: Deactivated successfully. Nov 28 04:05:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62197 DF PROTO=TCP SPT=54478 DPT=9882 SEQ=1118034570 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB04CC420000000001030307) Nov 28 04:05:47 localhost python3.9[106377]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_metrics_qdr.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:05:47 localhost systemd[1]: Reloading. Nov 28 04:05:47 localhost systemd-rc-local-generator[106402]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:05:47 localhost systemd-sysv-generator[106407]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:05:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:05:47 localhost systemd[1]: Stopping metrics_qdr container... Nov 28 04:05:47 localhost kernel: qdrouterd[54084]: segfault at 0 ip 00007f65b00807cb sp 00007ffcc2077190 error 4 in libc.so.6[7f65b001d000+175000] Nov 28 04:05:47 localhost kernel: Code: 0b 00 64 44 89 23 85 c0 75 d4 e9 2b ff ff ff e8 db a5 00 00 e9 fd fe ff ff e8 41 1d 0d 00 90 f3 0f 1e fa 41 54 55 48 89 fd 53 <8b> 07 f6 c4 20 0f 85 aa 00 00 00 89 c2 81 e2 00 80 00 00 0f 84 a9 Nov 28 04:05:47 localhost systemd[1]: Created slice Slice /system/systemd-coredump. 
Nov 28 04:05:47 localhost systemd[1]: Started Process Core Dump (PID 106430/UID 0). Nov 28 04:05:47 localhost systemd-coredump[106431]: Resource limits disable core dumping for process 54084 (qdrouterd). Nov 28 04:05:47 localhost systemd-coredump[106431]: Process 54084 (qdrouterd) of user 42465 dumped core. Nov 28 04:05:47 localhost systemd[1]: systemd-coredump@0-106430-0.service: Deactivated successfully. Nov 28 04:05:47 localhost podman[106418]: 2025-11-28 09:05:47.894162765 +0000 UTC m=+0.233530362 container died 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public) Nov 28 04:05:47 localhost systemd[1]: libpod-63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.scope: Deactivated successfully. Nov 28 04:05:47 localhost systemd[1]: libpod-63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.scope: Consumed 27.231s CPU time. Nov 28 04:05:47 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.timer: Deactivated successfully. Nov 28 04:05:47 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339. 
Nov 28 04:05:47 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Failed to open /run/systemd/transient/63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: No such file or directory Nov 28 04:05:47 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339-userdata-shm.mount: Deactivated successfully. Nov 28 04:05:47 localhost systemd[1]: var-lib-containers-storage-overlay-701465e48d77119796b927e784ad3d56a8f99cc392002110647f3a4cfb83b9e9-merged.mount: Deactivated successfully. Nov 28 04:05:47 localhost podman[106418]: 2025-11-28 09:05:47.953991459 +0000 UTC m=+0.293359056 container cleanup 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr) Nov 28 04:05:47 localhost podman[106418]: metrics_qdr Nov 28 04:05:47 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.timer: Failed to open /run/systemd/transient/63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.timer: No such file or directory Nov 28 04:05:47 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Failed to open /run/systemd/transient/63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: No such file or directory Nov 28 04:05:47 localhost podman[106435]: 2025-11-28 09:05:47.980845412 +0000 UTC m=+0.077873525 container cleanup 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 
(image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO 
Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, architecture=x86_64, release=1761123044, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container) Nov 28 04:05:47 localhost systemd[1]: tripleo_metrics_qdr.service: Main process exited, code=exited, status=139/n/a Nov 28 04:05:48 localhost systemd[1]: libpod-conmon-63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.scope: Deactivated successfully. Nov 28 04:05:48 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.timer: Failed to open /run/systemd/transient/63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.timer: No such file or directory Nov 28 04:05:48 localhost systemd[1]: 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: Failed to open /run/systemd/transient/63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339.service: No such file or directory Nov 28 04:05:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23218 DF PROTO=TCP SPT=52560 DPT=9105 SEQ=1946760551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB04CFDB0000000001030307) Nov 28 04:05:48 localhost podman[106452]: 2025-11-28 09:05:48.09655256 +0000 UTC m=+0.077783783 container cleanup 63a543cdbfb000d8af6f43192219059e38f542ace773420e03d33737ff784339 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'd871e9c8e59a273b3131348d6d370386'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat 
OpenStack Platform 17.1 qdrouterd, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20251118.1) Nov 28 04:05:48 localhost podman[106452]: metrics_qdr Nov 28 04:05:48 localhost systemd[1]: tripleo_metrics_qdr.service: Failed with result 'exit-code'. Nov 28 04:05:48 localhost systemd[1]: Stopped metrics_qdr container. Nov 28 04:05:48 localhost python3.9[106555]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_dhcp.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:05:49 localhost python3.9[106648]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_l3_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:05:50 localhost python3.9[106741]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_ovs_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:05:50 localhost python3.9[106834]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:05:51 localhost systemd[1]: Reloading. Nov 28 04:05:51 localhost systemd-rc-local-generator[106860]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:05:51 localhost systemd-sysv-generator[106866]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 28 04:05:51 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23220 DF PROTO=TCP SPT=52560 DPT=9105 SEQ=1946760551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB04DC020000000001030307) Nov 28 04:05:51 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:05:51 localhost systemd[1]: Stopping nova_compute container... Nov 28 04:05:51 localhost systemd[1]: tmp-crun.1e26mL.mount: Deactivated successfully. Nov 28 04:05:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23221 DF PROTO=TCP SPT=52560 DPT=9105 SEQ=1946760551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB04EBC20000000001030307) Nov 28 04:05:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59531 DF PROTO=TCP SPT=45700 DPT=9101 SEQ=242776502 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB04F6CA0000000001030307) Nov 28 04:06:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59533 DF PROTO=TCP SPT=45700 DPT=9101 SEQ=242776502 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0502C20000000001030307) Nov 28 04:06:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23222 DF PROTO=TCP SPT=52560 DPT=9105 SEQ=1946760551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080ACB050B820000000001030307) Nov 28 04:06:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26901 DF PROTO=TCP SPT=41424 DPT=9100 SEQ=1474215830 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0517420000000001030307) Nov 28 04:06:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21995 DF PROTO=TCP SPT=56088 DPT=9100 SEQ=1367658504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0523830000000001030307) Nov 28 04:06:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 04:06:10 localhost podman[106886]: Error: container c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 is not running Nov 28 04:06:10 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Main process exited, code=exited, status=125/n/a Nov 28 04:06:10 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed with result 'exit-code'. Nov 28 04:06:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. 
Nov 28 04:06:11 localhost podman[106897]: 2025-11-28 09:06:11.84900679 +0000 UTC m=+0.084008116 container health_status 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, batch=17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team) Nov 28 04:06:12 localhost podman[106897]: 2025-11-28 09:06:12.230478528 +0000 UTC m=+0.465479874 container exec_died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.expose-services=) Nov 28 04:06:12 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Deactivated successfully. 
Nov 28 04:06:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26903 DF PROTO=TCP SPT=41424 DPT=9100 SEQ=1474215830 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB052F020000000001030307) Nov 28 04:06:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 04:06:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 04:06:16 localhost podman[106921]: 2025-11-28 09:06:16.604672109 +0000 UTC m=+0.084964446 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team) Nov 28 04:06:16 localhost systemd[1]: tmp-crun.EDC0Is.mount: Deactivated successfully. Nov 28 04:06:16 localhost podman[106920]: 2025-11-28 09:06:16.65987356 +0000 UTC m=+0.142720276 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 28 04:06:16 localhost podman[106921]: 
2025-11-28 09:06:16.676740653 +0000 UTC m=+0.157032980 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, vcs-type=git, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller) Nov 28 04:06:16 localhost podman[106920]: 2025-11-28 09:06:16.679394356 +0000 UTC m=+0.162241092 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public) Nov 28 04:06:16 localhost podman[106920]: unhealthy Nov 28 04:06:16 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:06:16 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 04:06:16 localhost podman[106921]: unhealthy Nov 28 04:06:16 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:06:16 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. 
Nov 28 04:06:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18806 DF PROTO=TCP SPT=58632 DPT=9882 SEQ=3487650905 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0541830000000001030307) Nov 28 04:06:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59540 DF PROTO=TCP SPT=43388 DPT=9105 SEQ=2886964641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0549020000000001030307) Nov 28 04:06:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59541 DF PROTO=TCP SPT=43388 DPT=9105 SEQ=2886964641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0551020000000001030307) Nov 28 04:06:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59542 DF PROTO=TCP SPT=43388 DPT=9105 SEQ=2886964641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0560C20000000001030307) Nov 28 04:06:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49276 DF PROTO=TCP SPT=58512 DPT=9101 SEQ=2552739480 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB056BFA0000000001030307) Nov 28 04:06:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49278 DF PROTO=TCP SPT=58512 DPT=9101 SEQ=2552739480 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080ACB0578020000000001030307) Nov 28 04:06:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39596 DF PROTO=TCP SPT=37864 DPT=9102 SEQ=3041625844 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0581020000000001030307) Nov 28 04:06:33 localhost podman[106874]: time="2025-11-28T09:06:33Z" level=warning msg="StopSignal SIGTERM failed to stop container nova_compute in 42 seconds, resorting to SIGKILL" Nov 28 04:06:33 localhost systemd[1]: session-c11.scope: Deactivated successfully. Nov 28 04:06:33 localhost systemd[1]: libpod-c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.scope: Deactivated successfully. Nov 28 04:06:33 localhost systemd[1]: libpod-c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.scope: Consumed 34.339s CPU time. Nov 28 04:06:33 localhost podman[106874]: 2025-11-28 09:06:33.56624692 +0000 UTC m=+42.146618142 container died c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12) Nov 28 04:06:33 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.timer: Deactivated successfully. Nov 28 04:06:33 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6. Nov 28 04:06:33 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed to open /run/systemd/transient/c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: No such file or directory Nov 28 04:06:33 localhost systemd[1]: var-lib-containers-storage-overlay-d078e66fd31ac4112ed049a2e764bba9cb8e12df988d1bfceb2a6cc73592ff91-merged.mount: Deactivated successfully. Nov 28 04:06:33 localhost podman[106874]: 2025-11-28 09:06:33.634836148 +0000 UTC m=+42.215207340 container cleanup c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_compute, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': 
'/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12) Nov 28 04:06:33 localhost podman[106874]: nova_compute Nov 28 04:06:33 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.timer: Failed to open /run/systemd/transient/c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.timer: No such file or directory Nov 28 04:06:33 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed to open /run/systemd/transient/c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: No such file or directory Nov 28 04:06:33 localhost podman[106964]: 2025-11-28 09:06:33.67266095 +0000 UTC m=+0.135357068 container cleanup c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute) Nov 28 04:06:33 localhost systemd[1]: libpod-conmon-c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.scope: Deactivated successfully. Nov 28 04:06:33 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.timer: Failed to open /run/systemd/transient/c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.timer: No such file or directory Nov 28 04:06:33 localhost systemd[1]: c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: Failed to open /run/systemd/transient/c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6.service: No such file or directory Nov 28 04:06:33 localhost podman[106980]: 2025-11-28 09:06:33.785558291 +0000 UTC m=+0.070828848 container cleanup c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4) Nov 28 04:06:33 localhost podman[106980]: 
nova_compute Nov 28 04:06:33 localhost systemd[1]: tripleo_nova_compute.service: Deactivated successfully. Nov 28 04:06:33 localhost systemd[1]: Stopped nova_compute container. Nov 28 04:06:33 localhost systemd[1]: tripleo_nova_compute.service: Consumed 1.140s CPU time, no IO. Nov 28 04:06:35 localhost python3.9[107084]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:06:35 localhost systemd[1]: Reloading. Nov 28 04:06:35 localhost systemd-sysv-generator[107115]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:06:35 localhost systemd-rc-local-generator[107110]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:06:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:06:35 localhost systemd[1]: Stopping nova_migration_target container... Nov 28 04:06:35 localhost systemd[1]: tmp-crun.Ve2Ng0.mount: Deactivated successfully. Nov 28 04:06:35 localhost systemd[1]: libpod-9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.scope: Deactivated successfully. Nov 28 04:06:35 localhost systemd[1]: libpod-9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.scope: Consumed 32.848s CPU time. 
Nov 28 04:06:35 localhost podman[107124]: 2025-11-28 09:06:35.695468552 +0000 UTC m=+0.084894233 container died 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z) Nov 28 04:06:35 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.timer: Deactivated successfully. Nov 28 04:06:35 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019. Nov 28 04:06:35 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Failed to open /run/systemd/transient/9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: No such file or directory Nov 28 04:06:35 localhost podman[107124]: 2025-11-28 09:06:35.752257183 +0000 UTC m=+0.141682864 container cleanup 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible) Nov 28 04:06:35 localhost podman[107124]: nova_migration_target Nov 28 04:06:35 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.timer: Failed to open 
/run/systemd/transient/9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.timer: No such file or directory Nov 28 04:06:35 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Failed to open /run/systemd/transient/9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: No such file or directory Nov 28 04:06:35 localhost podman[107138]: 2025-11-28 09:06:35.784638257 +0000 UTC m=+0.082860100 container cleanup 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step4, 
container_name=nova_migration_target, io.buildah.version=1.41.4, version=17.1.12, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.) Nov 28 04:06:35 localhost systemd[1]: libpod-conmon-9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.scope: Deactivated successfully. 
Nov 28 04:06:35 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.timer: Failed to open /run/systemd/transient/9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.timer: No such file or directory Nov 28 04:06:35 localhost systemd[1]: 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: Failed to open /run/systemd/transient/9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019.service: No such file or directory Nov 28 04:06:35 localhost podman[107150]: 2025-11-28 09:06:35.876444514 +0000 UTC m=+0.057217365 container cleanup 9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, container_name=nova_migration_target, config_id=tripleo_step4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4) Nov 28 04:06:35 localhost podman[107150]: nova_migration_target Nov 28 04:06:35 localhost systemd[1]: tripleo_nova_migration_target.service: Deactivated successfully. Nov 28 04:06:35 localhost systemd[1]: Stopped nova_migration_target container. 
Nov 28 04:06:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36989 DF PROTO=TCP SPT=50824 DPT=9100 SEQ=1753447102 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB058C820000000001030307) Nov 28 04:06:36 localhost python3.9[107252]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:06:36 localhost systemd[1]: var-lib-containers-storage-overlay-4f1ef42e70cf0e1a0a8a92baa1944c6e077a6987018321471d4c334fae1280ec-merged.mount: Deactivated successfully. Nov 28 04:06:36 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9d8be640ea5d6cfea51bb900f6de22cb23ff968b0ad4b3144ee849e7af6ef019-userdata-shm.mount: Deactivated successfully. Nov 28 04:06:36 localhost systemd[1]: Reloading. Nov 28 04:06:36 localhost systemd-sysv-generator[107281]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:06:36 localhost systemd-rc-local-generator[107276]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:06:36 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:06:37 localhost systemd[1]: Stopping nova_virtlogd_wrapper container... Nov 28 04:06:37 localhost systemd[1]: libpod-8e7c684118204ac89680ace1a0889960b1955fa40b4d50e655f226c9c1f115be.scope: Deactivated successfully. 
Nov 28 04:06:37 localhost podman[107292]: 2025-11-28 09:06:37.156973459 +0000 UTC m=+0.071115906 container died 8e7c684118204ac89680ace1a0889960b1955fa40b4d50e655f226c9c1f115be (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, io.openshift.expose-services=, config_id=tripleo_step3, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_virtlogd_wrapper, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 04:06:37 localhost podman[107292]: 2025-11-28 09:06:37.19278036 +0000 UTC m=+0.106922777 container cleanup 8e7c684118204ac89680ace1a0889960b1955fa40b4d50e655f226c9c1f115be (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, url=https://www.redhat.com, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, 
architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, distribution-scope=public, batch=17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.openshift.expose-services=, container_name=nova_virtlogd_wrapper, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4) Nov 28 04:06:37 localhost podman[107292]: nova_virtlogd_wrapper Nov 28 04:06:37 localhost podman[107306]: 2025-11-28 09:06:37.220591652 +0000 UTC m=+0.059976151 container cleanup 8e7c684118204ac89680ace1a0889960b1955fa40b4d50e655f226c9c1f115be (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:35:22Z, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_virtlogd_wrapper, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step3, io.k8s.description=Red Hat 
OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, batch=17.1_20251118.1) Nov 28 04:06:37 localhost systemd[1]: var-lib-containers-storage-overlay-870580852a6f869c68321019e7d77d0da890e64d97007d93989d967a33c02183-merged.mount: Deactivated successfully. Nov 28 04:06:37 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8e7c684118204ac89680ace1a0889960b1955fa40b4d50e655f226c9c1f115be-userdata-shm.mount: Deactivated successfully. Nov 28 04:06:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54622 DF PROTO=TCP SPT=49072 DPT=9100 SEQ=2495149367 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0597830000000001030307) Nov 28 04:06:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36991 DF PROTO=TCP SPT=50824 DPT=9100 SEQ=1753447102 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB05A4420000000001030307) Nov 28 04:06:43 localhost systemd[1]: Stopping User Manager for UID 0... Nov 28 04:06:43 localhost systemd[83313]: Activating special unit Exit the Session... Nov 28 04:06:43 localhost systemd[83313]: Removed slice User Background Tasks Slice. Nov 28 04:06:43 localhost systemd[83313]: Stopped target Main User Target. Nov 28 04:06:43 localhost systemd[83313]: Stopped target Basic System. Nov 28 04:06:43 localhost systemd[83313]: Stopped target Paths. Nov 28 04:06:43 localhost systemd[83313]: Stopped target Sockets. Nov 28 04:06:43 localhost systemd[83313]: Stopped target Timers. Nov 28 04:06:43 localhost systemd[83313]: Stopped Daily Cleanup of User's Temporary Directories. Nov 28 04:06:43 localhost systemd[83313]: Closed D-Bus User Message Bus Socket. 
Nov 28 04:06:43 localhost systemd[83313]: Stopped Create User's Volatile Files and Directories. Nov 28 04:06:43 localhost systemd[83313]: Removed slice User Application Slice. Nov 28 04:06:43 localhost systemd[83313]: Reached target Shutdown. Nov 28 04:06:43 localhost systemd[83313]: Finished Exit the Session. Nov 28 04:06:43 localhost systemd[83313]: Reached target Exit the Session. Nov 28 04:06:43 localhost systemd[1]: user@0.service: Deactivated successfully. Nov 28 04:06:43 localhost systemd[1]: Stopped User Manager for UID 0. Nov 28 04:06:43 localhost systemd[1]: user@0.service: Consumed 4.523s CPU time, no IO. Nov 28 04:06:43 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Nov 28 04:06:43 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Nov 28 04:06:43 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Nov 28 04:06:43 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Nov 28 04:06:43 localhost systemd[1]: Removed slice User Slice of UID 0. Nov 28 04:06:43 localhost systemd[1]: user-0.slice: Consumed 5.476s CPU time. Nov 28 04:06:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 04:06:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 04:06:46 localhost podman[107323]: 2025-11-28 09:06:46.849755065 +0000 UTC m=+0.083713397 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, container_name=ovn_controller, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4) Nov 28 04:06:46 localhost podman[107323]: 2025-11-28 09:06:46.890359295 +0000 UTC m=+0.124317617 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, batch=17.1_20251118.1, io.buildah.version=1.41.4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, version=17.1.12) Nov 28 04:06:46 localhost podman[107323]: unhealthy Nov 28 04:06:46 localhost podman[107322]: 2025-11-28 09:06:46.903088179 +0000 UTC m=+0.138626989 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 28 04:06:46 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:06:46 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. 
Nov 28 04:06:46 localhost podman[107322]: 2025-11-28 09:06:46.919434806 +0000 UTC m=+0.154973616 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, batch=17.1_20251118.1, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, managed_by=tripleo_ansible) Nov 28 04:06:46 localhost podman[107322]: unhealthy Nov 28 04:06:46 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:06:46 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. 
Nov 28 04:06:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11542 DF PROTO=TCP SPT=55520 DPT=9882 SEQ=1934118378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB05B6C30000000001030307) Nov 28 04:06:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42345 DF PROTO=TCP SPT=47674 DPT=9105 SEQ=3097923637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB05BE420000000001030307) Nov 28 04:06:51 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42346 DF PROTO=TCP SPT=47674 DPT=9105 SEQ=3097923637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB05C6430000000001030307) Nov 28 04:06:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42347 DF PROTO=TCP SPT=47674 DPT=9105 SEQ=3097923637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB05D6020000000001030307) Nov 28 04:06:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10202 DF PROTO=TCP SPT=49960 DPT=9101 SEQ=3082867192 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB05E12B0000000001030307) Nov 28 04:07:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10204 DF PROTO=TCP SPT=49960 DPT=9101 SEQ=3082867192 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080ACB05ED420000000001030307) Nov 28 04:07:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42348 DF PROTO=TCP SPT=47674 DPT=9105 SEQ=3097923637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB05F5820000000001030307) Nov 28 04:07:05 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Nov 28 04:07:05 localhost recover_tripleo_nova_virtqemud[107441]: 61397 Nov 28 04:07:05 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Nov 28 04:07:05 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Nov 28 04:07:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47026 DF PROTO=TCP SPT=46070 DPT=9102 SEQ=2352280691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0601830000000001030307) Nov 28 04:07:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26906 DF PROTO=TCP SPT=41424 DPT=9100 SEQ=1474215830 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB060D830000000001030307) Nov 28 04:07:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39019 DF PROTO=TCP SPT=43748 DPT=9100 SEQ=68847106 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0619820000000001030307) Nov 28 04:07:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. 
Nov 28 04:07:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 04:07:17 localhost systemd[1]: tmp-crun.xdByW0.mount: Deactivated successfully. Nov 28 04:07:17 localhost podman[107442]: 2025-11-28 09:07:17.097408443 +0000 UTC m=+0.085246184 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1761123044, vcs-type=git, tcib_managed=true) Nov 28 04:07:17 localhost podman[107442]: 2025-11-28 09:07:17.109898221 +0000 UTC m=+0.097735992 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, container_name=ovn_metadata_agent, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Nov 28 04:07:17 localhost podman[107442]: unhealthy Nov 28 04:07:17 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:07:17 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 04:07:17 localhost systemd[1]: tmp-crun.KowIQv.mount: Deactivated successfully. Nov 28 04:07:17 localhost podman[107443]: 2025-11-28 09:07:17.149790297 +0000 UTC m=+0.133752308 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, 
config_id=tripleo_step4, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, architecture=x86_64, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 28 04:07:17 localhost podman[107443]: 2025-11-28 09:07:17.163216534 +0000 UTC m=+0.147178545 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, container_name=ovn_controller, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Nov 28 04:07:17 localhost podman[107443]: unhealthy Nov 28 04:07:17 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:07:17 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. 
Nov 28 04:07:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52135 DF PROTO=TCP SPT=33378 DPT=9882 SEQ=2005616798 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB062C020000000001030307) Nov 28 04:07:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9007 DF PROTO=TCP SPT=45510 DPT=9105 SEQ=2131438876 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0633830000000001030307) Nov 28 04:07:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9008 DF PROTO=TCP SPT=45510 DPT=9105 SEQ=2131438876 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB063B820000000001030307) Nov 28 04:07:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9009 DF PROTO=TCP SPT=45510 DPT=9105 SEQ=2131438876 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB064B430000000001030307) Nov 28 04:07:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50369 DF PROTO=TCP SPT=56354 DPT=9101 SEQ=3433932624 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0656590000000001030307) Nov 28 04:07:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50371 DF PROTO=TCP SPT=56354 DPT=9101 SEQ=3433932624 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080ACB0662420000000001030307) Nov 28 04:07:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6743 DF PROTO=TCP SPT=33562 DPT=9102 SEQ=583773517 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB066B420000000001030307) Nov 28 04:07:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6552 DF PROTO=TCP SPT=36284 DPT=9100 SEQ=4102555724 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0676C30000000001030307) Nov 28 04:07:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6745 DF PROTO=TCP SPT=33562 DPT=9102 SEQ=583773517 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0683020000000001030307) Nov 28 04:07:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6554 DF PROTO=TCP SPT=36284 DPT=9100 SEQ=4102555724 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB068E820000000001030307) Nov 28 04:07:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=992 DF PROTO=TCP SPT=60428 DPT=9882 SEQ=2180038041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB06A1020000000001030307) Nov 28 04:07:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. 
Nov 28 04:07:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 04:07:47 localhost podman[107480]: 2025-11-28 09:07:47.327688304 +0000 UTC m=+0.067709341 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, vendor=Red Hat, Inc., batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Nov 28 04:07:47 localhost podman[107480]: 2025-11-28 09:07:47.33854888 +0000 UTC m=+0.078569847 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 28 04:07:47 localhost podman[107480]: unhealthy Nov 28 04:07:47 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:07:47 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 04:07:47 localhost podman[107481]: 2025-11-28 09:07:47.379038516 +0000 UTC m=+0.110582851 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Nov 28 04:07:47 localhost podman[107481]: 2025-11-28 09:07:47.394355591 +0000 UTC m=+0.125899986 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, release=1761123044, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, version=17.1.12, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Nov 28 04:07:47 localhost podman[107481]: unhealthy Nov 28 04:07:47 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:07:47 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. 
Nov 28 04:07:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5958 DF PROTO=TCP SPT=50594 DPT=9105 SEQ=961399672 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB06A8C20000000001030307) Nov 28 04:07:51 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5959 DF PROTO=TCP SPT=50594 DPT=9105 SEQ=961399672 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB06B0C20000000001030307) Nov 28 04:07:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5960 DF PROTO=TCP SPT=50594 DPT=9105 SEQ=961399672 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB06C0830000000001030307) Nov 28 04:07:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2201 DF PROTO=TCP SPT=40632 DPT=9101 SEQ=2584186812 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB06CB8A0000000001030307) Nov 28 04:08:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2203 DF PROTO=TCP SPT=40632 DPT=9101 SEQ=2584186812 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB06D7830000000001030307) Nov 28 04:08:01 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: State 'stop-sigterm' timed out. Killing. Nov 28 04:08:01 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: Killing process 60620 (conmon) with signal SIGKILL. 
Nov 28 04:08:01 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: Main process exited, code=killed, status=9/KILL Nov 28 04:08:01 localhost systemd[1]: libpod-conmon-8e7c684118204ac89680ace1a0889960b1955fa40b4d50e655f226c9c1f115be.scope: Deactivated successfully. Nov 28 04:08:01 localhost podman[107606]: error opening file `/run/crun/8e7c684118204ac89680ace1a0889960b1955fa40b4d50e655f226c9c1f115be/status`: No such file or directory Nov 28 04:08:01 localhost podman[107595]: 2025-11-28 09:08:01.325474377 +0000 UTC m=+0.065406069 container cleanup 8e7c684118204ac89680ace1a0889960b1955fa40b4d50e655f226c9c1f115be (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=nova_virtlogd_wrapper, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, architecture=x86_64, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 04:08:01 localhost podman[107595]: nova_virtlogd_wrapper Nov 28 04:08:01 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: Failed with result 'timeout'. 
Nov 28 04:08:01 localhost systemd[1]: Stopped nova_virtlogd_wrapper container. Nov 28 04:08:02 localhost python3.9[107699]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:08:02 localhost systemd[1]: Reloading. Nov 28 04:08:02 localhost systemd-sysv-generator[107727]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:08:02 localhost systemd-rc-local-generator[107723]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:08:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:08:02 localhost systemd[1]: Stopping nova_virtnodedevd container... Nov 28 04:08:02 localhost systemd[1]: libpod-6c759ff14c93ee6f1945ac1deeb1ee24de00cb02b75f2d36309b911d2c2ef265.scope: Deactivated successfully. Nov 28 04:08:02 localhost systemd[1]: libpod-6c759ff14c93ee6f1945ac1deeb1ee24de00cb02b75f2d36309b911d2c2ef265.scope: Consumed 1.354s CPU time. 
Nov 28 04:08:02 localhost podman[107740]: 2025-11-28 09:08:02.54270545 +0000 UTC m=+0.076584976 container died 6c759ff14c93ee6f1945ac1deeb1ee24de00cb02b75f2d36309b911d2c2ef265 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-libvirt-container, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-type=git, batch=17.1_20251118.1, container_name=nova_virtnodedevd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 04:08:02 localhost systemd[1]: tmp-crun.TlXCti.mount: Deactivated successfully. 
Nov 28 04:08:02 localhost podman[107740]: 2025-11-28 09:08:02.59108006 +0000 UTC m=+0.124959556 container cleanup 6c759ff14c93ee6f1945ac1deeb1ee24de00cb02b75f2d36309b911d2c2ef265 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, config_id=tripleo_step3, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_virtnodedevd, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, architecture=x86_64, name=rhosp17/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:35:22Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true) Nov 28 04:08:02 localhost podman[107740]: nova_virtnodedevd Nov 28 04:08:02 localhost podman[107754]: 2025-11-28 09:08:02.624345001 +0000 UTC m=+0.070472136 container cleanup 6c759ff14c93ee6f1945ac1deeb1ee24de00cb02b75f2d36309b911d2c2ef265 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, architecture=x86_64, release=1761123044, build-date=2025-11-19T00:35:22Z, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, config_id=tripleo_step3, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-nova-libvirt, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., container_name=nova_virtnodedevd, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt) Nov 28 04:08:02 localhost systemd[1]: libpod-conmon-6c759ff14c93ee6f1945ac1deeb1ee24de00cb02b75f2d36309b911d2c2ef265.scope: Deactivated successfully. Nov 28 04:08:02 localhost podman[107782]: error opening file `/run/crun/6c759ff14c93ee6f1945ac1deeb1ee24de00cb02b75f2d36309b911d2c2ef265/status`: No such file or directory Nov 28 04:08:02 localhost podman[107770]: 2025-11-28 09:08:02.712766452 +0000 UTC m=+0.058351039 container cleanup 6c759ff14c93ee6f1945ac1deeb1ee24de00cb02b75f2d36309b911d2c2ef265 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, release=1761123044, name=rhosp17/openstack-nova-libvirt, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., container_name=nova_virtnodedevd, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 
17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
managed_by=tripleo_ansible, build-date=2025-11-19T00:35:22Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt) Nov 28 04:08:02 localhost podman[107770]: nova_virtnodedevd Nov 28 04:08:02 localhost systemd[1]: tripleo_nova_virtnodedevd.service: Deactivated successfully. Nov 28 04:08:02 localhost systemd[1]: Stopped nova_virtnodedevd container. Nov 28 04:08:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2651 DF PROTO=TCP SPT=58140 DPT=9102 SEQ=2884902030 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB06E0820000000001030307) Nov 28 04:08:03 localhost python3.9[107875]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:08:03 localhost systemd[1]: Reloading. Nov 28 04:08:03 localhost systemd-rc-local-generator[107900]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:08:03 localhost systemd-sysv-generator[107903]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:08:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 28 04:08:03 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6c759ff14c93ee6f1945ac1deeb1ee24de00cb02b75f2d36309b911d2c2ef265-userdata-shm.mount: Deactivated successfully. Nov 28 04:08:03 localhost systemd[1]: var-lib-containers-storage-overlay-025413710f75ace93305dda8659754012c7e6b0ed71f82e0bba1b040134d7ebe-merged.mount: Deactivated successfully. Nov 28 04:08:03 localhost systemd[1]: Stopping nova_virtproxyd container... Nov 28 04:08:03 localhost systemd[1]: libpod-76872bed08c91831e19aa7b39a2f6d28609fb77cf6edfbfc03abad69a6c5be50.scope: Deactivated successfully. Nov 28 04:08:03 localhost podman[107916]: 2025-11-28 09:08:03.908106867 +0000 UTC m=+0.078760153 container died 76872bed08c91831e19aa7b39a2f6d28609fb77cf6edfbfc03abad69a6c5be50 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, release=1761123044, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:35:22Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., container_name=nova_virtproxyd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-libvirt, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt) Nov 28 04:08:03 localhost podman[107916]: 2025-11-28 09:08:03.996180488 +0000 UTC m=+0.166833774 
container cleanup 76872bed08c91831e19aa7b39a2f6d28609fb77cf6edfbfc03abad69a6c5be50 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step3, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true, build-date=2025-11-19T00:35:22Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, release=1761123044, container_name=nova_virtproxyd, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-type=git) Nov 28 04:08:03 localhost podman[107916]: nova_virtproxyd Nov 28 04:08:04 localhost podman[107931]: 2025-11-28 09:08:04.005745945 +0000 UTC m=+0.090100485 container cleanup 76872bed08c91831e19aa7b39a2f6d28609fb77cf6edfbfc03abad69a6c5be50 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, tcib_managed=true, config_id=tripleo_step3, container_name=nova_virtproxyd, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git) Nov 28 04:08:04 localhost systemd[1]: libpod-conmon-76872bed08c91831e19aa7b39a2f6d28609fb77cf6edfbfc03abad69a6c5be50.scope: Deactivated successfully. Nov 28 04:08:04 localhost podman[107960]: error opening file `/run/crun/76872bed08c91831e19aa7b39a2f6d28609fb77cf6edfbfc03abad69a6c5be50/status`: No such file or directory Nov 28 04:08:04 localhost podman[107948]: 2025-11-28 09:08:04.084110304 +0000 UTC m=+0.044941444 container cleanup 76872bed08c91831e19aa7b39a2f6d28609fb77cf6edfbfc03abad69a6c5be50 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, version=17.1.12, vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtproxyd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt) Nov 28 04:08:04 localhost podman[107948]: nova_virtproxyd Nov 28 04:08:04 localhost systemd[1]: tripleo_nova_virtproxyd.service: Deactivated successfully. Nov 28 04:08:04 localhost systemd[1]: Stopped nova_virtproxyd container. Nov 28 04:08:04 localhost python3.9[108054]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:08:04 localhost systemd[1]: var-lib-containers-storage-overlay-2ed6a910a473599a3c0eb40fb417f8a3b4e8b49460b9c3a2f878bde6830d5976-merged.mount: Deactivated successfully. Nov 28 04:08:04 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-76872bed08c91831e19aa7b39a2f6d28609fb77cf6edfbfc03abad69a6c5be50-userdata-shm.mount: Deactivated successfully. Nov 28 04:08:05 localhost systemd[1]: Reloading. Nov 28 04:08:05 localhost systemd-rc-local-generator[108078]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:08:05 localhost systemd-sysv-generator[108083]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:08:05 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:08:06 localhost systemd[1]: tripleo_nova_virtqemud_recover.timer: Deactivated successfully. 
Nov 28 04:08:06 localhost systemd[1]: Stopped Check and recover tripleo_nova_virtqemud every 10m. Nov 28 04:08:06 localhost systemd[1]: Stopping nova_virtqemud container... Nov 28 04:08:06 localhost systemd[1]: tmp-crun.LESlEN.mount: Deactivated successfully. Nov 28 04:08:06 localhost systemd[1]: libpod-60f2b639dc4dc451fa73648b6521061f232e851646b473a91ae4d921fb334057.scope: Deactivated successfully. Nov 28 04:08:06 localhost systemd[1]: libpod-60f2b639dc4dc451fa73648b6521061f232e851646b473a91ae4d921fb334057.scope: Consumed 2.603s CPU time. Nov 28 04:08:06 localhost podman[108095]: 2025-11-28 09:08:06.173984315 +0000 UTC m=+0.071921421 container died 60f2b639dc4dc451fa73648b6521061f232e851646b473a91ae4d921fb334057 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, container_name=nova_virtqemud, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_id=tripleo_step3, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, version=17.1.12, build-date=2025-11-19T00:35:22Z, io.buildah.version=1.41.4) Nov 28 04:08:06 localhost podman[108095]: 2025-11-28 
09:08:06.204677677 +0000 UTC m=+0.102614753 container cleanup 60f2b639dc4dc451fa73648b6521061f232e851646b473a91ae4d921fb334057 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, version=17.1.12, release=1761123044, config_id=tripleo_step3, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_virtqemud, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1) Nov 28 04:08:06 localhost podman[108095]: nova_virtqemud Nov 28 04:08:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30837 DF PROTO=TCP SPT=47286 DPT=9102 SEQ=2784489092 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB06EB830000000001030307) Nov 28 04:08:06 localhost podman[108110]: 2025-11-28 09:08:06.249849758 +0000 UTC m=+0.066309808 container cleanup 60f2b639dc4dc451fa73648b6521061f232e851646b473a91ae4d921fb334057 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vcs-type=git, config_id=tripleo_step3, container_name=nova_virtqemud, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', 
'/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt) Nov 28 04:08:07 localhost systemd[1]: var-lib-containers-storage-overlay-850a5494078c53945d7c69a5c914ed859c31bc07523103c44dd52329f5c128d7-merged.mount: Deactivated successfully. Nov 28 04:08:07 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-60f2b639dc4dc451fa73648b6521061f232e851646b473a91ae4d921fb334057-userdata-shm.mount: Deactivated successfully. 
Nov 28 04:08:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39022 DF PROTO=TCP SPT=43748 DPT=9100 SEQ=68847106 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB06F7820000000001030307) Nov 28 04:08:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17415 DF PROTO=TCP SPT=33454 DPT=9100 SEQ=3195106899 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0703C30000000001030307) Nov 28 04:08:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51068 DF PROTO=TCP SPT=58596 DPT=9882 SEQ=1402631918 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0716420000000001030307) Nov 28 04:08:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 04:08:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 04:08:17 localhost systemd[1]: tmp-crun.6PTELK.mount: Deactivated successfully. 
Nov 28 04:08:17 localhost podman[108127]: 2025-11-28 09:08:17.835820367 +0000 UTC m=+0.075604805 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., batch=17.1_20251118.1, tcib_managed=true, version=17.1.12, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Nov 28 04:08:17 localhost podman[108128]: 2025-11-28 09:08:17.888942924 +0000 UTC m=+0.125872544 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, tcib_managed=true, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 04:08:17 localhost podman[108127]: 2025-11-28 09:08:17.907414296 +0000 UTC m=+0.147198734 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, release=1761123044, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.expose-services=, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 04:08:17 localhost podman[108127]: unhealthy Nov 28 04:08:17 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:08:17 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 04:08:17 localhost podman[108128]: 2025-11-28 09:08:17.932438372 +0000 UTC m=+0.169367962 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, release=1761123044, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com) Nov 28 04:08:17 localhost podman[108128]: unhealthy Nov 28 04:08:17 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:08:17 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. 
Nov 28 04:08:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43567 DF PROTO=TCP SPT=45272 DPT=9105 SEQ=315254468 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB071DC20000000001030307) Nov 28 04:08:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43568 DF PROTO=TCP SPT=45272 DPT=9105 SEQ=315254468 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0725C20000000001030307) Nov 28 04:08:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43569 DF PROTO=TCP SPT=45272 DPT=9105 SEQ=315254468 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0735820000000001030307) Nov 28 04:08:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56153 DF PROTO=TCP SPT=49374 DPT=9101 SEQ=444321622 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0740BA0000000001030307) Nov 28 04:08:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56155 DF PROTO=TCP SPT=49374 DPT=9101 SEQ=444321622 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB074CC20000000001030307) Nov 28 04:08:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43570 DF PROTO=TCP SPT=45272 DPT=9105 SEQ=315254468 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080ACB0755820000000001030307) Nov 28 04:08:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 28 04:08:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4800.1 total, 600.0 interval#012Cumulative writes: 4899 writes, 22K keys, 4899 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4899 writes, 608 syncs, 8.06 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 28 04:08:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40582 DF PROTO=TCP SPT=57634 DPT=9100 SEQ=456275993 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0761420000000001030307) Nov 28 04:08:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7902 DF PROTO=TCP SPT=40818 DPT=9102 SEQ=535589742 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB076D830000000001030307) Nov 28 04:08:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 28 04:08:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4800.3 total, 600.0 interval#012Cumulative writes: 5616 writes, 25K keys, 5616 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5616 writes, 758 syncs, 7.41 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 
percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 28 04:08:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40584 DF PROTO=TCP SPT=57634 DPT=9100 SEQ=456275993 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0779020000000001030307) Nov 28 04:08:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19333 DF PROTO=TCP SPT=41712 DPT=9882 SEQ=3726534465 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB078B820000000001030307) Nov 28 04:08:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 04:08:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 04:08:48 localhost podman[108168]: 2025-11-28 09:08:48.096202951 +0000 UTC m=+0.078779624 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, 
Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, architecture=x86_64, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 28 04:08:48 localhost podman[108168]: 2025-11-28 09:08:48.10909167 +0000 UTC m=+0.091668353 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.41.4, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 
'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 28 04:08:48 localhost podman[108168]: unhealthy Nov 28 04:08:48 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:08:48 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 04:08:48 localhost systemd[1]: tmp-crun.1uzbzQ.mount: Deactivated successfully. Nov 28 04:08:48 localhost podman[108169]: 2025-11-28 09:08:48.157234473 +0000 UTC m=+0.136603546 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 04:08:48 localhost podman[108169]: 2025-11-28 09:08:48.175331404 +0000 UTC m=+0.154700467 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, 
name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, container_name=ovn_controller, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 28 04:08:48 localhost podman[108169]: unhealthy Nov 28 04:08:48 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:08:48 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. 
Nov 28 04:08:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35635 DF PROTO=TCP SPT=45508 DPT=9105 SEQ=1666481019 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0793020000000001030307) Nov 28 04:08:51 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35636 DF PROTO=TCP SPT=45508 DPT=9105 SEQ=1666481019 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB079B020000000001030307) Nov 28 04:08:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35637 DF PROTO=TCP SPT=45508 DPT=9105 SEQ=1666481019 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB07AAC20000000001030307) Nov 28 04:08:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34804 DF PROTO=TCP SPT=54386 DPT=9101 SEQ=368064243 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB07B5E90000000001030307) Nov 28 04:09:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34806 DF PROTO=TCP SPT=54386 DPT=9101 SEQ=368064243 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB07C2020000000001030307) Nov 28 04:09:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28986 DF PROTO=TCP SPT=38092 DPT=9102 SEQ=2847201981 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080ACB07CAC20000000001030307) Nov 28 04:09:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14661 DF PROTO=TCP SPT=34100 DPT=9100 SEQ=1491282970 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB07D6820000000001030307) Nov 28 04:09:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17418 DF PROTO=TCP SPT=33454 DPT=9100 SEQ=3195106899 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB07E1820000000001030307) Nov 28 04:09:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14663 DF PROTO=TCP SPT=34100 DPT=9100 SEQ=1491282970 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB07EE420000000001030307) Nov 28 04:09:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65432 DF PROTO=TCP SPT=57836 DPT=9882 SEQ=2076022789 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0800C20000000001030307) Nov 28 04:09:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 04:09:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. 
Nov 28 04:09:18 localhost podman[108286]: 2025-11-28 09:09:18.348168525 +0000 UTC m=+0.084724119 container health_status 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Nov 28 04:09:18 localhost systemd[1]: tmp-crun.BazvoZ.mount: Deactivated successfully. Nov 28 04:09:18 localhost podman[108285]: 2025-11-28 09:09:18.402983824 +0000 UTC m=+0.139221737 container health_status 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, config_id=tripleo_step4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Nov 28 04:09:18 localhost podman[108286]: 2025-11-28 09:09:18.41573732 +0000 UTC m=+0.152292914 container exec_died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, distribution-scope=public, container_name=ovn_controller, config_id=tripleo_step4, version=17.1.12, managed_by=tripleo_ansible) Nov 28 04:09:18 localhost podman[108285]: 2025-11-28 09:09:18.422580232 +0000 UTC m=+0.158818145 container exec_died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, summary=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, version=17.1.12, vcs-type=git, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true) Nov 28 04:09:18 localhost podman[108285]: unhealthy Nov 28 04:09:18 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:09:18 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed with result 'exit-code'. Nov 28 04:09:18 localhost podman[108286]: unhealthy Nov 28 04:09:18 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:09:18 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed with result 'exit-code'. 
Nov 28 04:09:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37750 DF PROTO=TCP SPT=60094 DPT=9105 SEQ=3483768617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0808430000000001030307) Nov 28 04:09:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37751 DF PROTO=TCP SPT=60094 DPT=9105 SEQ=3483768617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0810430000000001030307) Nov 28 04:09:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37752 DF PROTO=TCP SPT=60094 DPT=9105 SEQ=3483768617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0820020000000001030307) Nov 28 04:09:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54525 DF PROTO=TCP SPT=60424 DPT=9101 SEQ=2486827943 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB082B1A0000000001030307) Nov 28 04:09:28 localhost sshd[108328]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:09:30 localhost systemd[1]: tripleo_nova_virtqemud.service: State 'stop-sigterm' timed out. Killing. Nov 28 04:09:30 localhost systemd[1]: tripleo_nova_virtqemud.service: Killing process 61393 (conmon) with signal SIGKILL. Nov 28 04:09:30 localhost systemd[1]: tripleo_nova_virtqemud.service: Main process exited, code=killed, status=9/KILL Nov 28 04:09:30 localhost systemd[1]: libpod-conmon-60f2b639dc4dc451fa73648b6521061f232e851646b473a91ae4d921fb334057.scope: Deactivated successfully. 
Nov 28 04:09:30 localhost podman[108340]: error opening file `/run/crun/60f2b639dc4dc451fa73648b6521061f232e851646b473a91ae4d921fb334057/status`: No such file or directory Nov 28 04:09:30 localhost podman[108329]: 2025-11-28 09:09:30.330888475 +0000 UTC m=+0.063918543 container cleanup 60f2b639dc4dc451fa73648b6521061f232e851646b473a91ae4d921fb334057 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_virtqemud, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_id=tripleo_step3) Nov 28 04:09:30 localhost podman[108329]: nova_virtqemud Nov 28 04:09:30 localhost systemd[1]: tripleo_nova_virtqemud.service: Failed with result 'timeout'. Nov 28 04:09:30 localhost systemd[1]: Stopped nova_virtqemud container. 
Nov 28 04:09:31 localhost python3.9[108433]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud_recover.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:09:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54527 DF PROTO=TCP SPT=60424 DPT=9101 SEQ=2486827943 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0837420000000001030307) Nov 28 04:09:32 localhost systemd[1]: Reloading. Nov 28 04:09:32 localhost systemd-rc-local-generator[108461]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:09:32 localhost systemd-sysv-generator[108467]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:09:32 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:09:33 localhost python3.9[108564]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:09:33 localhost systemd[1]: Reloading. 
Nov 28 04:09:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37753 DF PROTO=TCP SPT=60094 DPT=9105 SEQ=3483768617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB083F820000000001030307) Nov 28 04:09:33 localhost systemd-rc-local-generator[108588]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:09:33 localhost systemd-sysv-generator[108593]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:09:33 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:09:33 localhost systemd[1]: Stopping nova_virtsecretd container... Nov 28 04:09:33 localhost systemd[1]: libpod-2f34ac593fdce9350ee198d200f72a44d8feebd361e95898260c904680cf8f76.scope: Deactivated successfully. 
Nov 28 04:09:33 localhost podman[108604]: 2025-11-28 09:09:33.634088529 +0000 UTC m=+0.074453179 container died 2f34ac593fdce9350ee198d200f72a44d8feebd361e95898260c904680cf8f76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, release=1761123044, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_virtsecretd, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt) Nov 28 04:09:33 localhost podman[108604]: 2025-11-28 09:09:33.673130529 +0000 UTC m=+0.113495149 container cleanup 2f34ac593fdce9350ee198d200f72a44d8feebd361e95898260c904680cf8f76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, 
architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtsecretd, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team) Nov 28 04:09:33 localhost podman[108604]: nova_virtsecretd Nov 28 04:09:33 localhost podman[108617]: 2025-11-28 09:09:33.714370678 +0000 UTC m=+0.067633908 container cleanup 2f34ac593fdce9350ee198d200f72a44d8feebd361e95898260c904680cf8f76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, container_name=nova_virtsecretd, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-11-19T00:35:22Z, architecture=x86_64, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com) Nov 28 04:09:33 localhost systemd[1]: libpod-conmon-2f34ac593fdce9350ee198d200f72a44d8feebd361e95898260c904680cf8f76.scope: Deactivated successfully. Nov 28 04:09:33 localhost podman[108645]: error opening file `/run/crun/2f34ac593fdce9350ee198d200f72a44d8feebd361e95898260c904680cf8f76/status`: No such file or directory Nov 28 04:09:33 localhost podman[108634]: 2025-11-28 09:09:33.829673983 +0000 UTC m=+0.075336867 container cleanup 2f34ac593fdce9350ee198d200f72a44d8feebd361e95898260c904680cf8f76 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', 
'/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=nova_virtsecretd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, architecture=x86_64, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Nov 28 04:09:33 localhost podman[108634]: nova_virtsecretd Nov 28 04:09:33 localhost systemd[1]: tripleo_nova_virtsecretd.service: Deactivated successfully. Nov 28 04:09:33 localhost systemd[1]: Stopped nova_virtsecretd container. 
Nov 28 04:09:34 localhost systemd[1]: var-lib-containers-storage-overlay-e56fe7373565106f757a890fd2f61ff50243f466d810e377560c4fab1b4ea8c4-merged.mount: Deactivated successfully. Nov 28 04:09:34 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2f34ac593fdce9350ee198d200f72a44d8feebd361e95898260c904680cf8f76-userdata-shm.mount: Deactivated successfully. Nov 28 04:09:34 localhost python3.9[108740]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:09:34 localhost systemd[1]: Reloading. Nov 28 04:09:34 localhost systemd-rc-local-generator[108768]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:09:34 localhost systemd-sysv-generator[108772]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:09:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:09:35 localhost systemd[1]: Stopping nova_virtstoraged container... Nov 28 04:09:35 localhost systemd[1]: libpod-635237f1fe33e7fa6bf9dc6e26e9b13e7bb3027f3e4c594c17ae527cf8241951.scope: Deactivated successfully. 
Nov 28 04:09:35 localhost podman[108781]: 2025-11-28 09:09:35.243776511 +0000 UTC m=+0.059770835 container died 635237f1fe33e7fa6bf9dc6e26e9b13e7bb3027f3e4c594c17ae527cf8241951 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_virtstoraged, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-libvirt, config_id=tripleo_step3, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.expose-services=, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.12) Nov 28 04:09:35 localhost podman[108781]: 2025-11-28 09:09:35.289057685 +0000 UTC m=+0.105051999 container cleanup 635237f1fe33e7fa6bf9dc6e26e9b13e7bb3027f3e4c594c17ae527cf8241951 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, batch=17.1_20251118.1, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 
'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:35:22Z, container_name=nova_virtstoraged, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step3, vendor=Red 
Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Nov 28 04:09:35 localhost podman[108781]: nova_virtstoraged Nov 28 04:09:35 localhost podman[108795]: 2025-11-28 09:09:35.333577596 +0000 UTC m=+0.076770513 container cleanup 635237f1fe33e7fa6bf9dc6e26e9b13e7bb3027f3e4c594c17ae527cf8241951 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, version=17.1.12, architecture=x86_64, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, config_id=tripleo_step3, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_virtstoraged, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., 
build-date=2025-11-19T00:35:22Z) Nov 28 04:09:35 localhost systemd[1]: libpod-conmon-635237f1fe33e7fa6bf9dc6e26e9b13e7bb3027f3e4c594c17ae527cf8241951.scope: Deactivated successfully. Nov 28 04:09:35 localhost podman[108825]: error opening file `/run/crun/635237f1fe33e7fa6bf9dc6e26e9b13e7bb3027f3e4c594c17ae527cf8241951/status`: No such file or directory Nov 28 04:09:35 localhost podman[108813]: 2025-11-28 09:09:35.431608675 +0000 UTC m=+0.065204652 container cleanup 635237f1fe33e7fa6bf9dc6e26e9b13e7bb3027f3e4c594c17ae527cf8241951 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, release=1761123044, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '0f0904943dda1bf1d123bdf96d71020f'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=nova_virtstoraged) Nov 28 04:09:35 localhost podman[108813]: nova_virtstoraged Nov 28 04:09:35 localhost systemd[1]: tripleo_nova_virtstoraged.service: Deactivated successfully. Nov 28 04:09:35 localhost systemd[1]: Stopped nova_virtstoraged container. 
Nov 28 04:09:35 localhost systemd[1]: tmp-crun.xpCcLo.mount: Deactivated successfully. Nov 28 04:09:35 localhost systemd[1]: var-lib-containers-storage-overlay-fc6f1986e221e102f7feefd3b96110a2bdd081d61203281438672df579c3b8f4-merged.mount: Deactivated successfully. Nov 28 04:09:35 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-635237f1fe33e7fa6bf9dc6e26e9b13e7bb3027f3e4c594c17ae527cf8241951-userdata-shm.mount: Deactivated successfully. Nov 28 04:09:36 localhost python3.9[108918]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_controller.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:09:36 localhost systemd[1]: Reloading. Nov 28 04:09:36 localhost systemd-sysv-generator[108949]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:09:36 localhost systemd-rc-local-generator[108942]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:09:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7905 DF PROTO=TCP SPT=40818 DPT=9102 SEQ=535589742 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB084B820000000001030307) Nov 28 04:09:36 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:09:36 localhost systemd[1]: Stopping ovn_controller container... Nov 28 04:09:36 localhost systemd[1]: libpod-3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.scope: Deactivated successfully. 
Nov 28 04:09:36 localhost systemd[1]: libpod-3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.scope: Consumed 2.618s CPU time. Nov 28 04:09:36 localhost podman[108959]: 2025-11-28 09:09:36.628936991 +0000 UTC m=+0.076541424 container died 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, config_id=tripleo_step4, distribution-scope=public, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 28 04:09:36 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.timer: Deactivated successfully. Nov 28 04:09:36 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e. Nov 28 04:09:36 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed to open /run/systemd/transient/3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: No such file or directory Nov 28 04:09:36 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e-userdata-shm.mount: Deactivated successfully. Nov 28 04:09:36 localhost systemd[1]: var-lib-containers-storage-overlay-069878c0f25bf999d8bcfd5288562c93f75c544f63f9cb3c25f6c968781689e7-merged.mount: Deactivated successfully. 
Nov 28 04:09:36 localhost podman[108959]: 2025-11-28 09:09:36.674815924 +0000 UTC m=+0.122420337 container cleanup 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, release=1761123044, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc.) Nov 28 04:09:36 localhost podman[108959]: ovn_controller Nov 28 04:09:36 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.timer: Failed to open /run/systemd/transient/3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.timer: No such file or directory Nov 28 04:09:36 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed to open /run/systemd/transient/3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: No such file or directory Nov 28 04:09:36 localhost podman[108971]: 2025-11-28 09:09:36.724531735 +0000 UTC m=+0.080975392 container cleanup 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, 
com.redhat.component=openstack-ovn-controller-container, release=1761123044, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Nov 28 04:09:36 localhost systemd[1]: libpod-conmon-3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.scope: Deactivated successfully. 
Nov 28 04:09:36 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.timer: Failed to open /run/systemd/transient/3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.timer: No such file or directory Nov 28 04:09:36 localhost systemd[1]: 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: Failed to open /run/systemd/transient/3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e.service: No such file or directory Nov 28 04:09:36 localhost podman[108986]: 2025-11-28 09:09:36.819538821 +0000 UTC m=+0.065387999 container cleanup 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, vcs-type=git, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat 
OpenStack Platform 17.1 ovn-controller, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1) Nov 28 04:09:36 localhost podman[108986]: ovn_controller Nov 28 04:09:36 localhost systemd[1]: tripleo_ovn_controller.service: Deactivated successfully. Nov 28 04:09:36 localhost systemd[1]: Stopped ovn_controller container. Nov 28 04:09:37 localhost python3.9[109089]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_metadata_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:09:37 localhost systemd[1]: Reloading. Nov 28 04:09:37 localhost systemd-rc-local-generator[109113]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:09:37 localhost systemd-sysv-generator[109119]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:09:37 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:09:37 localhost systemd[1]: Stopping ovn_metadata_agent container... 
Nov 28 04:09:38 localhost systemd[1]: libpod-1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.scope: Deactivated successfully. Nov 28 04:09:38 localhost systemd[1]: libpod-1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.scope: Consumed 10.934s CPU time. Nov 28 04:09:38 localhost podman[109130]: 2025-11-28 09:09:38.966098781 +0000 UTC m=+1.018096261 container died 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_id=tripleo_step4, release=1761123044, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64) Nov 28 04:09:38 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.timer: Deactivated successfully. Nov 28 04:09:38 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633. Nov 28 04:09:38 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed to open /run/systemd/transient/1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: No such file or directory Nov 28 04:09:39 localhost systemd[1]: tmp-crun.7iLb7H.mount: Deactivated successfully. 
Nov 28 04:09:39 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633-userdata-shm.mount: Deactivated successfully. Nov 28 04:09:39 localhost systemd[1]: var-lib-containers-storage-overlay-278e8886c08486a895bd57117a67612048d79e44c13305c2caa42c992931b27d-merged.mount: Deactivated successfully. Nov 28 04:09:39 localhost podman[109130]: 2025-11-28 09:09:39.044996367 +0000 UTC m=+1.096993797 container cleanup 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.buildah.version=1.41.4, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 28 04:09:39 localhost podman[109130]: ovn_metadata_agent Nov 28 04:09:39 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.timer: Failed to open /run/systemd/transient/1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.timer: No such file or directory Nov 28 04:09:39 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed to open /run/systemd/transient/1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: No such file or directory Nov 28 04:09:39 localhost 
podman[109142]: 2025-11-28 09:09:39.067818924 +0000 UTC m=+0.093531231 container cleanup 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 28 04:09:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40587 DF PROTO=TCP SPT=57634 DPT=9100 SEQ=456275993 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0857820000000001030307) Nov 28 04:09:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36018 DF PROTO=TCP SPT=57636 DPT=9100 SEQ=2153070501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0863420000000001030307) Nov 28 04:09:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64177 DF PROTO=TCP SPT=51402 DPT=9882 SEQ=2193097018 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0875C20000000001030307) Nov 28 
04:09:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55959 DF PROTO=TCP SPT=41712 DPT=9105 SEQ=4262444411 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB08795B0000000001030307) Nov 28 04:09:51 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55961 DF PROTO=TCP SPT=41712 DPT=9105 SEQ=4262444411 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0885820000000001030307) Nov 28 04:09:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55962 DF PROTO=TCP SPT=41712 DPT=9105 SEQ=4262444411 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0895430000000001030307) Nov 28 04:09:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58888 DF PROTO=TCP SPT=52048 DPT=9101 SEQ=3011082735 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB08A04A0000000001030307) Nov 28 04:10:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58890 DF PROTO=TCP SPT=52048 DPT=9101 SEQ=3011082735 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB08AC420000000001030307) Nov 28 04:10:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23435 DF PROTO=TCP SPT=57294 DPT=9102 SEQ=2423972304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080ACB08B5420000000001030307) Nov 28 04:10:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47641 DF PROTO=TCP SPT=56056 DPT=9100 SEQ=274390103 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB08C0C20000000001030307) Nov 28 04:10:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23437 DF PROTO=TCP SPT=57294 DPT=9102 SEQ=2423972304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB08CD020000000001030307) Nov 28 04:10:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47643 DF PROTO=TCP SPT=56056 DPT=9100 SEQ=274390103 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB08D8820000000001030307) Nov 28 04:10:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36413 DF PROTO=TCP SPT=35634 DPT=9882 SEQ=1664321515 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB08EB020000000001030307) Nov 28 04:10:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55519 DF PROTO=TCP SPT=34028 DPT=9105 SEQ=454237546 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB08F2820000000001030307) Nov 28 04:10:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55520 DF PROTO=TCP SPT=34028 DPT=9105 SEQ=454237546 ACK=0 WINDOW=32640 RES=0x00 
SYN URGP=0 OPT (020405500402080ACB08FA820000000001030307) Nov 28 04:10:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55521 DF PROTO=TCP SPT=34028 DPT=9105 SEQ=454237546 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB090A430000000001030307) Nov 28 04:10:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53353 DF PROTO=TCP SPT=43420 DPT=9101 SEQ=2773884380 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB09157A0000000001030307) Nov 28 04:10:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53355 DF PROTO=TCP SPT=43420 DPT=9101 SEQ=2773884380 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0921820000000001030307) Nov 28 04:10:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9006 DF PROTO=TCP SPT=51120 DPT=9102 SEQ=239423527 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB092A820000000001030307) Nov 28 04:10:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58283 DF PROTO=TCP SPT=59492 DPT=9102 SEQ=2294016223 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0935820000000001030307) Nov 28 04:10:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36021 DF PROTO=TCP SPT=57636 DPT=9100 SEQ=2153070501 ACK=0 
WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0941820000000001030307) Nov 28 04:10:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7253 DF PROTO=TCP SPT=49748 DPT=9100 SEQ=2026696104 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB094DC30000000001030307) Nov 28 04:10:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47504 DF PROTO=TCP SPT=52916 DPT=9882 SEQ=35957365 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0960420000000001030307) Nov 28 04:10:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48678 DF PROTO=TCP SPT=40998 DPT=9105 SEQ=3512029773 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0963BB0000000001030307) Nov 28 04:10:51 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48680 DF PROTO=TCP SPT=40998 DPT=9105 SEQ=3512029773 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB096FC20000000001030307) Nov 28 04:10:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48681 DF PROTO=TCP SPT=40998 DPT=9105 SEQ=3512029773 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB097F820000000001030307) Nov 28 04:10:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46832 DF PROTO=TCP SPT=59510 DPT=9101 
SEQ=4033662674 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB098AA90000000001030307) Nov 28 04:11:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46834 DF PROTO=TCP SPT=59510 DPT=9101 SEQ=4033662674 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0996C20000000001030307) Nov 28 04:11:03 localhost systemd[1]: tripleo_ovn_metadata_agent.service: State 'stop-sigterm' timed out. Killing. Nov 28 04:11:03 localhost systemd[1]: tripleo_ovn_metadata_agent.service: Killing process 70684 (conmon) with signal SIGKILL. Nov 28 04:11:03 localhost systemd[1]: tripleo_ovn_metadata_agent.service: Main process exited, code=killed, status=9/KILL Nov 28 04:11:03 localhost systemd[1]: libpod-conmon-1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.scope: Deactivated successfully. Nov 28 04:11:03 localhost podman[109380]: error opening file `/run/crun/1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633/status`: No such file or directory Nov 28 04:11:03 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.timer: Failed to open /run/systemd/transient/1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.timer: No such file or directory Nov 28 04:11:03 localhost systemd[1]: 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: Failed to open /run/systemd/transient/1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633.service: No such file or directory Nov 28 04:11:03 localhost podman[109368]: 2025-11-28 09:11:03.356661159 +0000 UTC m=+0.086092211 container cleanup 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
build-date=2025-11-19T00:14:25Z, vcs-type=git, distribution-scope=public, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.openshift.expose-services=, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 04:11:03 localhost podman[109368]: ovn_metadata_agent Nov 28 04:11:03 localhost systemd[1]: tripleo_ovn_metadata_agent.service: Failed with result 'timeout'. Nov 28 04:11:03 localhost systemd[1]: Stopped ovn_metadata_agent container. Nov 28 04:11:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48682 DF PROTO=TCP SPT=40998 DPT=9105 SEQ=3512029773 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB099F820000000001030307) Nov 28 04:11:04 localhost python3.9[109473]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_rsyslog.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:11:05 localhost systemd[1]: Reloading. Nov 28 04:11:05 localhost systemd-rc-local-generator[109499]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:11:05 localhost systemd-sysv-generator[109505]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 28 04:11:05 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:11:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49224 DF PROTO=TCP SPT=44206 DPT=9100 SEQ=2623352065 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB09AB420000000001030307) Nov 28 04:11:07 localhost python3.9[109603]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:07 localhost python3.9[109695]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:08 localhost python3.9[109787]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:09 localhost 
python3.9[109879]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31420 DF PROTO=TCP SPT=49396 DPT=9102 SEQ=1959724451 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB09B7420000000001030307) Nov 28 04:11:09 localhost python3.9[109971]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:10 localhost python3.9[110063]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:10 localhost python3.9[110155]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S 
unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:11 localhost python3.9[110247]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:12 localhost python3.9[110339]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47506 DF PROTO=TCP SPT=52916 DPT=9882 SEQ=35957365 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB09C1820000000001030307) Nov 28 04:11:12 localhost python3.9[110431]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:13 localhost 
python3.9[110523]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:13 localhost python3.9[110615]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:14 localhost python3.9[110707]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:15 localhost python3.9[110799]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:15 localhost python3.9[110891]: ansible-ansible.builtin.file Invoked with 
path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:16 localhost python3.9[110983]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:16 localhost python3.9[111075]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45121 DF PROTO=TCP SPT=51856 DPT=9882 SEQ=2081403544 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB09D5820000000001030307) Nov 28 04:11:17 localhost python3.9[111167]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:17 localhost python3.9[111259]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:18 localhost python3.9[111351]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:19 localhost python3.9[111443]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57945 DF PROTO=TCP SPT=41722 DPT=9105 SEQ=2884774350 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB09DD020000000001030307) Nov 28 04:11:20 localhost python3.9[111535]: 
ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:21 localhost python3.9[111627]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57946 DF PROTO=TCP SPT=41722 DPT=9105 SEQ=2884774350 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB09E5020000000001030307) Nov 28 04:11:21 localhost python3.9[111719]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:22 localhost python3.9[111811]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:22 localhost python3.9[111903]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:23 localhost python3.9[111995]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:24 localhost python3.9[112087]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:24 localhost python3.9[112179]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None 
group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57947 DF PROTO=TCP SPT=41722 DPT=9105 SEQ=2884774350 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB09F4C30000000001030307) Nov 28 04:11:25 localhost python3.9[112271]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:25 localhost python3.9[112363]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:26 localhost python3.9[112455]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:27 localhost python3.9[112547]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent 
recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:27 localhost python3.9[112639]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31253 DF PROTO=TCP SPT=52284 DPT=9101 SEQ=1031790436 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB09FFDA0000000001030307) Nov 28 04:11:28 localhost python3.9[112731]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:28 localhost python3.9[112823]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None 
group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:29 localhost python3.9[112915]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:29 localhost python3.9[113007]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:30 localhost python3.9[113099]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31255 DF PROTO=TCP SPT=52284 DPT=9101 SEQ=1031790436 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0A0C020000000001030307) Nov 28 04:11:31 localhost python3.9[113191]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.service state=absent 
recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:31 localhost python3.9[113283]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:32 localhost python3.9[113375]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:11:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12806 DF PROTO=TCP SPT=59416 DPT=9102 SEQ=1390093034 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0A14C30000000001030307) Nov 28 04:11:34 localhost python3.9[113467]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None 
chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:11:35 localhost python3.9[113559]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Nov 28 04:11:35 localhost python3.9[113651]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 28 04:11:35 localhost systemd[1]: Reloading. Nov 28 04:11:36 localhost systemd-rc-local-generator[113677]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:11:36 localhost systemd-sysv-generator[113682]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:11:36 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 28 04:11:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13450 DF PROTO=TCP SPT=43218 DPT=9100 SEQ=3957548637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0A20420000000001030307) Nov 28 04:11:36 localhost python3.9[113779]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:11:37 localhost python3.9[113872]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_ipmi.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:11:38 localhost python3.9[113965]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_collectd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:11:38 localhost python3.9[114058]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_iscsid.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:11:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7256 DF PROTO=TCP SPT=49748 DPT=9100 SEQ=2026696104 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080ACB0A2B820000000001030307) Nov 28 04:11:39 localhost python3.9[114151]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_logrotate_crond.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:11:39 localhost python3.9[114244]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_metrics_qdr.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:11:40 localhost python3.9[114337]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_dhcp.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:11:41 localhost python3.9[114430]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_l3_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:11:41 localhost python3.9[114523]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_ovs_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:11:42 localhost python3.9[114616]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None 
chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:11:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13452 DF PROTO=TCP SPT=43218 DPT=9100 SEQ=3957548637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0A38020000000001030307) Nov 28 04:11:43 localhost python3.9[114709]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:11:43 localhost python3.9[114802]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:11:44 localhost python3.9[114895]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:11:44 localhost python3.9[114988]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:11:45 localhost python3.9[115081]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True 
strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:11:46 localhost python3.9[115174]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud_recover.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:11:46 localhost python3.9[115267]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:11:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6490 DF PROTO=TCP SPT=43228 DPT=9882 SEQ=1533842803 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0A4A820000000001030307) Nov 28 04:11:48 localhost python3.9[115360]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:11:48 localhost python3.9[115453]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_controller.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:11:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 
ID=35531 DF PROTO=TCP SPT=36474 DPT=9105 SEQ=3246025437 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0A52420000000001030307) Nov 28 04:11:49 localhost python3.9[115546]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_metadata_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:11:50 localhost python3.9[115639]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_rsyslog.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:11:51 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35532 DF PROTO=TCP SPT=36474 DPT=9105 SEQ=3246025437 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0A5A420000000001030307) Nov 28 04:11:53 localhost systemd[1]: session-37.scope: Deactivated successfully. Nov 28 04:11:53 localhost systemd[1]: session-37.scope: Consumed 47.937s CPU time. Nov 28 04:11:53 localhost systemd-logind[764]: Session 37 logged out. Waiting for processes to exit. Nov 28 04:11:53 localhost systemd-logind[764]: Removed session 37. 
Nov 28 04:11:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35533 DF PROTO=TCP SPT=36474 DPT=9105 SEQ=3246025437 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0A6A030000000001030307) Nov 28 04:11:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22290 DF PROTO=TCP SPT=51966 DPT=9101 SEQ=3923722619 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0A75090000000001030307) Nov 28 04:12:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22292 DF PROTO=TCP SPT=51966 DPT=9101 SEQ=3923722619 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0A81020000000001030307) Nov 28 04:12:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35534 DF PROTO=TCP SPT=36474 DPT=9105 SEQ=3246025437 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0A89830000000001030307) Nov 28 04:12:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13522 DF PROTO=TCP SPT=60118 DPT=9100 SEQ=1126112828 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0A95820000000001030307) Nov 28 04:12:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49229 DF PROTO=TCP SPT=44206 DPT=9100 SEQ=2623352065 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080ACB0AA1830000000001030307) Nov 28 04:12:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13524 DF PROTO=TCP SPT=60118 DPT=9100 SEQ=1126112828 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0AAD430000000001030307) Nov 28 04:12:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29832 DF PROTO=TCP SPT=50094 DPT=9882 SEQ=1137643269 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0ABFC30000000001030307) Nov 28 04:12:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38586 DF PROTO=TCP SPT=42676 DPT=9105 SEQ=2931150111 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0AC7420000000001030307) Nov 28 04:12:20 localhost sshd[115732]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:12:20 localhost systemd-logind[764]: New session 38 of user zuul. Nov 28 04:12:20 localhost systemd[1]: Started Session 38 of User zuul. 
Nov 28 04:12:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38587 DF PROTO=TCP SPT=42676 DPT=9105 SEQ=2931150111 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0ACF430000000001030307) Nov 28 04:12:21 localhost python3.9[115825]: ansible-ansible.legacy.ping Invoked with data=pong Nov 28 04:12:22 localhost python3.9[115929]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 28 04:12:23 localhost python3.9[116021]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:12:24 localhost python3.9[116114]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:12:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38588 DF PROTO=TCP SPT=42676 DPT=9105 SEQ=2931150111 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0ADF020000000001030307) Nov 28 04:12:25 localhost python3.9[116206]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:12:25 
localhost python3.9[116298]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:12:26 localhost python3.9[116371]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321145.4133835-177-143025092198065/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:12:27 localhost python3.9[116463]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 28 04:12:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27309 DF PROTO=TCP SPT=38220 DPT=9101 SEQ=842800084 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0AEA390000000001030307) Nov 28 04:12:28 localhost python3.9[116559]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:12:29 localhost python3.9[116651]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:12:29 localhost python3.9[116741]: ansible-ansible.builtin.service_facts Invoked Nov 28 04:12:30 localhost network[116758]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 28 04:12:30 localhost network[116759]: 'network-scripts' will be removed from distribution in near future. Nov 28 04:12:30 localhost network[116760]: It is advised to switch to 'NetworkManager' instead for network management. Nov 28 04:12:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27311 DF PROTO=TCP SPT=38220 DPT=9101 SEQ=842800084 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0AF6430000000001030307) Nov 28 04:12:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 28 04:12:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6898 DF PROTO=TCP SPT=56826 DPT=9102 SEQ=3374587745 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0AFF420000000001030307) Nov 28 04:12:35 localhost python3.9[116957]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:12:36 localhost python3.9[117047]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 28 04:12:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7407 DF PROTO=TCP SPT=45554 DPT=9100 SEQ=2959990013 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0B0AC20000000001030307) Nov 28 04:12:37 localhost python3.9[117143]: ansible-ansible.legacy.command Invoked with _raw_params=# This is a hack to deploy RDO Delorean repos to RHEL as if it were Centos 9 Stream#012set -euxo pipefail#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./repo-setup-main#012# This is required for FIPS enabled until trunk.rdoproject.org#012# is not being served from a centos7 host, tracked by#012# https://issues.redhat.com/browse/RHOSZUUL-1517#012dnf -y install crypto-policies#012update-crypto-policies --set FIPS:NO-ENFORCE-EMS#012./venv/bin/repo-setup current-podified -b antelope -d 
centos9 --stream#012#012# Exclude ceph-common-18.2.7 as it's pulling newer openssl not compatible#012# with rhel 9.2 openssh#012dnf config-manager --setopt centos9-storage.exclude="ceph-common-18.2.7" --save#012# FIXME: perform dnf upgrade for other packages in EDPM ansible#012# here we only ensuring that decontainerized libvirt can start#012dnf -y upgrade openstack-selinux#012rm -f /run/virtlogd.pid#012#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:12:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13455 DF PROTO=TCP SPT=43218 DPT=9100 SEQ=3957548637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0B15830000000001030307) Nov 28 04:12:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7409 DF PROTO=TCP SPT=45554 DPT=9100 SEQ=2959990013 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0B22820000000001030307) Nov 28 04:12:46 localhost systemd[1]: Stopping OpenSSH server daemon... Nov 28 04:12:46 localhost systemd[1]: sshd.service: Deactivated successfully. Nov 28 04:12:46 localhost systemd[1]: Stopped OpenSSH server daemon. Nov 28 04:12:46 localhost systemd[1]: sshd.service: Consumed 1.441s CPU time. Nov 28 04:12:46 localhost systemd[1]: Stopped target sshd-keygen.target. Nov 28 04:12:46 localhost systemd[1]: Stopping sshd-keygen.target... Nov 28 04:12:46 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). 
Nov 28 04:12:46 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Nov 28 04:12:46 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Nov 28 04:12:46 localhost systemd[1]: Reached target sshd-keygen.target. Nov 28 04:12:46 localhost systemd[1]: Starting OpenSSH server daemon... Nov 28 04:12:46 localhost sshd[117187]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:12:46 localhost systemd[1]: Started OpenSSH server daemon. Nov 28 04:12:46 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Nov 28 04:12:46 localhost systemd[1]: Starting man-db-cache-update.service... Nov 28 04:12:46 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Nov 28 04:12:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11896 DF PROTO=TCP SPT=42266 DPT=9882 SEQ=2507771196 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0B35020000000001030307) Nov 28 04:12:47 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Nov 28 04:12:47 localhost systemd[1]: Finished man-db-cache-update.service. Nov 28 04:12:47 localhost systemd[1]: run-r4557b5a9d04f48e79325a350f4e111bf.service: Deactivated successfully. Nov 28 04:12:47 localhost systemd[1]: run-rf4bf9086689948fcbad8d1e7ff44a82b.service: Deactivated successfully. Nov 28 04:12:47 localhost systemd[1]: Stopping OpenSSH server daemon... Nov 28 04:12:47 localhost systemd[1]: sshd.service: Deactivated successfully. Nov 28 04:12:47 localhost systemd[1]: Stopped OpenSSH server daemon. 
Nov 28 04:12:47 localhost systemd[1]: Stopped target sshd-keygen.target. Nov 28 04:12:47 localhost systemd[1]: Stopping sshd-keygen.target... Nov 28 04:12:47 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Nov 28 04:12:47 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Nov 28 04:12:47 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Nov 28 04:12:47 localhost systemd[1]: Reached target sshd-keygen.target. Nov 28 04:12:47 localhost systemd[1]: Starting OpenSSH server daemon... Nov 28 04:12:47 localhost sshd[117359]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:12:47 localhost systemd[1]: Started OpenSSH server daemon. 
Nov 28 04:12:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2 DF PROTO=TCP SPT=35578 DPT=9105 SEQ=1878191914 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0B387B0000000001030307)
Nov 28 04:12:51 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4 DF PROTO=TCP SPT=35578 DPT=9105 SEQ=1878191914 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0B44820000000001030307)
Nov 28 04:12:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5 DF PROTO=TCP SPT=35578 DPT=9105 SEQ=1878191914 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0B54420000000001030307)
Nov 28 04:12:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54495 DF PROTO=TCP SPT=52084 DPT=9101 SEQ=1576886192 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0B5F690000000001030307)
Nov 28 04:13:00 localhost podman[117555]: 2025-11-28 09:13:00.492168491 +0000 UTC m=+0.092795288 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, version=7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, release=553, GIT_CLEAN=True, distribution-scope=public, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, build-date=2025-09-24T08:57:55)
Nov 28 04:13:00 localhost podman[117555]: 2025-11-28 09:13:00.595500935 +0000 UTC m=+0.196127692 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, name=rhceph, description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.component=rhceph-container, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.buildah.version=1.33.12, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, CEPH_POINT_RELEASE=, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 04:13:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54497 DF PROTO=TCP SPT=52084 DPT=9101 SEQ=1576886192 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0B6B820000000001030307)
Nov 28 04:13:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43740 DF PROTO=TCP SPT=39052 DPT=9102 SEQ=2134080702 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0B74420000000001030307)
Nov 28 04:13:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32980 DF PROTO=TCP SPT=60448 DPT=9102 SEQ=2357390439 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0B7F820000000001030307)
Nov 28 04:13:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13527 DF PROTO=TCP SPT=60118 DPT=9100 SEQ=1126112828 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0B8B820000000001030307)
Nov 28 04:13:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53970 DF PROTO=TCP SPT=41942 DPT=9100 SEQ=800026680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0B97C20000000001030307)
Nov 28 04:13:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8233 DF PROTO=TCP SPT=36980 DPT=9882 SEQ=4151490587 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0BAA430000000001030307)
Nov 28 04:13:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59422 DF PROTO=TCP SPT=53994 DPT=9105 SEQ=2631582463 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0BB1C20000000001030307)
Nov 28 04:13:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59423 DF PROTO=TCP SPT=53994 DPT=9105 SEQ=2631582463 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0BB9C20000000001030307)
Nov 28 04:13:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59424 DF PROTO=TCP SPT=53994 DPT=9105 SEQ=2631582463 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0BC9830000000001030307)
Nov 28 04:13:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64987 DF PROTO=TCP SPT=47716 DPT=9101 SEQ=549047946 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0BD4990000000001030307)
Nov 28 04:13:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64989 DF PROTO=TCP SPT=47716 DPT=9101 SEQ=549047946 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0BE0820000000001030307)
Nov 28 04:13:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59425 DF PROTO=TCP SPT=53994 DPT=9105 SEQ=2631582463 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0BE9820000000001030307)
Nov 28 04:13:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44952 DF PROTO=TCP SPT=35102 DPT=9100 SEQ=3646895248 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0BF5020000000001030307)
Nov 28 04:13:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4950 DF PROTO=TCP SPT=40686 DPT=9102 SEQ=27247962 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0C01420000000001030307)
Nov 28 04:13:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8235 DF PROTO=TCP SPT=36980 DPT=9882 SEQ=4151490587 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0C0B820000000001030307)
Nov 28 04:13:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25418 DF PROTO=TCP SPT=48412 DPT=9882 SEQ=2403479743 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0C1F430000000001030307)
Nov 28 04:13:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42779 DF PROTO=TCP SPT=34424 DPT=9105 SEQ=850738264 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0C27020000000001030307)
Nov 28 04:13:51 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42780 DF PROTO=TCP SPT=34424 DPT=9105 SEQ=850738264 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0C2F030000000001030307)
Nov 28 04:13:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42781 DF PROTO=TCP SPT=34424 DPT=9105 SEQ=850738264 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0C3EC20000000001030307)
Nov 28 04:13:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38049 DF PROTO=TCP SPT=37430 DPT=9101 SEQ=2111172106 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0C49CB0000000001030307)
Nov 28 04:13:59 localhost kernel: SELinux: Converting 2754 SID table entries...
Nov 28 04:13:59 localhost kernel: SELinux: policy capability network_peer_controls=1
Nov 28 04:13:59 localhost kernel: SELinux: policy capability open_perms=1
Nov 28 04:13:59 localhost kernel: SELinux: policy capability extended_socket_class=1
Nov 28 04:13:59 localhost kernel: SELinux: policy capability always_check_network=0
Nov 28 04:13:59 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Nov 28 04:13:59 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Nov 28 04:13:59 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Nov 28 04:14:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38051 DF PROTO=TCP SPT=37430 DPT=9101 SEQ=2111172106 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0C55C20000000001030307)
Nov 28 04:14:02 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=17 res=1
Nov 28 04:14:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51483 DF PROTO=TCP SPT=56590 DPT=9102 SEQ=1973360585 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0C5EC20000000001030307)
Nov 28 04:14:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43745 DF PROTO=TCP SPT=39052 DPT=9102 SEQ=2134080702 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0C69830000000001030307)
Nov 28 04:14:06 localhost python3.9[118241]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:14:07 localhost python3.9[118333]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/edpm.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:14:07 localhost python3.9[118406]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/edpm.fact mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321246.659117-426-172103223993847/.source.fact _original_basename=.5ew5x_ts follow=False checksum=03aee63dcf9b49b0ac4473b2f1a1b5d3783aa639 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:14:08 localhost python3.9[118496]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 04:14:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53973 DF PROTO=TCP SPT=41942 DPT=9100 SEQ=800026680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0C75820000000001030307)
Nov 28 04:14:09 localhost python3.9[118594]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 04:14:10 localhost python3.9[118648]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 04:14:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41221 DF PROTO=TCP SPT=50396 DPT=9100 SEQ=701157145 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0C82030000000001030307)
Nov 28 04:14:14 localhost systemd[1]: Reloading.
Nov 28 04:14:14 localhost systemd-rc-local-generator[118681]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 04:14:14 localhost systemd-sysv-generator[118687]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 04:14:14 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 04:14:14 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 04:14:16 localhost python3.9[118787]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 04:14:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58354 DF PROTO=TCP SPT=41250 DPT=9882 SEQ=1232032057 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0C94820000000001030307)
Nov 28 04:14:18 localhost python3.9[119026]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Nov 28 04:14:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38687 DF PROTO=TCP SPT=40280 DPT=9105 SEQ=623990083 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0C9C030000000001030307)
Nov 28 04:14:19 localhost python3.9[119118]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Nov 28 04:14:20 localhost python3.9[119211]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:14:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38688 DF PROTO=TCP SPT=40280 DPT=9105 SEQ=623990083 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0CA4020000000001030307)
Nov 28 04:14:21 localhost python3.9[119303]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Nov 28 04:14:23 localhost python3.9[119395]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 04:14:23 localhost python3.9[119487]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:14:24 localhost python3.9[119560]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321263.4025307-750-234091474317493/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=777d5f6763fde4fea484664803960858c2bba706 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:14:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38689 DF PROTO=TCP SPT=40280 DPT=9105 SEQ=623990083 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0CB3C30000000001030307)
Nov 28 04:14:25 localhost python3.9[119652]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 04:14:27 localhost python3.9[119746]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Nov 28 04:14:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62894 DF PROTO=TCP SPT=49510 DPT=9101 SEQ=1712621461 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0CBEFA0000000001030307)
Nov 28 04:14:28 localhost python3.9[119839]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Nov 28 04:14:29 localhost python3.9[119932]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 28 04:14:29 localhost python3.9[120030]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Nov 28 04:14:30 localhost python3.9[120122]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 04:14:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62896 DF PROTO=TCP SPT=49510 DPT=9101 SEQ=1712621461 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0CCB020000000001030307)
Nov 28 04:14:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38690 DF PROTO=TCP SPT=40280 DPT=9105 SEQ=623990083 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0CD3830000000001030307)
Nov 28 04:14:34 localhost python3.9[120216]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 04:14:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8860 DF PROTO=TCP SPT=57342 DPT=9100 SEQ=3269640789 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0CDF820000000001030307)
Nov 28 04:14:39 localhost python3.9[120309]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:14:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44957 DF PROTO=TCP SPT=35102 DPT=9100 SEQ=3646895248 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0CEB830000000001030307)
Nov 28 04:14:39 localhost python3.9[120382]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321278.9657657-1023-43509833346468/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 04:14:42 localhost sshd[120397]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 04:14:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8862 DF PROTO=TCP SPT=57342 DPT=9100 SEQ=3269640789 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0CF7420000000001030307)
Nov 28 04:14:44 localhost python3.9[120476]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 04:14:45 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 28 04:14:45 localhost systemd[1]: Stopped Load Kernel Modules.
Nov 28 04:14:45 localhost systemd[1]: Stopping Load Kernel Modules...
Nov 28 04:14:45 localhost systemd[1]: Starting Load Kernel Modules...
Nov 28 04:14:45 localhost systemd-modules-load[120480]: Module 'msr' is built in
Nov 28 04:14:45 localhost systemd[1]: Finished Load Kernel Modules.
Nov 28 04:14:46 localhost python3.9[120573]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:14:46 localhost python3.9[120646]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321285.9471498-1092-17497424745442/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 04:14:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59354 DF PROTO=TCP SPT=39978 DPT=9882 SEQ=1600516699 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0D09C20000000001030307)
Nov 28 04:14:48 localhost python3.9[120738]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 04:14:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5412 DF PROTO=TCP SPT=45724 DPT=9105 SEQ=3240212235 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0D11420000000001030307)
Nov 28 04:14:51 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5413 DF PROTO=TCP SPT=45724 DPT=9105 SEQ=3240212235 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0D19420000000001030307)
Nov 28 04:14:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5414 DF PROTO=TCP SPT=45724 DPT=9105 SEQ=3240212235 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0D29020000000001030307)
Nov 28 04:14:56 localhost python3.9[120831]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 04:14:57 localhost python3.9[120923]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 28 04:14:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38332 DF PROTO=TCP SPT=41366 DPT=9101 SEQ=1105872768 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0D34290000000001030307)
Nov 28 04:14:58 localhost python3.9[121013]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 04:14:59 localhost python3.9[121105]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 04:15:00 localhost systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 28 04:15:00 localhost systemd[1]: tuned.service: Deactivated successfully.
Nov 28 04:15:00 localhost systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 28 04:15:00 localhost systemd[1]: tuned.service: Consumed 1.744s CPU time, no IO.
Nov 28 04:15:00 localhost systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 28 04:15:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38334 DF PROTO=TCP SPT=41366 DPT=9101 SEQ=1105872768 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0D40420000000001030307)
Nov 28 04:15:01 localhost systemd[1]: Started Dynamic System Tuning Daemon.
Nov 28 04:15:02 localhost python3.9[121207]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 28 04:15:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41135 DF PROTO=TCP SPT=54278 DPT=9102 SEQ=767752292 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0D49020000000001030307)
Nov 28 04:15:06 localhost python3.9[121375]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 04:15:06 localhost systemd[1]: Reloading.
Nov 28 04:15:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28466 DF PROTO=TCP SPT=33224 DPT=9100 SEQ=2100537511 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0D54C20000000001030307)
Nov 28 04:15:06 localhost systemd-rc-local-generator[121400]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 04:15:06 localhost systemd-sysv-generator[121405]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 04:15:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 04:15:07 localhost python3.9[121505]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 04:15:07 localhost systemd[1]: Reloading.
Nov 28 04:15:07 localhost systemd-rc-local-generator[121531]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:15:07 localhost systemd-sysv-generator[121537]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:15:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:15:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41224 DF PROTO=TCP SPT=50396 DPT=9100 SEQ=701157145 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0D5F820000000001030307) Nov 28 04:15:09 localhost python3.9[121635]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:15:10 localhost python3.9[121728]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:15:10 localhost kernel: Adding 1048572k swap on /swap. 
Priority:-2 extents:1 across:1048572k FS Nov 28 04:15:11 localhost python3.9[121821]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:15:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28468 DF PROTO=TCP SPT=33224 DPT=9100 SEQ=2100537511 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0D6C820000000001030307) Nov 28 04:15:12 localhost python3.9[121920]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:15:13 localhost python3.9[122013]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 28 04:15:13 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully. Nov 28 04:15:13 localhost systemd[1]: Stopped Apply Kernel Variables. Nov 28 04:15:13 localhost systemd[1]: Stopping Apply Kernel Variables... Nov 28 04:15:13 localhost systemd[1]: Starting Apply Kernel Variables... Nov 28 04:15:13 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Nov 28 04:15:13 localhost systemd[1]: Finished Apply Kernel Variables. Nov 28 04:15:14 localhost systemd[1]: session-38.scope: Deactivated successfully. Nov 28 04:15:14 localhost systemd[1]: session-38.scope: Consumed 1min 55.336s CPU time. Nov 28 04:15:14 localhost systemd-logind[764]: Session 38 logged out. Waiting for processes to exit. 
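The kernel's `Adding 1048572k swap on /swap` line is 4 KiB short of the 1 GiB file that `mkswap "/swap"` formatted: the first page of a swap area holds the swap signature/header and is never used for paging, so usable space is file size minus one page. The arithmetic, under the assumption of a 4 KiB x86_64 page size:

```python
# A 1 GiB swap file, as formatted by `mkswap /swap` in the log above.
FILE_SIZE_KIB = 1024 * 1024          # 1 GiB = 1048576 KiB
PAGE_SIZE_KIB = 4                    # x86_64 page size; reserved for the swap header

usable_kib = FILE_SIZE_KIB - PAGE_SIZE_KIB
print(usable_kib)                    # 1048572, matching "Adding 1048572k swap"
```

`Priority:-2` is simply the default auto-assigned priority for a swap area activated without an explicit `pri=` option.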
Nov 28 04:15:14 localhost systemd-logind[764]: Removed session 38. Nov 28 04:15:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57701 DF PROTO=TCP SPT=53970 DPT=9882 SEQ=1263280768 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0D7F020000000001030307) Nov 28 04:15:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29488 DF PROTO=TCP SPT=43470 DPT=9105 SEQ=2570715432 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0D86820000000001030307) Nov 28 04:15:19 localhost sshd[122033]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:15:19 localhost systemd-logind[764]: New session 39 of user zuul. Nov 28 04:15:19 localhost systemd[1]: Started Session 39 of User zuul. Nov 28 04:15:20 localhost python3.9[122126]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 28 04:15:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29489 DF PROTO=TCP SPT=43470 DPT=9105 SEQ=2570715432 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0D8E820000000001030307) Nov 28 04:15:22 localhost python3.9[122220]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 28 04:15:23 localhost python3.9[122316]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None 
removes=None stdin=None Nov 28 04:15:24 localhost python3.9[122407]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 28 04:15:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29490 DF PROTO=TCP SPT=43470 DPT=9105 SEQ=2570715432 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0D9E420000000001030307) Nov 28 04:15:25 localhost python3.9[122503]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 28 04:15:26 localhost python3.9[122557]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 28 04:15:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24265 DF PROTO=TCP SPT=36684 DPT=9101 SEQ=1335908385 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0DA9590000000001030307) Nov 28 04:15:30 localhost python3.9[122651]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 28 04:15:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 
SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24267 DF PROTO=TCP SPT=36684 DPT=9101 SEQ=1335908385 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0DB5420000000001030307) Nov 28 04:15:32 localhost python3.9[122806]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:15:32 localhost python3.9[122898]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:15:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21035 DF PROTO=TCP SPT=34850 DPT=9102 SEQ=2583488510 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0DBE430000000001030307) Nov 28 04:15:33 localhost python3.9[123002]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:15:34 localhost python3.9[123050]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None 
access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:15:34 localhost python3.9[123142]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:15:35 localhost python3.9[123215]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321334.234006-323-74672324063493/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 28 04:15:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8351 DF PROTO=TCP SPT=45700 DPT=9102 SEQ=3988058356 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0DC9820000000001030307) Nov 28 04:15:36 localhost python3.9[123307]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Nov 28 04:15:36 localhost systemd-journald[47227]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation. 
Nov 28 04:15:36 localhost systemd-journald[47227]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating. Nov 28 04:15:36 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 28 04:15:36 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 28 04:15:36 localhost python3.9[123400]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Nov 28 04:15:37 localhost python3.9[123492]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Nov 28 04:15:37 localhost python3.9[123584]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Nov 28 04:15:39 localhost python3.9[123674]: 
ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 28 04:15:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8865 DF PROTO=TCP SPT=57342 DPT=9100 SEQ=3269640789 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0DD5820000000001030307) Nov 28 04:15:39 localhost python3.9[123768]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Nov 28 04:15:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18742 DF PROTO=TCP SPT=51008 DPT=9100 SEQ=3993436640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0DE1830000000001030307) Nov 28 04:15:43 localhost python3.9[123862]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ 
install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Nov 28 04:15:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50121 DF PROTO=TCP SPT=44726 DPT=9882 SEQ=2210243804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0DF4020000000001030307) Nov 28 04:15:48 localhost python3.9[123956]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Nov 28 04:15:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49702 DF PROTO=TCP SPT=39658 DPT=9105 SEQ=877194211 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0DFBC20000000001030307) Nov 28 04:15:51 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49703 DF PROTO=TCP SPT=39658 DPT=9105 SEQ=877194211 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0E03C20000000001030307) Nov 28 04:15:52 localhost python3.9[124056]: ansible-ansible.legacy.dnf Invoked with download_only=True 
name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Nov 28 04:15:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50122 DF PROTO=TCP SPT=44726 DPT=9882 SEQ=2210243804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0E13820000000001030307) Nov 28 04:15:56 localhost python3.9[124150]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Nov 28 04:15:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49204 DF PROTO=TCP SPT=40540 DPT=9101 SEQ=4276460536 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0E1E890000000001030307) Nov 28 04:16:00 localhost python3.9[124244]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False 
cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Nov 28 04:16:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49206 DF PROTO=TCP SPT=40540 DPT=9101 SEQ=4276460536 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0E2A830000000001030307) Nov 28 04:16:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23294 DF PROTO=TCP SPT=49232 DPT=9102 SEQ=3662492745 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0E33820000000001030307) Nov 28 04:16:05 localhost python3.9[124338]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Nov 28 04:16:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46330 DF PROTO=TCP SPT=47586 DPT=9100 SEQ=124756526 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0E3F020000000001030307) Nov 28 04:16:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23296 DF PROTO=TCP SPT=49232 DPT=9102 SEQ=3662492745 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0E4B420000000001030307) Nov 28 04:16:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46332 DF PROTO=TCP SPT=47586 DPT=9100 SEQ=124756526 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0E56C20000000001030307) Nov 28 04:16:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16903 DF PROTO=TCP SPT=35070 DPT=9882 SEQ=1773086771 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0E69420000000001030307) Nov 28 04:16:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17457 DF PROTO=TCP SPT=60926 DPT=9105 SEQ=1125394785 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0E6CCB0000000001030307) Nov 28 04:16:18 localhost python3.9[124587]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 
04:16:18 localhost python3.9[124692]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:16:19 localhost python3.9[124765]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764321378.3839915-722-222428517102657/.source.json _original_basename=.fwcp6m_k follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:16:20 localhost python3.9[124857]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Nov 28 04:16:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17459 DF PROTO=TCP SPT=60926 DPT=9105 SEQ=1125394785 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0E78C20000000001030307) Nov 28 04:16:25 
localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17460 DF PROTO=TCP SPT=60926 DPT=9105 SEQ=1125394785 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0E88820000000001030307) Nov 28 04:16:26 localhost podman[124870]: 2025-11-28 09:16:20.837235796 +0000 UTC m=+0.042189929 image pull quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified Nov 28 04:16:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57647 DF PROTO=TCP SPT=56874 DPT=9101 SEQ=1905996746 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0E93BB0000000001030307) Nov 28 04:16:28 localhost python3.9[125069]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Nov 28 04:16:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57649 DF PROTO=TCP SPT=56874 DPT=9101 SEQ=1905996746 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080ACB0E9FC20000000001030307) Nov 28 04:16:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3368 DF PROTO=TCP SPT=33524 DPT=9102 SEQ=3116283377 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0EA8C20000000001030307) Nov 28 04:16:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21040 DF PROTO=TCP SPT=34850 DPT=9102 SEQ=2583488510 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0EB3820000000001030307) Nov 28 04:16:36 localhost podman[125082]: 2025-11-28 09:16:28.743117886 +0000 UTC m=+0.045231304 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Nov 28 04:16:38 localhost python3.9[125284]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Nov 28 04:16:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18745 DF PROTO=TCP SPT=51008 DPT=9100 
SEQ=3993436640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0EBF830000000001030307)
Nov 28 04:16:40 localhost podman[125296]: 2025-11-28 09:16:38.188204953 +0000 UTC m=+0.046738470 image pull quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Nov 28 04:16:41 localhost python3.9[125459]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 04:16:42 localhost podman[125472]: 2025-11-28 09:16:41.244963625 +0000 UTC m=+0.028103392 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 04:16:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33296 DF PROTO=TCP SPT=38448 DPT=9100 SEQ=3436919252 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0ECC020000000001030307)
Nov 28 04:16:43 localhost python3.9[125633]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 04:16:46 localhost podman[125646]: 2025-11-28 09:16:43.547577273 +0000 UTC m=+0.043823469 image pull quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 28 04:16:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36958 DF PROTO=TCP SPT=50924 DPT=9882 SEQ=611293252 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0EDE820000000001030307)
Nov 28 04:16:47 localhost python3.9[125824]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 04:16:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24092 DF PROTO=TCP SPT=44216 DPT=9105 SEQ=3684348139 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0EE6020000000001030307)
Nov 28 04:16:50 localhost podman[125838]: 2025-11-28 09:16:47.905410938 +0000 UTC m=+0.048655370 image pull quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Nov 28 04:16:51 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24093 DF PROTO=TCP SPT=44216 DPT=9105 SEQ=3684348139 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0EEE020000000001030307)
Nov 28 04:16:51 localhost systemd[1]: session-39.scope: Deactivated successfully.
Nov 28 04:16:51 localhost systemd[1]: session-39.scope: Consumed 1min 31.367s CPU time.
Nov 28 04:16:51 localhost systemd-logind[764]: Session 39 logged out. Waiting for processes to exit.
Nov 28 04:16:51 localhost systemd-logind[764]: Removed session 39.
Nov 28 04:16:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24094 DF PROTO=TCP SPT=44216 DPT=9105 SEQ=3684348139 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0EFDC20000000001030307)
Nov 28 04:16:57 localhost sshd[126187]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 04:16:57 localhost systemd-logind[764]: New session 40 of user zuul.
Nov 28 04:16:57 localhost systemd[1]: Started Session 40 of User zuul.
Nov 28 04:16:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31337 DF PROTO=TCP SPT=38766 DPT=9101 SEQ=361867278 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0F08E90000000001030307)
Nov 28 04:16:58 localhost python3.9[126291]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 04:17:00 localhost python3.9[126387]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Nov 28 04:17:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31339 DF PROTO=TCP SPT=38766 DPT=9101 SEQ=361867278 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0F15020000000001030307)
Nov 28 04:17:01 localhost python3.9[126480]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 04:17:02 localhost python3.9[126534]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch3.3'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 04:17:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24095 DF PROTO=TCP SPT=44216 DPT=9105 SEQ=3684348139 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0F1D820000000001030307)
Nov 28 04:17:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23299 DF PROTO=TCP SPT=49232 DPT=9102 SEQ=3662492745 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0F29820000000001030307)
Nov 28 04:17:07 localhost python3.9[126654]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 04:17:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1098 DF PROTO=TCP SPT=60774 DPT=9102 SEQ=635022861 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0F35820000000001030307)
Nov 28 04:17:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55541 DF PROTO=TCP SPT=42570 DPT=9100 SEQ=476428184 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0F41420000000001030307)
Nov 28 04:17:13 localhost python3.9[126949]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 04:17:15 localhost python3.9[127042]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 04:17:16 localhost python3.9[127134]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Nov 28 04:17:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37031 DF PROTO=TCP SPT=55112 DPT=9882 SEQ=2605183829 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0F53C20000000001030307)
Nov 28 04:17:18 localhost kernel: SELinux: Converting 2756 SID table entries...
Nov 28 04:17:18 localhost kernel: SELinux: policy capability network_peer_controls=1
Nov 28 04:17:18 localhost kernel: SELinux: policy capability open_perms=1
Nov 28 04:17:18 localhost kernel: SELinux: policy capability extended_socket_class=1
Nov 28 04:17:18 localhost kernel: SELinux: policy capability always_check_network=0
Nov 28 04:17:18 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Nov 28 04:17:18 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Nov 28 04:17:18 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Nov 28 04:17:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54123 DF PROTO=TCP SPT=51150 DPT=9105 SEQ=2386692259 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0F5B430000000001030307)
Nov 28 04:17:19 localhost python3.9[127452]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 04:17:20 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=18 res=1
Nov 28 04:17:20 localhost python3.9[127550]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 04:17:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54124 DF PROTO=TCP SPT=51150 DPT=9105 SEQ=2386692259 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0F63420000000001030307)
Nov 28 04:17:24 localhost python3.9[127644]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 04:17:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54125 DF PROTO=TCP SPT=51150 DPT=9105 SEQ=2386692259 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0F73030000000001030307)
Nov 28 04:17:26 localhost python3.9[127889]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 04:17:26 localhost python3.9[127979]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 04:17:27 localhost python3.9[128073]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 04:17:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41582 DF PROTO=TCP SPT=57622 DPT=9101 SEQ=1880801936 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0F7E1A0000000001030307)
Nov 28 04:17:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41584 DF PROTO=TCP SPT=57622 DPT=9101 SEQ=1880801936 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0F8A430000000001030307)
Nov 28 04:17:31 localhost python3.9[128167]: ansible-ansible.legacy.dnf Invoked with name=['openstack-network-scripts'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 04:17:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64797 DF PROTO=TCP SPT=55252 DPT=9102 SEQ=2816659087 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0F93020000000001030307)
Nov 28 04:17:35 localhost python3.9[128261]: ansible-ansible.builtin.systemd Invoked with enabled=True name=network daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 28 04:17:35 localhost systemd[1]: Reloading.
Nov 28 04:17:35 localhost systemd-sysv-generator[128292]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 04:17:35 localhost systemd-rc-local-generator[128288]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 04:17:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 04:17:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29988 DF PROTO=TCP SPT=38016 DPT=9100 SEQ=1368347695 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0F9E830000000001030307)
Nov 28 04:17:38 localhost python3.9[128393]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 04:17:39 localhost python3.9[128485]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:17:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33299 DF PROTO=TCP SPT=38448 DPT=9100 SEQ=3436919252 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0FA9820000000001030307)
Nov 28 04:17:39 localhost python3.9[128579]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:17:40 localhost python3.9[128671]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:17:41 localhost python3.9[128763]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:17:41 localhost python3.9[128836]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321460.8359594-563-258500151359875/.source _original_basename=.69igfolw follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:17:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29990 DF PROTO=TCP SPT=38016 DPT=9100 SEQ=1368347695 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0FB6420000000001030307)
Nov 28 04:17:42 localhost python3.9[128928]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:17:43 localhost python3.9[129020]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Nov 28 04:17:44 localhost python3.9[129112]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:17:45 localhost python3.9[129204]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/config.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:17:45 localhost python3.9[129277]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/os-net-config/config.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321464.721-689-136042312256638/.source.yaml _original_basename=.w35bxh1u follow=False checksum=0cadac3cfc033a4e07cfac59b43f6459e787700a force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:17:46 localhost python3.9[129369]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Nov 28 04:17:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22453 DF PROTO=TCP SPT=44742 DPT=9882 SEQ=4064839962 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0FC8C20000000001030307)
Nov 28 04:17:47 localhost ansible-async_wrapper.py[129474]: Invoked with j621656087058 300 /home/zuul/.ansible/tmp/ansible-tmp-1764321467.0727656-761-150858680704529/AnsiballZ_edpm_os_net_config.py _
Nov 28 04:17:47 localhost ansible-async_wrapper.py[129477]: Starting module and watcher
Nov 28 04:17:47 localhost ansible-async_wrapper.py[129477]: Start watching 129478 (300)
Nov 28 04:17:47 localhost ansible-async_wrapper.py[129478]: Start module (129478)
Nov 28 04:17:47 localhost ansible-async_wrapper.py[129474]: Return async_wrapper task started.
Nov 28 04:17:48 localhost python3.9[129479]: ansible-edpm_os_net_config Invoked with cleanup=False config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=False
Nov 28 04:17:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49278 DF PROTO=TCP SPT=56494 DPT=9105 SEQ=4039797680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0FCC5B0000000001030307)
Nov 28 04:17:48 localhost ansible-async_wrapper.py[129478]: Module complete (129478)
Nov 28 04:17:51 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49280 DF PROTO=TCP SPT=56494 DPT=9105 SEQ=4039797680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0FD8820000000001030307)
Nov 28 04:17:51 localhost python3.9[129571]: ansible-ansible.legacy.async_status Invoked with jid=j621656087058.129474 mode=status _async_dir=/root/.ansible_async
Nov 28 04:17:52 localhost python3.9[129630]: ansible-ansible.legacy.async_status Invoked with jid=j621656087058.129474 mode=cleanup _async_dir=/root/.ansible_async
Nov 28 04:17:52 localhost python3.9[129722]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:17:52 localhost ansible-async_wrapper.py[129477]: Done in kid B.
Nov 28 04:17:53 localhost python3.9[129795]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321472.3655045-827-274069601578939/.source.returncode _original_basename=.eqmqbu20 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:17:54 localhost python3.9[129887]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:17:54 localhost python3.9[129960]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321473.5790632-875-17894140427829/.source.cfg _original_basename=.phx59nse follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:17:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49281 DF PROTO=TCP SPT=56494 DPT=9105 SEQ=4039797680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0FE8420000000001030307)
Nov 28 04:17:55 localhost python3.9[130052]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 04:17:56 localhost systemd[1]: Reloading Network Manager...
Nov 28 04:17:56 localhost NetworkManager[5967]: [1764321476.4809] audit: op="reload" arg="0" pid=130056 uid=0 result="success"
Nov 28 04:17:56 localhost NetworkManager[5967]: [1764321476.4824] config: signal: SIGHUP (no changes from disk)
Nov 28 04:17:56 localhost systemd[1]: Reloaded Network Manager.
Nov 28 04:17:56 localhost systemd-logind[764]: Session 40 logged out. Waiting for processes to exit.
Nov 28 04:17:56 localhost systemd[1]: session-40.scope: Deactivated successfully.
Nov 28 04:17:56 localhost systemd[1]: session-40.scope: Consumed 34.968s CPU time.
Nov 28 04:17:56 localhost systemd-logind[764]: Removed session 40.
Nov 28 04:17:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22179 DF PROTO=TCP SPT=51420 DPT=9101 SEQ=1333309435 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0FF3490000000001030307)
Nov 28 04:18:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22181 DF PROTO=TCP SPT=51420 DPT=9101 SEQ=1333309435 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB0FFF420000000001030307)
Nov 28 04:18:02 localhost sshd[130071]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 04:18:02 localhost systemd-logind[764]: New session 41 of user zuul.
Nov 28 04:18:02 localhost systemd[1]: Started Session 41 of User zuul.
Nov 28 04:18:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38310 DF PROTO=TCP SPT=54804 DPT=9102 SEQ=1841594550 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1008420000000001030307)
Nov 28 04:18:03 localhost python3.9[130164]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 04:18:04 localhost python3.9[130258]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 04:18:06 localhost python3.9[130411]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 04:18:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1101 DF PROTO=TCP SPT=60774 DPT=9102 SEQ=635022861 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1013830000000001030307)
Nov 28 04:18:06 localhost systemd[1]: session-41.scope: Deactivated successfully.
Nov 28 04:18:06 localhost systemd[1]: session-41.scope: Consumed 2.074s CPU time.
Nov 28 04:18:06 localhost systemd-logind[764]: Session 41 logged out. Waiting for processes to exit.
Nov 28 04:18:06 localhost systemd-logind[764]: Removed session 41.
Nov 28 04:18:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55544 DF PROTO=TCP SPT=42570 DPT=9100 SEQ=476428184 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB101F820000000001030307)
Nov 28 04:18:11 localhost sshd[130427]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 04:18:12 localhost systemd-logind[764]: New session 42 of user zuul.
Nov 28 04:18:12 localhost systemd[1]: Started Session 42 of User zuul.
Nov 28 04:18:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36527 DF PROTO=TCP SPT=37356 DPT=9100 SEQ=3976809844 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB102B820000000001030307)
Nov 28 04:18:13 localhost python3.9[130550]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 04:18:13 localhost python3.9[130688]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 04:18:15 localhost python3.9[130820]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 04:18:15 localhost python3.9[130874]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 04:18:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57216 DF PROTO=TCP SPT=41028 DPT=9882 SEQ=160539914 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB103E020000000001030307)
Nov 28 04:18:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11109 DF PROTO=TCP SPT=43810 DPT=9105 SEQ=2250170345 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1045830000000001030307)
Nov 28 04:18:19 localhost python3.9[130983]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 04:18:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11110 DF PROTO=TCP SPT=43810 DPT=9105 SEQ=2250170345 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB104D820000000001030307)
Nov 28 04:18:21 localhost python3.9[131138]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:18:22 localhost python3.9[131230]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 04:18:22 localhost python3.9[131335]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:18:23 localhost python3.9[131383]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:18:24 localhost python3.9[131475]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:18:24 localhost python3.9[131523]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 04:18:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11111 DF PROTO=TCP SPT=43810 DPT=9105 SEQ=2250170345 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB105D430000000001030307)
Nov 28 04:18:25 localhost python3.9[131615]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 04:18:26 localhost python3.9[131707]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 04:18:26 localhost python3.9[131799]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 04:18:27 localhost python3.9[131891]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 04:18:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55113 DF PROTO=TCP SPT=56848 DPT=9101 SEQ=2674059226 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1068790000000001030307)
Nov 28 04:18:28 localhost python3.9[131983]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 04:18:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55115 DF PROTO=TCP SPT=56848 DPT=9101 SEQ=2674059226 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1074820000000001030307)
Nov 28 04:18:31 localhost sshd[132000]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 04:18:31 localhost sshd[132008]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 04:18:32 localhost python3.9[132079]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 04:18:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59513 DF PROTO=TCP SPT=39556 DPT=9102 SEQ=1558159202 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT
(020405500402080ACB107D830000000001030307) Nov 28 04:18:33 localhost python3.9[132173]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:18:34 localhost python3.9[132265]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:18:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 28 04:18:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5400.1 total, 600.0 interval#012Cumulative writes: 4899 writes, 22K keys, 4899 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4899 writes, 608 syncs, 8.06 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 28 04:18:35 localhost python3.9[132357]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:18:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41586 DF PROTO=TCP SPT=39660 DPT=9100 SEQ=739763758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1089020000000001030307) Nov 28 04:18:36 localhost python3.9[132450]: ansible-service_facts Invoked Nov 28 04:18:36 localhost network[132467]: You 
are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 28 04:18:36 localhost network[132468]: 'network-scripts' will be removed from distribution in near future. Nov 28 04:18:36 localhost network[132469]: It is advised to switch to 'NetworkManager' instead for network management. Nov 28 04:18:37 localhost sshd[132486]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:18:38 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:18:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59515 DF PROTO=TCP SPT=39556 DPT=9102 SEQ=1558159202 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1095420000000001030307) Nov 28 04:18:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 28 04:18:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5400.3 total, 600.0 interval#012Cumulative writes: 5616 writes, 25K keys, 5616 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5616 writes, 758 syncs, 7.41 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 28 04:18:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41588 DF PROTO=TCP SPT=39660 DPT=9100 SEQ=739763758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080ACB10A0C30000000001030307) Nov 28 04:18:45 localhost python3.9[132792]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 28 04:18:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14866 DF PROTO=TCP SPT=33380 DPT=9882 SEQ=1888493366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB10B3420000000001030307) Nov 28 04:18:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34761 DF PROTO=TCP SPT=48630 DPT=9105 SEQ=2335367779 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB10B6BB0000000001030307) Nov 28 04:18:50 localhost python3.9[132886]: ansible-package_facts Invoked with manager=['auto'] strategy=first Nov 28 04:18:51 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34763 DF PROTO=TCP SPT=48630 DPT=9105 SEQ=2335367779 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB10C2C30000000001030307) Nov 28 04:18:51 localhost python3.9[132978]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False 
Nov 28 04:18:52 localhost python3.9[133053]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321531.035931-656-190740213154570/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:18:52 localhost python3.9[133147]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:18:53 localhost python3.9[133222]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321532.5170507-701-187470642978333/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:18:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34764 DF PROTO=TCP SPT=48630 DPT=9105 SEQ=2335367779 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB10D2820000000001030307) Nov 28 04:18:55 localhost python3.9[133316]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None 
validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:18:56 localhost python3.9[133410]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 28 04:18:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1090 DF PROTO=TCP SPT=40788 DPT=9101 SEQ=3176365671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB10DDA90000000001030307) Nov 28 04:18:58 localhost python3.9[133464]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:19:00 localhost python3.9[133558]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 28 04:19:01 localhost auditd[725]: Audit daemon rotating log files Nov 28 04:19:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1092 DF PROTO=TCP SPT=40788 DPT=9101 SEQ=3176365671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB10E9C30000000001030307) Nov 28 04:19:01 localhost python3.9[133612]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 28 04:19:01 localhost chronyd[26085]: chronyd exiting Nov 28 04:19:01 localhost systemd[1]: Stopping NTP client/server... Nov 28 04:19:01 localhost systemd[1]: chronyd.service: Deactivated successfully. Nov 28 04:19:01 localhost systemd[1]: Stopped NTP client/server. 
Nov 28 04:19:01 localhost systemd[1]: Starting NTP client/server... Nov 28 04:19:01 localhost chronyd[133620]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG) Nov 28 04:19:01 localhost chronyd[133620]: Frequency -30.844 +/- 0.497 ppm read from /var/lib/chrony/drift Nov 28 04:19:01 localhost chronyd[133620]: Loaded seccomp filter (level 2) Nov 28 04:19:01 localhost systemd[1]: Started NTP client/server. Nov 28 04:19:01 localhost systemd[1]: session-42.scope: Deactivated successfully. Nov 28 04:19:01 localhost systemd[1]: session-42.scope: Consumed 28.330s CPU time. Nov 28 04:19:01 localhost systemd-logind[764]: Session 42 logged out. Waiting for processes to exit. Nov 28 04:19:01 localhost systemd-logind[764]: Removed session 42. Nov 28 04:19:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3513 DF PROTO=TCP SPT=41230 DPT=9102 SEQ=2610679703 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB10F2820000000001030307) Nov 28 04:19:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38315 DF PROTO=TCP SPT=54804 DPT=9102 SEQ=1841594550 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB10FD830000000001030307) Nov 28 04:19:07 localhost sshd[133636]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:19:07 localhost systemd-logind[764]: New session 43 of user zuul. Nov 28 04:19:07 localhost systemd[1]: Started Session 43 of User zuul. 
Nov 28 04:19:08 localhost python3.9[133729]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 28 04:19:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36530 DF PROTO=TCP SPT=37356 DPT=9100 SEQ=3976809844 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1109820000000001030307) Nov 28 04:19:10 localhost python3.9[133825]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:19:10 localhost python3.9[133930]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:19:11 localhost python3.9[133978]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.2j_vxb6s recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:19:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11548 DF PROTO=TCP SPT=60450 DPT=9100 SEQ=4174456282 ACK=0 WINDOW=32640 
RES=0x00 SYN URGP=0 OPT (020405500402080ACB1116020000000001030307) Nov 28 04:19:12 localhost python3.9[134070]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:19:13 localhost python3.9[134145]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321552.1577518-143-233066384468499/.source _original_basename=.86_hfc7z follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:19:14 localhost python3.9[134237]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:19:14 localhost python3.9[134329]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:19:15 localhost python3.9[134402]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321554.3577225-215-101002390405577/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False 
unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 28 04:19:16 localhost python3.9[134494]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:19:16 localhost python3.9[134567]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321555.5790176-215-65413931635646/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 28 04:19:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50708 DF PROTO=TCP SPT=53558 DPT=9882 SEQ=4101534268 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1128820000000001030307) Nov 28 04:19:17 localhost python3.9[134659]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:19:18 localhost python3.9[134807]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False 
checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:19:18 localhost python3.9[134917]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321557.631082-326-17041010042634/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:19:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29552 DF PROTO=TCP SPT=36232 DPT=9105 SEQ=3194225600 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1130030000000001030307) Nov 28 04:19:19 localhost podman[135017]: Nov 28 04:19:19 localhost podman[135017]: 2025-11-28 09:19:19.298149209 +0000 UTC m=+0.083396103 container create 2243db928cb8758362702ee34cc3c653bda1e715a69c2c4b18805c93270daf67 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_snyder, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, architecture=x86_64, name=rhceph, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 28 04:19:19 localhost systemd[1]: Started libpod-conmon-2243db928cb8758362702ee34cc3c653bda1e715a69c2c4b18805c93270daf67.scope. Nov 28 04:19:19 localhost podman[135017]: 2025-11-28 09:19:19.262269379 +0000 UTC m=+0.047516303 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:19:19 localhost systemd[1]: Started libcrun container. Nov 28 04:19:19 localhost podman[135017]: 2025-11-28 09:19:19.380178608 +0000 UTC m=+0.165425492 container init 2243db928cb8758362702ee34cc3c653bda1e715a69c2c4b18805c93270daf67 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_snyder, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.buildah.version=1.33.12, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., RELEASE=main, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, ceph=True, architecture=x86_64, build-date=2025-09-24T08:57:55, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=) Nov 28 04:19:19 localhost 
podman[135017]: 2025-11-28 09:19:19.39210484 +0000 UTC m=+0.177351724 container start 2243db928cb8758362702ee34cc3c653bda1e715a69c2c4b18805c93270daf67 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_snyder, distribution-scope=public, GIT_BRANCH=main, name=rhceph, version=7, architecture=x86_64, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, maintainer=Guillaume Abrioux , ceph=True, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, release=553, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc.) 
Nov 28 04:19:19 localhost podman[135017]: 2025-11-28 09:19:19.392537233 +0000 UTC m=+0.177784107 container attach 2243db928cb8758362702ee34cc3c653bda1e715a69c2c4b18805c93270daf67 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_snyder, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, RELEASE=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, architecture=x86_64, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , release=553, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, GIT_BRANCH=main, vcs-type=git, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, ceph=True, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=) Nov 28 04:19:19 localhost optimistic_snyder[135063]: 167 167 Nov 28 04:19:19 localhost systemd[1]: libpod-2243db928cb8758362702ee34cc3c653bda1e715a69c2c4b18805c93270daf67.scope: Deactivated successfully. 
Nov 28 04:19:19 localhost podman[135017]: 2025-11-28 09:19:19.395338271 +0000 UTC m=+0.180585175 container died 2243db928cb8758362702ee34cc3c653bda1e715a69c2c4b18805c93270daf67 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_snyder, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, distribution-scope=public, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, release=553, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, ceph=True, vcs-type=git) Nov 28 04:19:19 localhost podman[135070]: 2025-11-28 09:19:19.506002194 +0000 UTC m=+0.093442286 container remove 2243db928cb8758362702ee34cc3c653bda1e715a69c2c4b18805c93270daf67 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_snyder, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, release=553, build-date=2025-09-24T08:57:55, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, name=rhceph, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, vcs-type=git, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 28 04:19:19 localhost systemd[1]: libpod-conmon-2243db928cb8758362702ee34cc3c653bda1e715a69c2c4b18805c93270daf67.scope: Deactivated successfully. Nov 28 04:19:19 localhost python3.9[135068]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:19:19 localhost podman[135102]: Nov 28 04:19:19 localhost podman[135102]: 2025-11-28 09:19:19.731520339 +0000 UTC m=+0.073019039 container create c38276a3e9b8f2b0ca5ae16a876280ec5001167f36802775a2ce4814145b2e79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_hamilton, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, version=7, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, architecture=x86_64, release=553, 
maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, RELEASE=main, build-date=2025-09-24T08:57:55, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=) Nov 28 04:19:19 localhost systemd[1]: Started libpod-conmon-c38276a3e9b8f2b0ca5ae16a876280ec5001167f36802775a2ce4814145b2e79.scope. Nov 28 04:19:19 localhost systemd[1]: Started libcrun container. Nov 28 04:19:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/67f0ca1719e64b291c1616d6e829851871af4278f7df2287f450e9eae4e3b1e2/merged/rootfs supports timestamps until 2038 (0x7fffffff) Nov 28 04:19:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/67f0ca1719e64b291c1616d6e829851871af4278f7df2287f450e9eae4e3b1e2/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 28 04:19:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/67f0ca1719e64b291c1616d6e829851871af4278f7df2287f450e9eae4e3b1e2/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 28 04:19:19 localhost podman[135102]: 2025-11-28 09:19:19.799594564 +0000 UTC m=+0.141093314 container init c38276a3e9b8f2b0ca5ae16a876280ec5001167f36802775a2ce4814145b2e79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_hamilton, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, description=Red Hat Ceph Storage 7, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , 
GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, distribution-scope=public, RELEASE=main, architecture=x86_64, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, io.openshift.expose-services=) Nov 28 04:19:19 localhost podman[135102]: 2025-11-28 09:19:19.704958771 +0000 UTC m=+0.046457511 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:19:19 localhost podman[135102]: 2025-11-28 09:19:19.810888086 +0000 UTC m=+0.152386786 container start c38276a3e9b8f2b0ca5ae16a876280ec5001167f36802775a2ce4814145b2e79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_hamilton, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.openshift.tags=rhceph ceph, release=553, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public) Nov 28 04:19:19 localhost podman[135102]: 
2025-11-28 09:19:19.811178525 +0000 UTC m=+0.152677225 container attach c38276a3e9b8f2b0ca5ae16a876280ec5001167f36802775a2ce4814145b2e79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_hamilton, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, version=7, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, ceph=True, vcs-type=git, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, architecture=x86_64, io.buildah.version=1.33.12, release=553, io.openshift.expose-services=) Nov 28 04:19:20 localhost python3.9[135185]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321559.0707934-371-1603773660681/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:19:20 localhost systemd[1]: var-lib-containers-storage-overlay-d9969c749ffddba729ff682fdacf5b8212a9e2f3075e8d8892007d3e32ddb2df-merged.mount: Deactivated 
successfully.
Nov 28 04:19:20 localhost friendly_hamilton[135148]: [
Nov 28 04:19:20 localhost friendly_hamilton[135148]: {
Nov 28 04:19:20 localhost friendly_hamilton[135148]: "available": false,
Nov 28 04:19:20 localhost friendly_hamilton[135148]: "ceph_device": false,
Nov 28 04:19:20 localhost friendly_hamilton[135148]: "device_id": "QEMU_DVD-ROM_QM00001",
Nov 28 04:19:20 localhost friendly_hamilton[135148]: "lsm_data": {},
Nov 28 04:19:20 localhost friendly_hamilton[135148]: "lvs": [],
Nov 28 04:19:20 localhost friendly_hamilton[135148]: "path": "/dev/sr0",
Nov 28 04:19:20 localhost friendly_hamilton[135148]: "rejected_reasons": [
Nov 28 04:19:20 localhost friendly_hamilton[135148]: "Has a FileSystem",
Nov 28 04:19:20 localhost friendly_hamilton[135148]: "Insufficient space (<5GB)"
Nov 28 04:19:20 localhost friendly_hamilton[135148]: ],
Nov 28 04:19:20 localhost friendly_hamilton[135148]: "sys_api": {
Nov 28 04:19:20 localhost friendly_hamilton[135148]: "actuators": null,
Nov 28 04:19:20 localhost friendly_hamilton[135148]: "device_nodes": "sr0",
Nov 28 04:19:20 localhost friendly_hamilton[135148]: "human_readable_size": "482.00 KB",
Nov 28 04:19:20 localhost friendly_hamilton[135148]: "id_bus": "ata",
Nov 28 04:19:20 localhost friendly_hamilton[135148]: "model": "QEMU DVD-ROM",
Nov 28 04:19:20 localhost friendly_hamilton[135148]: "nr_requests": "2",
Nov 28 04:19:20 localhost friendly_hamilton[135148]: "partitions": {},
Nov 28 04:19:20 localhost friendly_hamilton[135148]: "path": "/dev/sr0",
Nov 28 04:19:20 localhost friendly_hamilton[135148]: "removable": "1",
Nov 28 04:19:20 localhost friendly_hamilton[135148]: "rev": "2.5+",
Nov 28 04:19:20 localhost friendly_hamilton[135148]: "ro": "0",
Nov 28 04:19:20 localhost friendly_hamilton[135148]: "rotational": "1",
Nov 28 04:19:20 localhost friendly_hamilton[135148]: "sas_address": "",
Nov 28 04:19:20 localhost friendly_hamilton[135148]: "sas_device_handle": "",
Nov 28 04:19:20 localhost friendly_hamilton[135148]: "scheduler_mode": "mq-deadline",
Nov 28 04:19:20 localhost friendly_hamilton[135148]: "sectors": 0,
Nov 28 04:19:20 localhost friendly_hamilton[135148]: "sectorsize": "2048",
Nov 28 04:19:20 localhost friendly_hamilton[135148]: "size": 493568.0,
Nov 28 04:19:20 localhost friendly_hamilton[135148]: "support_discard": "0",
Nov 28 04:19:20 localhost friendly_hamilton[135148]: "type": "disk",
Nov 28 04:19:20 localhost friendly_hamilton[135148]: "vendor": "QEMU"
Nov 28 04:19:20 localhost friendly_hamilton[135148]: }
Nov 28 04:19:20 localhost friendly_hamilton[135148]: }
Nov 28 04:19:20 localhost friendly_hamilton[135148]: ]
Nov 28 04:19:20 localhost systemd[1]: libpod-c38276a3e9b8f2b0ca5ae16a876280ec5001167f36802775a2ce4814145b2e79.scope: Deactivated successfully.
Nov 28 04:19:20 localhost podman[135102]: 2025-11-28 09:19:20.726164992 +0000 UTC m=+1.067663742 container died c38276a3e9b8f2b0ca5ae16a876280ec5001167f36802775a2ce4814145b2e79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_hamilton, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, GIT_CLEAN=True, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, ceph=True, name=rhceph, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, version=7, RELEASE=main, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=,
GIT_BRANCH=main, maintainer=Guillaume Abrioux ) Nov 28 04:19:20 localhost systemd[1]: tmp-crun.7vOIXL.mount: Deactivated successfully. Nov 28 04:19:20 localhost systemd[1]: var-lib-containers-storage-overlay-67f0ca1719e64b291c1616d6e829851871af4278f7df2287f450e9eae4e3b1e2-merged.mount: Deactivated successfully. Nov 28 04:19:20 localhost podman[136796]: 2025-11-28 09:19:20.831407266 +0000 UTC m=+0.091675912 container remove c38276a3e9b8f2b0ca5ae16a876280ec5001167f36802775a2ce4814145b2e79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_hamilton, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, maintainer=Guillaume Abrioux , GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, name=rhceph, vcs-type=git, architecture=x86_64, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, version=7, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, distribution-scope=public) Nov 28 04:19:20 localhost systemd[1]: libpod-conmon-c38276a3e9b8f2b0ca5ae16a876280ec5001167f36802775a2ce4814145b2e79.scope: Deactivated successfully. 
Nov 28 04:19:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29553 DF PROTO=TCP SPT=36232 DPT=9105 SEQ=3194225600 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1138020000000001030307) Nov 28 04:19:21 localhost python3.9[136855]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:19:21 localhost systemd[1]: Reloading. Nov 28 04:19:21 localhost systemd-rc-local-generator[136895]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:19:21 localhost systemd-sysv-generator[136899]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:19:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:19:21 localhost systemd[1]: Reloading. Nov 28 04:19:21 localhost systemd-sysv-generator[136938]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:19:21 localhost systemd-rc-local-generator[136935]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:19:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:19:21 localhost systemd[1]: Starting EDPM Container Shutdown... 
Nov 28 04:19:21 localhost systemd[1]: Finished EDPM Container Shutdown. Nov 28 04:19:22 localhost python3.9[137038]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:19:23 localhost python3.9[137111]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321562.01919-440-46142071843002/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:19:23 localhost python3.9[137203]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:19:24 localhost python3.9[137276]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321563.3719065-485-137940842376703/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:19:25 localhost python3.9[137368]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None 
masked=None Nov 28 04:19:25 localhost systemd[1]: Reloading. Nov 28 04:19:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29554 DF PROTO=TCP SPT=36232 DPT=9105 SEQ=3194225600 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1147C30000000001030307) Nov 28 04:19:25 localhost systemd-sysv-generator[137399]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:19:25 localhost systemd-rc-local-generator[137394]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:19:25 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:19:25 localhost systemd[1]: Starting Create netns directory... Nov 28 04:19:25 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Nov 28 04:19:25 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Nov 28 04:19:25 localhost systemd[1]: Finished Create netns directory. Nov 28 04:19:26 localhost python3.9[137500]: ansible-ansible.builtin.service_facts Invoked Nov 28 04:19:26 localhost network[137517]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 28 04:19:26 localhost network[137518]: 'network-scripts' will be removed from distribution in near future. Nov 28 04:19:26 localhost network[137519]: It is advised to switch to 'NetworkManager' instead for network management. Nov 28 04:19:27 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 28 04:19:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13627 DF PROTO=TCP SPT=36168 DPT=9101 SEQ=1442700734 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1152D90000000001030307) Nov 28 04:19:30 localhost python3.9[137720]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:19:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13629 DF PROTO=TCP SPT=36168 DPT=9101 SEQ=1442700734 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB115EC20000000001030307) Nov 28 04:19:31 localhost python3.9[137795]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321570.0929236-608-184580113764586/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:19:32 localhost python3.9[137888]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 28 04:19:32 localhost systemd[1]: Reloading OpenSSH server daemon... Nov 28 04:19:32 localhost systemd[1]: Reloaded OpenSSH server daemon. 
Nov 28 04:19:32 localhost sshd[117359]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:19:32 localhost python3.9[137984]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:19:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29555 DF PROTO=TCP SPT=36232 DPT=9105 SEQ=3194225600 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1167820000000001030307) Nov 28 04:19:33 localhost python3.9[138076]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:19:34 localhost python3.9[138149]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321573.0202067-701-120072531886176/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:19:35 localhost python3.9[138241]: ansible-community.general.timezone Invoked with name=UTC hwclock=None Nov 28 04:19:35 localhost systemd[1]: Starting Time & Date Service... Nov 28 04:19:35 localhost systemd[1]: Started Time & Date Service. 
Nov 28 04:19:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5521 DF PROTO=TCP SPT=34726 DPT=9100 SEQ=4115780546 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1173430000000001030307) Nov 28 04:19:37 localhost python3.9[138337]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:19:38 localhost python3.9[138429]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:19:38 localhost python3.9[138502]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321577.7063465-806-100933902431769/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:19:39 localhost python3.9[138594]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:19:39 localhost kernel: DROPPING: IN=br-ex OUT= 
MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41591 DF PROTO=TCP SPT=39660 DPT=9100 SEQ=739763758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB117F820000000001030307) Nov 28 04:19:39 localhost python3.9[138667]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321578.9847374-851-225080342505182/.source.yaml _original_basename=._9jqw2ll follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:19:40 localhost python3.9[138759]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:19:41 localhost python3.9[138834]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321580.2831042-896-191108226772632/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:19:42 localhost python3.9[138926]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:19:42 localhost kernel: DROPPING: 
IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5523 DF PROTO=TCP SPT=34726 DPT=9100 SEQ=4115780546 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB118B030000000001030307) Nov 28 04:19:42 localhost python3.9[139019]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:19:43 localhost python3[139112]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall Nov 28 04:19:44 localhost python3.9[139204]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:19:45 localhost python3.9[139277]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321584.1865373-1013-11462905983491/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:19:46 localhost python3.9[139369]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:19:46 localhost python3.9[139442]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root 
src=/home/zuul/.ansible/tmp/ansible-tmp-1764321585.5141056-1058-249363852328996/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:19:47 localhost python3.9[139534]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:19:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56612 DF PROTO=TCP SPT=36132 DPT=9102 SEQ=3743218911 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB119F820000000001030307) Nov 28 04:19:47 localhost python3.9[139607]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321586.9067307-1103-128874515078201/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:19:48 localhost python3.9[139699]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:19:49 localhost python3.9[139772]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root 
src=/home/zuul/.ansible/tmp/ansible-tmp-1764321588.1701167-1148-136979814319947/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:19:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29556 DF PROTO=TCP SPT=36232 DPT=9105 SEQ=3194225600 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB11A7820000000001030307) Nov 28 04:19:50 localhost python3.9[139864]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:19:50 localhost python3.9[139937]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321589.483295-1193-185498845146240/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:19:51 localhost python3.9[140029]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:19:52 
localhost python3.9[140121]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:19:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34767 DF PROTO=TCP SPT=48630 DPT=9105 SEQ=2335367779 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB11B1820000000001030307) Nov 28 04:19:52 localhost python3.9[140216]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:19:53 localhost python3.9[140309]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:19:54 localhost python3.9[140401]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul 
path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:19:55 localhost python3.9[140493]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None Nov 28 04:19:55 localhost python3.9[140586]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None Nov 28 04:19:56 localhost systemd-logind[764]: Session 43 logged out. Waiting for processes to exit. Nov 28 04:19:56 localhost systemd[1]: session-43.scope: Deactivated successfully. Nov 28 04:19:56 localhost systemd[1]: session-43.scope: Consumed 27.744s CPU time. Nov 28 04:19:56 localhost systemd-logind[764]: Removed session 43. 
Nov 28 04:19:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43308 DF PROTO=TCP SPT=41988 DPT=9101 SEQ=2314623050 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB11C8090000000001030307) Nov 28 04:20:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1096 DF PROTO=TCP SPT=40788 DPT=9101 SEQ=3176365671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB11D7820000000001030307) Nov 28 04:20:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16425 DF PROTO=TCP SPT=60498 DPT=9102 SEQ=1550634953 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB11D8F40000000001030307) Nov 28 04:20:02 localhost sshd[140603]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:20:03 localhost systemd-logind[764]: New session 44 of user zuul. Nov 28 04:20:03 localhost systemd[1]: Started Session 44 of User zuul. Nov 28 04:20:03 localhost python3.9[140698]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. 
suffix= path=None Nov 28 04:20:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=932 DF PROTO=TCP SPT=59560 DPT=9100 SEQ=409018828 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB11E4810000000001030307) Nov 28 04:20:05 localhost python3.9[140790]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:20:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3518 DF PROTO=TCP SPT=41230 DPT=9102 SEQ=2610679703 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB11E9820000000001030307) Nov 28 04:20:06 localhost systemd[1]: systemd-timedated.service: Deactivated successfully. 
Nov 28 04:20:06 localhost python3.9[140884]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts Nov 28 04:20:08 localhost python3.9[140979]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.x7_26xdp follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:20:08 localhost python3.9[141054]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.x7_26xdp mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321607.5966923-189-262347571873459/.source.x7_26xdp _original_basename=.ye43wm4w follow=False checksum=37b6ce2b006ecd64876d6796769d1ed663c9f074 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:20:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11551 DF PROTO=TCP SPT=60450 DPT=9100 SEQ=4174456282 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB11F3820000000001030307) Nov 28 04:20:11 localhost python3.9[141146]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 28 04:20:12 localhost python3.9[141238]: ansible-ansible.builtin.blockinfile Invoked with block=np0005538513.localdomain,192.168.122.106,np0005538513* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQCToHi/c1OL/UxMWy2v/t0tcvSlMeoKa6EPBYbcu51p2Gn2UxEPgCRLM9+84Smh2pxAR4Y/5LVm2lbZ9Gf4okHGg5GLIyqzxxqbQHyR+YRljujVEOvksUPuKCptzx9fQj2Ij2t9GPGHc5klgGPIKjx0pza8T37vdz+G9y7zuK5wWI66AeN8y/6dD2hvi1Lp94VRSvTTEo+nUOFSIgsOwqQO+ZSwTgjG1pmtESBe8nkhW0I0BQPX46v9f1PN1LXDg8cN2FSVjQ91RI0uCvTaBYJ3soFBFspgiJ113zapbQCaNwg7lK7ofS0QT5WONP3QIsDAq1gSpWuOdS2DRY4NU3WMd4m5tLbj+ubiWr39rNU/zQiEl8r38aiM0OwOfuQ9S8wxO7phpVCQrbOkYCLLijdy/xTODvP+jYohTMWX8Gh6IVeVtm6SB2Tw3lDBCjpqlclCSs905Xe+mTJ6WYTaz+Q1xgflKEeemzJ0+rt+QZbrmL7u5MUdf/l/yOLAgACNsws=#012np0005538513.localdomain,192.168.122.106,np0005538513* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAcFP+DjLmcEEAm8Lwvxl6FPIO6oOWnH/RhIcXcMqT1F#012np0005538513.localdomain,192.168.122.106,np0005538513* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBCKBYRInRUdTiZ6KYKN+DMW+w3dTbv2b2ZRO5doLdo2BjNWxCzSevWq4Ptdwg4i7AwfVsH37MVU5ijvc8yJB7o=#012np0005538512.localdomain,192.168.122.105,np0005538512* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCy9/gxqH+eMqafXwUuPf+1Clpw4qsugdFefisnCDhJ5U7Pc+eWMUQVMS0ErxabBJhneDOyPwXwIbv72cEAtmgfvHDlSuS3mt8LRzKqsv1dXTy4Zqb3JGVzrvxo0iczGRsn2MIDJUv/Zjq9YqVeCnDj2HOwV+qx+EFecEFXS797FxsnMmTw0A5z8yUtBuJEGAKQX96LpZc4k5ltq+Uy0rK85Kk7cGR4A+wrIChLC8wggxvA99NdPEBtne6Chb+3PcbYUcTGhGtV6FGzpgbWmuWT/gcANb+fJE5/4n87loLmBMsmvGhvQuN9kuJ20g6nwPJbPTpIbV6XALx4tbma68bL3RL+lcGlh3jf0pEXPfolrB/MRmJn5ggMLjRv50FrowQalnCEgWE0gtd9IGjmqFz3jP008bGotn9rcacbjC2AvE+5NEjp7TzXGnFcD6jW8+9AWiusCww4ULs/oWbi0GLkmhwU5EifitDYF2+r1CigAdlEjb6sa0wAQSmclWk6guM=#012np0005538512.localdomain,192.168.122.105,np0005538512* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIK28mLCPVbXy0OXsvv/yFemdmkq0TouDg2F8iIBtrFNP#012np0005538512.localdomain,192.168.122.105,np0005538512* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOAaBZ7v0nx9ZqEqgPbFZS0ak6RTWK6bkXL/jWgEJnhpVMoiRYOxmcwlW3qCW0ftaWYgMItu1j7anWibS+umVXI=#012np0005538511.localdomain,192.168.122.104,np0005538511* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQDqtmgm0KAOOIJ7a8whlZPfasnwJpfcm6zVmQjiKHZZrcojE/a6oALfufKXbfWWiLjJ2VzyK9v7QPNXhIWxgAKT9J40A1lSpSAmmxMaWvy+hzzvePs0Z4Fc4bFX7V4zBGI+dAJ+eAu73z6OKNuMhxBrL46ejpRFbqjwBP3veWRiLOMbyPn+Wc+amop0p1eEzV2QHMAIC5Dwm6/tYNLixNSa/Ea0ciaY3jWii+IGhYy+wqQP+9qkoVf9bZ4Ewa+7UfXI/q4zvvic/Znb8ZpCpezLnH4ilBORLyV9r/wkkkVGY7UVgUdSoLVjzTGQAtHl2ZgA3zJ2F2ES9QcBEvrHygT4vGgtEaxQn8XFhBwhzCpPaLyXti/6d+8M36cJx+7gv1eEfDgLz3MNR+tcnFSew9N6dIN4afV0DvA/9FsWk8PTqddN4iHcZzRo0GiDJWNtB+gYVZOytTYMZm2Cyv59IthEzxaB+wTZoSdCeuEeTM0ohYspOKirIPqMPuCbGbtrJFE=#012np0005538511.localdomain,192.168.122.104,np0005538511* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOw3UAOk5rmRZZUABN/csr2bxG0kPuwFOfnLWM0dbphK#012np0005538511.localdomain,192.168.122.104,np0005538511* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOporAXIBWakUq++II8S8bptvpP8um9hXQ1t0EGSEC6CKLIa5aENxiSz3hPWhpfOMIda2pAiC8tHJ/ctg1cA7bI=#012np0005538515.localdomain,192.168.122.108,np0005538515* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDbwc/gBZF5hmsFU6BSK1/DT5hduj2+3ukzoCGLU6mgpBv7BiInN7vVOqXilL+QUAWOvfKTekaQe1Vv/2jpygQnlu6MEMopmac/36IfVjgt39zxCULfSWv3Gp8tLP0ATF2LfhHBWFrGX7G3Bg3AiNfIUnQIQadBaKIByl+FfA7nJ7phwBAwJaQxvByGDeMwC2CWIUPgVqKclcw1WmldPnNmwquLlCbAeMV2hHlBfnVk8BI6fsOUcBB6a05zRpJpbrl584F+qkiQX0RpZYJQdZCoLiJStJv39lYhgiAWChUOVJsCbeNQnC9/Xgs5JhmRESgXh7Tm+8UNW9DxSHN7BS5qKYPUULdjobSp2v9pFOx30MLMsNd5r3JE07pgm5PpjuviSGEvJ8DIAPTF3kUXM43wax1q9rGV4ZfoJiLAwS9CmWWDWZDg17cnC5z+3qi+K8HUKz8LxQCHI+yEtTFzUEYyXTQfQbNvkauEHI/PwFA1iC+4/2g/0UhtjkM+FO2Czwk=#012np0005538515.localdomain,192.168.122.108,np0005538515* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIPVAkJQTOfLnB4ufl+yfJWTOwj/+yeZMYj9KPcqQhG41#012np0005538515.localdomain,192.168.122.108,np0005538515* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJxSQcYu8iH02KDWynHrNs+wu90XfG3ktCJ/ydvMFl7Khrh5CImI23f+XeJr4A7okpxJw7hhtVd+bcWjM/VGibU=#012np0005538510.localdomain,192.168.122.103,np0005538510* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQDAxqgPHnyGChl6yd1/HRo8ox+w8llSVhIj8iYUdDG7IquyLr4/CZguZzRkngbXi/Dq544iKS4kFL/zPKi+yuxeFs4b6fgo4vGoV8wwKNSJXx0d0hOQa9651VqB6k/trENRTgLa2fHkXgF+/g0f7HvloQfhr7qjhTBRV4l4UfJiOEpMvMxN6map/D0JuHlAZGZ5mGUoBTEMuPGEPvMWqe0kc/I8WIgsMsvijOGM2xDxsOqAYlV9a8faoyMdacWUNkeQTfPF6h+z8xdvP8qWPtrPKWHMpcGicTI6pFZ2JxOjWnMaBXs2j/CN7HFLbyOCwuAvAu9efAbxJvgtZlO++6kSlq7SHMzwv7PLP69GaQJHR+jANJ/O2BchbxL09mIkpFSzLSS0k7xXJlwqnAMciIlTaud2n5Hqnnb06WgtvD6O0nnuCLH5am7F1YDGJJgUmNbbgF1PuwzOZqQy+tA2igji/n2z87KkGZdIbrHdPU1PPIlzVGPO6aO02RhvtD+/iQM=#012np0005538510.localdomain,192.168.122.103,np0005538510* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIO316T0CvGWuEUZtluJgtZ9ZZEUIgwqLNzmYcEgwx90d#012np0005538510.localdomain,192.168.122.103,np0005538510* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFs0shW57fSaFIES4CjKi1hUQjnXLq99+vhyRfpt8xn5+tcCwnrhlVxDAoMMHaxjmVGblslVcZ1lb3oEH51GZuE=#012np0005538514.localdomain,192.168.122.107,np0005538514* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLIqwhlOevSQuHXF0nrkLOzRoSQqnWWb/cXzK4um93clqGujVOE9PUyL6ONBo/qlr4Pp+QMzSsIFwjW1T/6G+Ce2CS/TGphIUxvvB9NhBt+OJl/zDUEmjAU6bwVIx6ApqtimsXWWIap9GEtVWA5P9pcqPMyGzq1mCzwCS252Ylioij0zZxfMrxTt3RSsWrDED61vRes0ZKd8HERTLN+Lzis5t8f74zfwTesOea6CRkIHth4cUP7ua3q+KhhbhIPj+fXWN5w+qVbcTMJSYyUPsZ2ymPhR4x4db1oPk1Jg14dw1BnmAZZl3v8o4l7bUQ2Fj/PE1JbSiApxbK+V0KdZGMrG4iVbnMmzwBXPXHa6lNQGneflVd3MNEepnTnXx4hAVpaJHc8EtIREq8aPe07DW0wL9clpTKaSGU2Ma+BLXmSDPkuPh6JWLxn3iM1yybL574NnGt2MgBj6z2tiSb4NkNmaBkoG8PMHw8YUSabKBBZNiMEO2GKBpHZldSrYvOZHU=#012np0005538514.localdomain,192.168.122.107,np0005538514* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAINhivqz2RYo1kKlRUCCEwVKn/fRbUXKh+9HKcoRBbRik#012np0005538514.localdomain,192.168.122.107,np0005538514* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEj7Mfl3DOkiBgUjao8Ey8r/pUITSMDHIaEViUpgeShgnNz3/omNuAseQqHK6/tA9gN/Uo8Pq1wRSxeBtUVD++U=#012 create=True mode=0644 path=/tmp/ansible.x7_26xdp state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False 
marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:20:14 localhost python3.9[141330]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.x7_26xdp' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:20:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50711 DF PROTO=TCP SPT=53558 DPT=9882 SEQ=4101534268 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1207830000000001030307) Nov 28 04:20:16 localhost python3.9[141424]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.x7_26xdp state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:20:17 localhost systemd[1]: session-44.scope: Deactivated successfully. Nov 28 04:20:17 localhost systemd[1]: session-44.scope: Consumed 4.266s CPU time. Nov 28 04:20:17 localhost systemd-logind[764]: Session 44 logged out. Waiting for processes to exit. Nov 28 04:20:17 localhost systemd-logind[764]: Removed session 44. 
Nov 28 04:20:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17686 DF PROTO=TCP SPT=38134 DPT=9105 SEQ=77650652 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB12164B0000000001030307) Nov 28 04:20:24 localhost sshd[141501]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:20:24 localhost systemd-logind[764]: New session 45 of user zuul. Nov 28 04:20:24 localhost systemd[1]: Started Session 45 of User zuul. Nov 28 04:20:25 localhost python3.9[141609]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 28 04:20:27 localhost python3.9[141705]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None Nov 28 04:20:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23918 DF PROTO=TCP SPT=56768 DPT=9101 SEQ=1563538910 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB123D3A0000000001030307) Nov 28 04:20:29 localhost python3.9[141799]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 28 04:20:30 localhost python3.9[141892]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:20:30 localhost python3.9[141985]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True 
get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:20:31 localhost python3.9[142079]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:20:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36977 DF PROTO=TCP SPT=38326 DPT=9102 SEQ=496899107 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB124E240000000001030307) Nov 28 04:20:32 localhost python3.9[142174]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:20:32 localhost systemd[1]: session-45.scope: Deactivated successfully. Nov 28 04:20:32 localhost systemd[1]: session-45.scope: Consumed 3.892s CPU time. Nov 28 04:20:32 localhost systemd-logind[764]: Session 45 logged out. Waiting for processes to exit. Nov 28 04:20:32 localhost systemd-logind[764]: Removed session 45. 
Nov 28 04:20:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36978 DF PROTO=TCP SPT=38326 DPT=9102 SEQ=496899107 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1252430000000001030307) Nov 28 04:20:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47629 DF PROTO=TCP SPT=46458 DPT=9100 SEQ=1375931586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1259B10000000001030307) Nov 28 04:20:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36979 DF PROTO=TCP SPT=38326 DPT=9102 SEQ=496899107 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB125A420000000001030307) Nov 28 04:20:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47630 DF PROTO=TCP SPT=46458 DPT=9100 SEQ=1375931586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB125DC20000000001030307) Nov 28 04:20:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47631 DF PROTO=TCP SPT=46458 DPT=9100 SEQ=1375931586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1265C20000000001030307) Nov 28 04:20:38 localhost sshd[142189]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:20:38 localhost systemd-logind[764]: New session 46 of user zuul. Nov 28 04:20:38 localhost systemd[1]: Started Session 46 of User zuul. 
Nov 28 04:20:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36980 DF PROTO=TCP SPT=38326 DPT=9102 SEQ=496899107 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB126A020000000001030307) Nov 28 04:20:39 localhost python3.9[142282]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 28 04:20:40 localhost python3.9[142378]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 28 04:20:41 localhost python3.9[142432]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Nov 28 04:20:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47632 DF PROTO=TCP SPT=46458 DPT=9100 SEQ=1375931586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1275820000000001030307) Nov 28 04:20:45 localhost python3.9[142524]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:20:47 localhost python3.9[142617]: 
ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/reboot_required/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:20:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57206 DF PROTO=TCP SPT=35184 DPT=9882 SEQ=3273484680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1288020000000001030307) Nov 28 04:20:47 localhost python3.9[142709]: ansible-ansible.builtin.file Invoked with mode=0600 path=/var/lib/openstack/reboot_required/needs_restarting state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:20:48 localhost python3.9[142801]: ansible-ansible.builtin.lineinfile Invoked with dest=/var/lib/openstack/reboot_required/needs_restarting line=Not root, Subscription Management repositories not updated#012Core libraries or services have been updated since boot-up:#012 * systemd#012#012Reboot is required to fully utilize these updates.#012More information: https://access.redhat.com/solutions/27943 path=/var/lib/openstack/reboot_required/needs_restarting state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:20:49 
localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7445 DF PROTO=TCP SPT=55160 DPT=9105 SEQ=4117917505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB128F820000000001030307) Nov 28 04:20:49 localhost python3.9[142891]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Nov 28 04:20:50 localhost python3.9[142981]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:20:50 localhost python3.9[143073]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:20:51 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7446 DF PROTO=TCP SPT=55160 DPT=9105 SEQ=4117917505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1297820000000001030307) Nov 28 04:20:51 localhost systemd[1]: session-46.scope: Deactivated successfully. Nov 28 04:20:51 localhost systemd[1]: session-46.scope: Consumed 8.706s CPU time. Nov 28 04:20:51 localhost systemd-logind[764]: Session 46 logged out. Waiting for processes to exit. Nov 28 04:20:51 localhost systemd-logind[764]: Removed session 46. 
Nov 28 04:20:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7447 DF PROTO=TCP SPT=55160 DPT=9105 SEQ=4117917505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB12A7430000000001030307) Nov 28 04:20:56 localhost sshd[143088]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:20:57 localhost systemd-logind[764]: New session 47 of user zuul. Nov 28 04:20:57 localhost systemd[1]: Started Session 47 of User zuul. Nov 28 04:20:58 localhost python3.9[143181]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 28 04:20:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7015 DF PROTO=TCP SPT=54504 DPT=9101 SEQ=3538890158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB12B26A0000000001030307) Nov 28 04:21:00 localhost python3.9[143277]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:21:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7017 DF PROTO=TCP SPT=54504 DPT=9101 SEQ=3538890158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB12BE820000000001030307) Nov 28 04:21:01 localhost python3.9[143369]: ansible-ansible.legacy.stat Invoked with 
path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:21:02 localhost python3.9[143442]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321661.1702838-183-142382663671636/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=777d5f6763fde4fea484664803960858c2bba706 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:21:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38609 DF PROTO=TCP SPT=57388 DPT=9102 SEQ=2518440198 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB12C7420000000001030307) Nov 28 04:21:03 localhost python3.9[143534]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-sriov setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:21:04 localhost python3.9[143626]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:21:04 localhost python3.9[143699]: ansible-ansible.legacy.copy Invoked with 
dest=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321663.9007866-258-277568398861764/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=777d5f6763fde4fea484664803960858c2bba706 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:21:05 localhost python3.9[143791]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-dhcp setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:21:06 localhost python3.9[143883]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:21:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2907 DF PROTO=TCP SPT=51904 DPT=9100 SEQ=1746863639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB12D3030000000001030307) Nov 28 04:21:06 localhost python3.9[143956]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321665.795746-332-1941741771696/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=777d5f6763fde4fea484664803960858c2bba706 backup=False force=True remote_src=False 
unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:21:07 localhost python3.9[144048]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 04:21:08 localhost python3.9[144140]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:21:08 localhost python3.9[144213]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321667.7813623-401-260500616199848/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=777d5f6763fde4fea484664803960858c2bba706 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:21:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38611 DF PROTO=TCP SPT=57388 DPT=9102 SEQ=2518440198 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB12DF020000000001030307)
Nov 28 04:21:09 localhost python3.9[144305]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 04:21:10 localhost python3.9[144397]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:21:10 localhost chronyd[133620]: Selected source 23.133.168.247 (pool.ntp.org)
Nov 28 04:21:10 localhost python3.9[144470]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321669.7539186-473-223586156076465/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=777d5f6763fde4fea484664803960858c2bba706 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:21:12 localhost python3.9[144562]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 04:21:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2909 DF PROTO=TCP SPT=51904 DPT=9100 SEQ=1746863639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB12EAC20000000001030307)
Nov 28 04:21:12 localhost python3.9[144654]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:21:13 localhost python3.9[144727]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321672.3269958-547-108216557261270/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=777d5f6763fde4fea484664803960858c2bba706 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:21:14 localhost python3.9[144819]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 04:21:15 localhost python3.9[144911]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:21:15 localhost python3.9[144984]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321674.9151146-625-241708710462876/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=777d5f6763fde4fea484664803960858c2bba706 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:21:16 localhost python3.9[145076]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 04:21:17 localhost python3.9[145168]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:21:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11366 DF PROTO=TCP SPT=50366 DPT=9882 SEQ=1649817095 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB12FD420000000001030307)
Nov 28 04:21:17 localhost python3.9[145241]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321676.6958027-693-251209932070235/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=777d5f6763fde4fea484664803960858c2bba706 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:21:18 localhost systemd-logind[764]: Session 47 logged out. Waiting for processes to exit.
Nov 28 04:21:18 localhost systemd[1]: session-47.scope: Deactivated successfully.
Nov 28 04:21:18 localhost systemd[1]: session-47.scope: Consumed 11.541s CPU time.
Nov 28 04:21:18 localhost systemd-logind[764]: Removed session 47.
Nov 28 04:21:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9355 DF PROTO=TCP SPT=36044 DPT=9105 SEQ=3290678572 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1304C20000000001030307)
Nov 28 04:21:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9356 DF PROTO=TCP SPT=36044 DPT=9105 SEQ=3290678572 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB130CC20000000001030307)
Nov 28 04:21:24 localhost sshd[145256]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 04:21:24 localhost systemd-logind[764]: New session 48 of user zuul.
Nov 28 04:21:24 localhost systemd[1]: Started Session 48 of User zuul.
Nov 28 04:21:25 localhost python3.9[145351]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:21:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9357 DF PROTO=TCP SPT=36044 DPT=9105 SEQ=3290678572 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB131C820000000001030307)
Nov 28 04:21:26 localhost python3.9[145524]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:21:27 localhost python3.9[145630]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321686.172728-62-152621756306967/.source.conf _original_basename=ceph.conf follow=False checksum=e86499341cc75988f759ac10cb7bf332387204b7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:21:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26177 DF PROTO=TCP SPT=43728 DPT=9101 SEQ=465637821 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB13279A0000000001030307)
Nov 28 04:21:28 localhost python3.9[145737]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:21:28 localhost python3.9[145810]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321687.7213748-62-256550331317520/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=98ffd20e3b9db1cae39a950d9da1f69e92796658 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:21:29 localhost systemd[1]: session-48.scope: Deactivated successfully.
Nov 28 04:21:29 localhost systemd[1]: session-48.scope: Consumed 2.408s CPU time.
Nov 28 04:21:29 localhost systemd-logind[764]: Session 48 logged out. Waiting for processes to exit.
Nov 28 04:21:29 localhost systemd-logind[764]: Removed session 48.
Nov 28 04:21:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26179 DF PROTO=TCP SPT=43728 DPT=9101 SEQ=465637821 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1333C20000000001030307)
Nov 28 04:21:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55117 DF PROTO=TCP SPT=57938 DPT=9102 SEQ=386827035 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB133C820000000001030307)
Nov 28 04:21:34 localhost sshd[145825]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 04:21:34 localhost systemd-logind[764]: New session 49 of user zuul.
Nov 28 04:21:34 localhost systemd[1]: Started Session 49 of User zuul.
Nov 28 04:21:35 localhost python3.9[145918]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 04:21:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36983 DF PROTO=TCP SPT=38326 DPT=9102 SEQ=496899107 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1347820000000001030307)
Nov 28 04:21:37 localhost python3.9[146014]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 04:21:38 localhost python3.9[146107]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 04:21:39 localhost python3.9[146197]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 04:21:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47635 DF PROTO=TCP SPT=46458 DPT=9100 SEQ=1375931586 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1353820000000001030307)
Nov 28 04:21:40 localhost python3.9[146289]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 28 04:21:41 localhost python3.9[146381]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 04:21:42 localhost python3.9[146435]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 04:21:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50292 DF PROTO=TCP SPT=53066 DPT=9100 SEQ=1538634092 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB135FC30000000001030307)
Nov 28 04:21:46 localhost python3.9[146529]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 04:21:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28567 DF PROTO=TCP SPT=60740 DPT=9882 SEQ=1022673085 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1372420000000001030307)
Nov 28 04:21:48 localhost python3[146624]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012 rule:#012 proto: udp#012 dport: 4789#012- rule_name: 119 neutron geneve networks#012 rule:#012 proto: udp#012 dport: 6081#012 state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012 rule:#012 proto: udp#012 dport: 6081#012 table: raw#012 chain: OUTPUT#012 jump: NOTRACK#012 action: append#012 state: []#012- rule_name: 121 neutron geneve networks no conntrack#012 rule:#012 proto: udp#012 dport: 6081#012 table: raw#012 chain: PREROUTING#012 jump: NOTRACK#012 action: append#012 state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Nov 28 04:21:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41712 DF PROTO=TCP SPT=53084 DPT=9105 SEQ=4197712296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB137A030000000001030307)
Nov 28 04:21:49 localhost python3.9[146716]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:21:50 localhost python3.9[146808]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:21:51 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41713 DF PROTO=TCP SPT=53084 DPT=9105 SEQ=4197712296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1382030000000001030307)
Nov 28 04:21:51 localhost python3.9[146856]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:21:51 localhost python3.9[146948]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:21:52 localhost python3.9[146996]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.glr7xwbq recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:21:52 localhost python3.9[147088]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:21:53 localhost python3.9[147136]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:21:54 localhost python3.9[147228]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 04:21:55 localhost python3[147321]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 28 04:21:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41714 DF PROTO=TCP SPT=53084 DPT=9105 SEQ=4197712296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1391C20000000001030307)
Nov 28 04:21:55 localhost python3.9[147413]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:21:56 localhost python3.9[147488]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321715.4084322-431-101665468607006/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:21:57 localhost python3.9[147580]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:21:57 localhost python3.9[147655]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321716.769259-476-122576591274181/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:21:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57372 DF PROTO=TCP SPT=39120 DPT=9101 SEQ=487601255 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB139CCA0000000001030307)
Nov 28 04:21:58 localhost python3.9[147747]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:21:59 localhost python3.9[147822]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321718.0005498-521-259838118134816/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:22:00 localhost python3.9[147914]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:22:01 localhost python3.9[147989]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321720.0569258-566-248504590464415/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:22:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57374 DF PROTO=TCP SPT=39120 DPT=9101 SEQ=487601255 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB13A8C20000000001030307)
Nov 28 04:22:03 localhost python3.9[148081]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:22:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41715 DF PROTO=TCP SPT=53084 DPT=9105 SEQ=4197712296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB13B1820000000001030307)
Nov 28 04:22:03 localhost python3.9[148156]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764321722.4914188-611-70940371795726/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:22:04 localhost python3.9[148248]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:22:04 localhost python3.9[148340]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 04:22:05 localhost python3.9[148435]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:22:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25988 DF PROTO=TCP SPT=41342 DPT=9100 SEQ=1338736920 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB13BD420000000001030307)
Nov 28 04:22:06 localhost python3.9[148527]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 04:22:07 localhost python3.9[148620]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 04:22:07 localhost python3.9[148714]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 04:22:08 localhost python3.9[148809]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:22:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2912 DF PROTO=TCP SPT=51904 DPT=9100 SEQ=1746863639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB13C9820000000001030307)
Nov 28 04:22:09 localhost python3.9[148899]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 04:22:10 localhost python3.9[148992]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=np0005538513.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:28:f9:1a:af" external_ids:ovn-encap-ip=172.19.0.106 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 04:22:10 localhost ovs-vsctl[148993]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=np0005538513.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:28:f9:1a:af external_ids:ovn-encap-ip=172.19.0.106 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Nov 28 04:22:11 localhost python3.9[149085]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 04:22:12 localhost python3.9[149178]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 04:22:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25990 DF PROTO=TCP SPT=41342 DPT=9100 SEQ=1338736920 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB13D5020000000001030307)
Nov 28 04:22:13 localhost python3.9[149272]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 04:22:13 localhost python3.9[149364]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:22:14 localhost python3.9[149412]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 04:22:14 localhost python3.9[149504]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:22:15 localhost python3.9[149552]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 04:22:15 localhost python3.9[149644]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:22:16 localhost python3.9[149736]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:22:16 localhost python3.9[149784]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:22:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23834 DF PROTO=TCP SPT=54550 DPT=9882 SEQ=1439506991 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB13E7820000000001030307)
Nov 28 04:22:17 localhost python3.9[149876]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:22:17 localhost python3.9[149924]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:22:18 localhost python3.9[150016]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 04:22:18 localhost systemd[1]: Reloading.
Nov 28 04:22:18 localhost systemd-rc-local-generator[150038]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 04:22:18 localhost systemd-sysv-generator[150043]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 04:22:18 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 04:22:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10603 DF PROTO=TCP SPT=38760 DPT=9105 SEQ=4002833696 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB13EF020000000001030307)
Nov 28 04:22:20 localhost python3.9[150145]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:22:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10604 DF PROTO=TCP SPT=38760 DPT=9105 SEQ=4002833696 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB13F7020000000001030307)
Nov 28 04:22:21 localhost python3.9[150193]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:22:22 localhost python3.9[150285]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:22:23 localhost python3.9[150333]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False
state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:22:24 localhost python3.9[150425]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:22:24 localhost systemd[1]: Reloading. Nov 28 04:22:24 localhost systemd-sysv-generator[150453]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:22:24 localhost systemd-rc-local-generator[150449]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:22:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:22:24 localhost systemd[1]: Starting Create netns directory... Nov 28 04:22:24 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Nov 28 04:22:24 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Nov 28 04:22:24 localhost systemd[1]: Finished Create netns directory. 
Nov 28 04:22:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10605 DF PROTO=TCP SPT=38760 DPT=9105 SEQ=4002833696 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1406C20000000001030307) Nov 28 04:22:25 localhost python3.9[150559]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:22:26 localhost python3.9[150651]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:22:26 localhost python3.9[150724]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321745.6111827-1343-50852746497548/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 28 04:22:27 localhost python3.9[150816]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None 
modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:22:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37497 DF PROTO=TCP SPT=51806 DPT=9101 SEQ=1834210490 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1411FA0000000001030307) Nov 28 04:22:28 localhost python3.9[150938]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:22:28 localhost python3.9[151044]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321747.7073126-1418-738027007607/.source.json _original_basename=.edz0brm4 follow=False checksum=38f75f59f5c2ef6b5da12297bfd31cd1e97012ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:22:29 localhost python3.9[151151]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:22:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37499 DF PROTO=TCP 
SPT=51806 DPT=9101 SEQ=1834210490 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB141E020000000001030307) Nov 28 04:22:32 localhost python3.9[151408]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False Nov 28 04:22:32 localhost python3.9[151500]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 28 04:22:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21783 DF PROTO=TCP SPT=34170 DPT=9102 SEQ=1601837050 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1427020000000001030307) Nov 28 04:22:34 localhost python3.9[151592]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Nov 28 04:22:36 localhost sshd[151634]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:22:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20579 DF PROTO=TCP SPT=59158 DPT=9100 SEQ=703998991 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1432820000000001030307) Nov 28 04:22:38 localhost python3[151713]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False Nov 28 04:22:39 localhost python3[151713]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "52cb1910f3f090372807028d1c2aea98d2557b1086636469529f290368ecdf69",#012 "Digest": "sha256:7ab0ee81fdc9b162df9b50eb2e264c777d08f90975a442620ec940edabe300b2",#012 "RepoTags": [#012 
"quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:7ab0ee81fdc9b162df9b50eb2e264c777d08f90975a442620ec940edabe300b2"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-11-26T06:43:38.999472418Z",#012 "Config": {#012 "User": "root",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251125",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 345745352,#012 "VirtualSize": 345745352,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/d63efe17da859108a09d9b90626ba0c433787abe209cd4ac755f6ba2a5206671/diff:/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/d8443c9fdf039c2367e44e0edbe81c941f30f604c3f1eccc2fc81efb5a97a784/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/d8443c9fdf039c2367e44e0edbe81c941f30f604c3f1eccc2fc81efb5a97a784/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 
"sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",#012 "sha256:1e3477d3ea795ca64b46f28aa9428ba791c4250e0fd05e173a4b9c0fb0bdee23",#012 "sha256:41a433848ac42a81e513766649f77cfa09e37aae045bcbbb33be77f7cf86edc4",#012 "sha256:055d9012b48b3c8064accd40b6372c79c29fedd85061a710ada00677f88b1db9"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251125",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "root",#012 "History": [#012 {#012 "created": "2025-11-25T04:02:36.223494528Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-25T04:02:36.223562059Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251125\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-25T04:02:39.054452717Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-11-26T06:10:57.55004106Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550061231Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550071761Z",#012 
"created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550082711Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550094371Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550104472Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.937139683Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:11:33.845342269Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:11:37.752912815Z",#012 "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-l Nov 28 04:22:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 
TTL=62 ID=50295 DF PROTO=TCP SPT=53066 DPT=9100 SEQ=1538634092 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB143D820000000001030307) Nov 28 04:22:39 localhost podman[151764]: 2025-11-28 09:22:39.267415631 +0000 UTC m=+0.093466262 container remove 3f1db69a4961ac4f02d2401a741c604c853dbea99051bfe1630d231c17e51c9e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_id=tripleo_step4, vcs-type=git, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, container_name=ovn_controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044) Nov 28 04:22:39 localhost python3[151713]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_controller Nov 28 04:22:39 localhost podman[151777]: Nov 28 04:22:39 localhost podman[151777]: 2025-11-28 09:22:39.373524679 +0000 UTC m=+0.086916839 container create 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, 
container_name=ovn_controller) Nov 28 04:22:39 localhost podman[151777]: 2025-11-28 09:22:39.334291437 +0000 UTC m=+0.047683667 image pull quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified Nov 28 04:22:39 localhost python3[151713]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified Nov 28 04:22:40 localhost python3.9[151906]: ansible-ansible.builtin.stat Invoked with 
path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:22:41 localhost python3.9[152000]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:22:41 localhost python3.9[152046]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:22:42 localhost python3.9[152137]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764321761.5425732-1682-228749997265211/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:22:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20581 DF PROTO=TCP SPT=59158 DPT=9100 SEQ=703998991 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB144A420000000001030307) Nov 28 04:22:42 localhost python3.9[152183]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 28 04:22:42 localhost systemd[1]: Reloading. 
Nov 28 04:22:42 localhost systemd-rc-local-generator[152205]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:22:42 localhost systemd-sysv-generator[152209]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:22:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:22:43 localhost python3.9[152265]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:22:43 localhost systemd[1]: Reloading. Nov 28 04:22:43 localhost systemd-sysv-generator[152298]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:22:43 localhost systemd-rc-local-generator[152294]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:22:43 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:22:43 localhost systemd[1]: Starting ovn_controller container... Nov 28 04:22:44 localhost systemd[1]: Started libcrun container. Nov 28 04:22:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8e76290f26abea91a0c30e5a77de17af49be781827908e370bab34dfbcbda46f/merged/run/ovn supports timestamps until 2038 (0x7fffffff) Nov 28 04:22:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. 
Nov 28 04:22:44 localhost podman[152307]: 2025-11-28 09:22:44.156243396 +0000 UTC m=+0.139304869 container init 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Nov 28 04:22:44 localhost ovn_controller[152322]: + sudo -E kolla_set_configs Nov 28 04:22:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. 
Nov 28 04:22:44 localhost podman[152307]: 2025-11-28 09:22:44.185034701 +0000 UTC m=+0.168096124 container start 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 28 04:22:44 localhost edpm-start-podman-container[152307]: ovn_controller Nov 28 04:22:44 localhost systemd[1]: Created slice User Slice of UID 0. Nov 28 04:22:44 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Nov 28 04:22:44 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Nov 28 04:22:44 localhost systemd[1]: Starting User Manager for UID 0... 
Nov 28 04:22:44 localhost podman[152329]: 2025-11-28 09:22:44.288340193 +0000 UTC m=+0.098373323 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, container_name=ovn_controller, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 04:22:44 localhost podman[152329]: 2025-11-28 09:22:44.332249059 +0000 UTC m=+0.142282149 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, 
tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 28 04:22:44 localhost edpm-start-podman-container[152306]: Creating additional drop-in dependency for "ovn_controller" (9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c) Nov 28 04:22:44 localhost podman[152329]: unhealthy Nov 28 04:22:44 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:22:44 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Failed with result 'exit-code'. Nov 28 04:22:44 localhost systemd[1]: Reloading. Nov 28 04:22:44 localhost systemd[152352]: Queued start job for default target Main User Target. Nov 28 04:22:44 localhost systemd[152352]: Created slice User Application Slice. Nov 28 04:22:44 localhost systemd-journald[47227]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation. 
Nov 28 04:22:44 localhost systemd-journald[47227]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating. Nov 28 04:22:44 localhost systemd[152352]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Nov 28 04:22:44 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 28 04:22:44 localhost systemd[152352]: Started Daily Cleanup of User's Temporary Directories. Nov 28 04:22:44 localhost systemd[152352]: Reached target Paths. Nov 28 04:22:44 localhost systemd[152352]: Reached target Timers. Nov 28 04:22:44 localhost systemd[152352]: Starting D-Bus User Message Bus Socket... Nov 28 04:22:44 localhost systemd[152352]: Starting Create User's Volatile Files and Directories... Nov 28 04:22:44 localhost systemd[152352]: Listening on D-Bus User Message Bus Socket. Nov 28 04:22:44 localhost systemd[152352]: Reached target Sockets. Nov 28 04:22:44 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 28 04:22:44 localhost systemd[152352]: Finished Create User's Volatile Files and Directories. Nov 28 04:22:44 localhost systemd[152352]: Reached target Basic System. Nov 28 04:22:44 localhost systemd[152352]: Reached target Main User Target. Nov 28 04:22:44 localhost systemd[152352]: Startup finished in 138ms. Nov 28 04:22:44 localhost systemd-rc-local-generator[152407]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:22:44 localhost systemd-sysv-generator[152414]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 28 04:22:44 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:22:44 localhost systemd[1]: Started User Manager for UID 0. Nov 28 04:22:44 localhost systemd[1]: Started ovn_controller container. Nov 28 04:22:44 localhost systemd[1]: Started Session c12 of User root. Nov 28 04:22:44 localhost ovn_controller[152322]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Nov 28 04:22:44 localhost ovn_controller[152322]: INFO:__main__:Validating config file Nov 28 04:22:44 localhost ovn_controller[152322]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Nov 28 04:22:44 localhost ovn_controller[152322]: INFO:__main__:Writing out command to execute Nov 28 04:22:44 localhost systemd[1]: session-c12.scope: Deactivated successfully. Nov 28 04:22:44 localhost ovn_controller[152322]: ++ cat /run_command Nov 28 04:22:44 localhost ovn_controller[152322]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock ' Nov 28 04:22:44 localhost ovn_controller[152322]: + ARGS= Nov 28 04:22:44 localhost ovn_controller[152322]: + sudo kolla_copy_cacerts Nov 28 04:22:44 localhost systemd[1]: Started Session c13 of User root. Nov 28 04:22:44 localhost systemd[1]: session-c13.scope: Deactivated successfully. Nov 28 04:22:44 localhost ovn_controller[152322]: + [[ ! -n '' ]] Nov 28 04:22:44 localhost ovn_controller[152322]: + . 
kolla_extend_start Nov 28 04:22:44 localhost ovn_controller[152322]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock ' Nov 28 04:22:44 localhost ovn_controller[152322]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '\''' Nov 28 04:22:44 localhost ovn_controller[152322]: + umask 0022 Nov 28 04:22:44 localhost ovn_controller[152322]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting... Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8] Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00004|main|INFO|OVS IDL reconnected, force recompute. Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00005|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connecting... Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00006|main|INFO|OVNSB IDL reconnected, force recompute. Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00007|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connected Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting... 
Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00011|features|INFO|OVS Feature: ct_flush, state: supported Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00012|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting... Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00013|main|INFO|OVS feature set changed, force recompute. Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00014|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00015|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting... Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00017|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms) Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00018|main|INFO|OVS OpenFlow connection reconnected,force recompute. Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00019|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00020|reconnect|INFO|unix:/run/openvswitch/db.sock: connected Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00021|main|INFO|OVS feature set changed, force recompute. 
Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00022|ovn_bfd|INFO|Disabled BFD on interface ovn-07900d-0 Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00023|ovn_bfd|INFO|Disabled BFD on interface ovn-c3237d-0 Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00024|ovn_bfd|INFO|Disabled BFD on interface ovn-11aa47-0 Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00025|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4 Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00026|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00027|binding|INFO|Claiming lport 09612b07-5142-4b0f-9dab-74bf4403f69f for this chassis. Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00028|binding|INFO|09612b07-5142-4b0f-9dab-74bf4403f69f: Claiming fa:16:3e:f4:fc:6c 192.168.0.142 Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00029|binding|INFO|Removing lport 09612b07-5142-4b0f-9dab-74bf4403f69f ovn-installed in OVS Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting... Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting... 
Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00030|ovn_bfd|INFO|Enabled BFD on interface ovn-07900d-0 Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00031|ovn_bfd|INFO|Enabled BFD on interface ovn-c3237d-0 Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00032|ovn_bfd|INFO|Enabled BFD on interface ovn-11aa47-0 Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00033|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00034|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00035|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00036|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00037|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00038|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 04:22:44 localhost ovn_controller[152322]: 2025-11-28T09:22:44Z|00039|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 04:22:45 localhost ovn_controller[152322]: 
2025-11-28T09:22:45Z|00040|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 04:22:45 localhost python3.9[152523]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:22:45 localhost ovs-vsctl[152524]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload Nov 28 04:22:45 localhost ovn_controller[152322]: 2025-11-28T09:22:45Z|00041|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 04:22:45 localhost ovn_controller[152322]: 2025-11-28T09:22:45Z|00042|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 04:22:46 localhost python3.9[152616]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:22:46 localhost ovs-vsctl[152618]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids Nov 28 04:22:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1580 DF PROTO=TCP SPT=59038 DPT=9882 SEQ=781079388 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB145CC30000000001030307) Nov 28 04:22:47 localhost python3.9[152711]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . 
external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:22:47 localhost ovs-vsctl[152712]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options Nov 28 04:22:48 localhost systemd[1]: session-49.scope: Deactivated successfully. Nov 28 04:22:48 localhost systemd[1]: session-49.scope: Consumed 39.933s CPU time. Nov 28 04:22:48 localhost systemd-logind[764]: Session 49 logged out. Waiting for processes to exit. Nov 28 04:22:48 localhost systemd-logind[764]: Removed session 49. Nov 28 04:22:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17756 DF PROTO=TCP SPT=41240 DPT=9105 SEQ=822122634 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1464420000000001030307) Nov 28 04:22:51 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17757 DF PROTO=TCP SPT=41240 DPT=9105 SEQ=822122634 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB146C430000000001030307) Nov 28 04:22:52 localhost ovn_controller[152322]: 2025-11-28T09:22:52Z|00043|binding|INFO|Setting lport 09612b07-5142-4b0f-9dab-74bf4403f69f ovn-installed in OVS Nov 28 04:22:52 localhost ovn_controller[152322]: 2025-11-28T09:22:52Z|00044|binding|INFO|Setting lport 09612b07-5142-4b0f-9dab-74bf4403f69f up in Southbound Nov 28 04:22:54 localhost sshd[152727]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:22:54 localhost systemd[1]: Stopping User Manager for UID 0... Nov 28 04:22:54 localhost systemd[152352]: Activating special unit Exit the Session... Nov 28 04:22:54 localhost systemd[152352]: Stopped target Main User Target. 
Nov 28 04:22:54 localhost systemd[152352]: Stopped target Basic System. Nov 28 04:22:54 localhost systemd[152352]: Stopped target Paths. Nov 28 04:22:54 localhost systemd[152352]: Stopped target Sockets. Nov 28 04:22:54 localhost systemd[152352]: Stopped target Timers. Nov 28 04:22:54 localhost systemd[152352]: Stopped Daily Cleanup of User's Temporary Directories. Nov 28 04:22:54 localhost systemd[152352]: Closed D-Bus User Message Bus Socket. Nov 28 04:22:54 localhost systemd[152352]: Stopped Create User's Volatile Files and Directories. Nov 28 04:22:54 localhost systemd[152352]: Removed slice User Application Slice. Nov 28 04:22:54 localhost systemd[152352]: Reached target Shutdown. Nov 28 04:22:54 localhost systemd[152352]: Finished Exit the Session. Nov 28 04:22:54 localhost systemd[152352]: Reached target Exit the Session. Nov 28 04:22:55 localhost systemd-logind[764]: New session 51 of user zuul. Nov 28 04:22:55 localhost systemd[1]: Started Session 51 of User zuul. Nov 28 04:22:55 localhost systemd[1]: user@0.service: Deactivated successfully. Nov 28 04:22:55 localhost systemd[1]: Stopped User Manager for UID 0. Nov 28 04:22:55 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Nov 28 04:22:55 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Nov 28 04:22:55 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Nov 28 04:22:55 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Nov 28 04:22:55 localhost systemd[1]: Removed slice User Slice of UID 0. 
Nov 28 04:22:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17758 DF PROTO=TCP SPT=41240 DPT=9105 SEQ=822122634 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB147C030000000001030307) Nov 28 04:22:56 localhost python3.9[152824]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 28 04:22:57 localhost python3.9[152920]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Nov 28 04:22:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46699 DF PROTO=TCP SPT=42092 DPT=9101 SEQ=2393569372 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB14872A0000000001030307) Nov 28 04:22:58 localhost python3.9[153012]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:22:59 localhost python3.9[153104]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t 
state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:23:00 localhost python3.9[153196]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:23:00 localhost python3.9[153288]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:23:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46701 DF PROTO=TCP SPT=42092 DPT=9101 SEQ=2393569372 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1493420000000001030307) Nov 28 04:23:01 localhost python3.9[153378]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 28 04:23:02 localhost python3.9[153470]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False Nov 28 04:23:03 localhost kernel: DROPPING: IN=br-ex OUT= 
MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17759 DF PROTO=TCP SPT=41240 DPT=9105 SEQ=822122634 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB149B820000000001030307) Nov 28 04:23:03 localhost python3.9[153561]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:23:04 localhost python3.9[153634]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321782.9446158-218-87392814631384/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:23:04 localhost python3.9[153724]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:23:05 localhost python3.9[153797]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321784.3537-263-196086026412499/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:23:06 localhost python3.9[153889]: 
ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 28 04:23:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16218 DF PROTO=TCP SPT=57288 DPT=9102 SEQ=1664019527 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB14A7830000000001030307) Nov 28 04:23:07 localhost python3.9[153943]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 28 04:23:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25993 DF PROTO=TCP SPT=41342 DPT=9100 SEQ=1338736920 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB14B3820000000001030307) Nov 28 04:23:11 localhost python3.9[154037]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Nov 28 04:23:11 localhost python3.9[154130]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:23:12 localhost 
python3.9[154201]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321791.4540122-374-146874500130316/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:23:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19397 DF PROTO=TCP SPT=57738 DPT=9100 SEQ=970466610 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB14BF820000000001030307) Nov 28 04:23:13 localhost python3.9[154291]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:23:13 localhost python3.9[154362]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321792.4888258-374-92163016567378/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:23:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:23:14 localhost python3.9[154452]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:23:14 localhost podman[154453]: 2025-11-28 09:23:14.876362224 +0000 UTC m=+0.105230194 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 28 04:23:14 localhost ovn_controller[152322]: 2025-11-28T09:23:14Z|00045|memory|INFO|16988 kB peak resident set size after 30.1 seconds 
Nov 28 04:23:14 localhost ovn_controller[152322]: 2025-11-28T09:23:14Z|00046|memory|INFO|idl-cells-OVN_Southbound:4028 idl-cells-Open_vSwitch:1045 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:76 lflow-cache-entries-cache-matches:195 lflow-cache-size-KB:289 local_datapath_usage-KB:1 ofctrl_desired_flow_usage-KB:154 ofctrl_installed_flow_usage-KB:111 ofctrl_sb_flow_ref_usage-KB:67 Nov 28 04:23:14 localhost podman[154453]: 2025-11-28 09:23:14.919424098 +0000 UTC m=+0.148292058 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 04:23:14 localhost systemd[1]: 
9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. Nov 28 04:23:15 localhost python3.9[154548]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321794.4513798-506-46654658540954/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=aa9e89725fbcebf7a5c773d7b97083445b7b7759 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:23:15 localhost python3.9[154638]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:23:16 localhost python3.9[154709]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321795.5571332-506-117245659445644/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=979187b925479d81d0609f4188e5b95fe1f92c18 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:23:17 localhost python3.9[154799]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:23:17 localhost kernel: DROPPING: IN=br-ex OUT= 
MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56445 DF PROTO=TCP SPT=54420 DPT=9882 SEQ=1838728032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB14D2030000000001030307) Nov 28 04:23:17 localhost python3.9[154893]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:23:19 localhost python3.9[154985]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:23:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18393 DF PROTO=TCP SPT=54340 DPT=9105 SEQ=1226643617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB14D9820000000001030307) Nov 28 04:23:19 localhost python3.9[155033]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:23:20 localhost python3.9[155125]: ansible-ansible.legacy.stat Invoked with 
path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:23:20 localhost python3.9[155173]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:23:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18394 DF PROTO=TCP SPT=54340 DPT=9105 SEQ=1226643617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB14E1820000000001030307) Nov 28 04:23:21 localhost python3.9[155265]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:23:22 localhost python3.9[155357]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:23:22 localhost ovn_controller[152322]: 2025-11-28T09:23:22Z|00047|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory Nov 28 04:23:23 localhost python3.9[155405]: 
ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:23:23 localhost python3.9[155497]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:23:24 localhost python3.9[155545]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:23:24 localhost python3.9[155637]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:23:24 localhost systemd[1]: Reloading. Nov 28 04:23:25 localhost systemd-rc-local-generator[155658]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:23:25 localhost systemd-sysv-generator[155665]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:23:25 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:23:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18395 DF PROTO=TCP SPT=54340 DPT=9105 SEQ=1226643617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB14F1420000000001030307) Nov 28 04:23:25 localhost python3.9[155767]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:23:26 localhost python3.9[155815]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:23:27 localhost python3.9[155907]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:23:27 localhost python3.9[155955]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file 
path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:23:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37344 DF PROTO=TCP SPT=49172 DPT=9101 SEQ=1056236817 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB14FC5A0000000001030307) Nov 28 04:23:28 localhost python3.9[156047]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:23:28 localhost systemd[1]: Reloading. Nov 28 04:23:28 localhost systemd-rc-local-generator[156073]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:23:28 localhost systemd-sysv-generator[156076]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:23:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:23:28 localhost systemd[1]: Starting Create netns directory... Nov 28 04:23:28 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Nov 28 04:23:28 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Nov 28 04:23:28 localhost systemd[1]: Finished Create netns directory. 
Nov 28 04:23:29 localhost python3.9[156211]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:23:30 localhost systemd[1]: tmp-crun.NQYsSb.mount: Deactivated successfully. Nov 28 04:23:30 localhost podman[156365]: 2025-11-28 09:23:30.172984062 +0000 UTC m=+0.111030158 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, vcs-type=git, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, release=553, distribution-scope=public, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., RELEASE=main, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, name=rhceph, GIT_BRANCH=main, version=7) Nov 28 04:23:30 localhost python3.9[156386]: ansible-ansible.legacy.stat Invoked with 
path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:23:30 localhost podman[156365]: 2025-11-28 09:23:30.314660051 +0000 UTC m=+0.252706197 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, description=Red Hat Ceph Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., version=7, architecture=x86_64, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, ceph=True, RELEASE=main, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 28 04:23:30 localhost python3.9[156532]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764321809.8248074-959-142873168532102/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER 
validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 28 04:23:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37346 DF PROTO=TCP SPT=49172 DPT=9101 SEQ=1056236817 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1508420000000001030307) Nov 28 04:23:32 localhost python3.9[156682]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:23:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19342 DF PROTO=TCP SPT=40296 DPT=9102 SEQ=1626653550 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1511430000000001030307) Nov 28 04:23:33 localhost python3.9[156774]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:23:34 localhost python3.9[156849]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764321812.986586-1034-109923243887108/.source.json _original_basename=.w7ks84j_ follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None 
owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:23:34 localhost python3.9[156941]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:23:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61330 DF PROTO=TCP SPT=47258 DPT=9100 SEQ=1542535815 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB151CC20000000001030307) Nov 28 04:23:36 localhost python3.9[157198]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False Nov 28 04:23:37 localhost python3.9[157290]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 28 04:23:38 localhost python3.9[157382]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Nov 28 04:23:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19344 DF PROTO=TCP SPT=40296 DPT=9102 SEQ=1626653550 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1529020000000001030307) Nov 28 04:23:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61332 DF PROTO=TCP SPT=47258 DPT=9100 SEQ=1542535815 
ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1534820000000001030307) Nov 28 04:23:42 localhost python3[157501]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False Nov 28 04:23:43 localhost python3[157501]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071",#012 "Digest": "sha256:2b8255d3a22035616e569dbe22862a2560e15cdaefedae0059a354d558788e1e",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:2b8255d3a22035616e569dbe22862a2560e15cdaefedae0059a354d558788e1e"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-11-26T06:34:14.989876147Z",#012 "Config": {#012 "User": "neutron",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251125",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 784145152,#012 "VirtualSize": 784145152,#012 "GraphDriver": {#012 "Name": "overlay",#012 
"Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/f04f6aa8018da724c9daa5ca37db7cd13477323f1b725eec5dac97862d883048/diff:/var/lib/containers/storage/overlay/47afe78ba3ac18f156703d7ad9e4be64941a9d1bd472a4c2a59f4f2c3531ee35/diff:/var/lib/containers/storage/overlay/f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a/diff:/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/b574f97f279779c52df37c61d993141d596fdb6544fa700fbddd8f35f27a4d3b/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/b574f97f279779c52df37c61d993141d596fdb6544fa700fbddd8f35f27a4d3b/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",#012 "sha256:1e3477d3ea795ca64b46f28aa9428ba791c4250e0fd05e173a4b9c0fb0bdee23",#012 "sha256:c136b33417f134a3b932677bcf7a2df089c29f20eca250129eafd2132d4708bb",#012 "sha256:bc63f71478d9d90db803b468b28e5d9e0268adbace958b608ab10bd0819798bd",#012 "sha256:3277562ff4450bdcd859dd0b0be874b10dd6f3502be711d42aab9ff44a85cf28",#012 "sha256:982219792b3d83fa04ae12d0161dd3b982e7e3ed68293e6c876d50161b73746b"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251125",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "neutron",#012 "History": [#012 {#012 "created": "2025-11-25T04:02:36.223494528Z",#012 
"created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-25T04:02:36.223562059Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251125\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-25T04:02:39.054452717Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-11-26T06:10:57.55004106Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550061231Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550071761Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550082711Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550094371Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550104472Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.937139683Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:11:33.845342269Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set 
/etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf Nov 28 04:23:43 localhost podman[157550]: 2025-11-28 09:23:43.197051557 +0000 UTC m=+0.089198084 container remove 1cfeaaf9a2b799a596e9956cbec7fdb1cf974ded033ccd3b6377c0b6f5457633 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, release=1761123044, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'dfc67f7a8d1f67548a53836c6db3b704'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team) Nov 28 04:23:43 localhost python3[157501]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_metadata_agent Nov 28 04:23:43 localhost podman[157563]: Nov 28 04:23:43 localhost podman[157563]: 2025-11-28 09:23:43.341364658 +0000 UTC m=+0.125884008 container create ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, 
name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_managed=true) Nov 28 04:23:43 localhost podman[157563]: 2025-11-28 09:23:43.260918003 +0000 UTC m=+0.045437363 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Nov 28 04:23:43 localhost python3[157501]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile 
/run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311 --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 04:23:44 localhost python3.9[157687]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 04:23:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 04:23:45 localhost podman[157781]: 2025-11-28 09:23:45.237521862 +0000 UTC m=+0.089090440 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 28 04:23:45 localhost podman[157781]: 2025-11-28 09:23:45.3154679 +0000 UTC m=+0.167036498 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 28 04:23:45 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 04:23:45 localhost python3.9[157782]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:23:45 localhost python3.9[157852]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 04:23:47 localhost python3.9[157943]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764321825.8203-1298-116854091113190/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:23:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26702 DF PROTO=TCP SPT=51334 DPT=9882 SEQ=1327201159 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1547030000000001030307)
Nov 28 04:23:47 localhost python3.9[157989]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 04:23:47 localhost systemd[1]: Reloading.
Nov 28 04:23:47 localhost systemd-rc-local-generator[158012]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 04:23:47 localhost systemd-sysv-generator[158019]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 04:23:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 04:23:48 localhost python3.9[158071]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 04:23:48 localhost systemd[1]: Reloading.
Nov 28 04:23:48 localhost systemd-rc-local-generator[158098]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 04:23:48 localhost systemd-sysv-generator[158104]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 04:23:48 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 04:23:48 localhost systemd[1]: Starting ovn_metadata_agent container...
Nov 28 04:23:49 localhost systemd[1]: tmp-crun.XnaYak.mount: Deactivated successfully.
Nov 28 04:23:49 localhost systemd[1]: Started libcrun container.
Nov 28 04:23:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/51878c7750469bb637905b80c196d06233ea74ae919e9279342c4fa33ab172a0/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 28 04:23:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/51878c7750469bb637905b80c196d06233ea74ae919e9279342c4fa33ab172a0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 04:23:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 04:23:49 localhost podman[158113]: 2025-11-28 09:23:49.068002365 +0000 UTC m=+0.157847267 container init ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: + sudo -E kolla_set_configs
Nov 28 04:23:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 04:23:49 localhost podman[158113]: 2025-11-28 09:23:49.114144649 +0000 UTC m=+0.203989511 container start ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 04:23:49 localhost edpm-start-podman-container[158113]: ovn_metadata_agent
Nov 28 04:23:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41542 DF PROTO=TCP SPT=47128 DPT=9105 SEQ=1499306541 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB154EC20000000001030307)
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: INFO:__main__:Validating config file
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: INFO:__main__:Copying service configuration files
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: INFO:__main__:Writing out command to execute
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/40d5da59-6201-424a-8380-80ecc3d67c7e.pid.haproxy
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/40d5da59-6201-424a-8380-80ecc3d67c7e.conf
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: ++ cat /run_command
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: + CMD=neutron-ovn-metadata-agent
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: + ARGS=
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: + sudo kolla_copy_cacerts
Nov 28 04:23:49 localhost podman[158133]: 2025-11-28 09:23:49.200518493 +0000 UTC m=+0.084914233 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=starting, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 28 04:23:49 localhost podman[158133]: 2025-11-28 09:23:49.205649563 +0000 UTC m=+0.090045233 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Nov 28 04:23:49 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: + [[ ! -n '' ]]
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: + . kolla_extend_start
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: Running command: 'neutron-ovn-metadata-agent'
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: + umask 0022
Nov 28 04:23:49 localhost ovn_metadata_agent[158125]: + exec neutron-ovn-metadata-agent
Nov 28 04:23:49 localhost edpm-start-podman-container[158112]: Creating additional drop-in dependency for "ovn_metadata_agent" (ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8)
Nov 28 04:23:49 localhost systemd[1]: Reloading.
Nov 28 04:23:49 localhost systemd-sysv-generator[158204]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 04:23:49 localhost systemd-rc-local-generator[158201]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 04:23:49 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 04:23:49 localhost systemd[1]: tmp-crun.FAfUYS.mount: Deactivated successfully.
Nov 28 04:23:49 localhost systemd[1]: Started ovn_metadata_agent container.
Nov 28 04:23:50 localhost systemd-logind[764]: Session 51 logged out. Waiting for processes to exit.
Nov 28 04:23:50 localhost systemd[1]: session-51.scope: Deactivated successfully.
Nov 28 04:23:50 localhost systemd[1]: session-51.scope: Consumed 31.577s CPU time.
Nov 28 04:23:50 localhost systemd-logind[764]: Removed session 51.
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.744 158130 INFO neutron.common.config [-] Logging enabled!#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.744 158130 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.745 158130 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.745 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.745 158130 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.745 158130 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.745 158130 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.745 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.746 158130 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.746 158130 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.746 158130 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.746 158130 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.746 158130 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.746 158130 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.746 158130 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.746 158130 DEBUG neutron.agent.ovn.metadata_agent [-] backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.746 158130 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.746 158130 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.747 158130 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.747 158130 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.747 158130 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.747 158130 DEBUG neutron.agent.ovn.metadata_agent [-] config_file = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.747 158130 DEBUG neutron.agent.ovn.metadata_agent [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.747 158130 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.747 158130 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.747 158130 DEBUG neutron.agent.ovn.metadata_agent [-] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.747 158130 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.747 158130 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.748 158130 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.748 158130 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.748 158130 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.748 158130 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.748 158130 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.748 158130 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.748 158130 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.748 158130 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.748 158130 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.748 158130 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.749 158130 DEBUG neutron.agent.ovn.metadata_agent [-] host = np0005538513.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.749 158130 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.749 158130 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.749 158130 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.749 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.749 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.749 158130 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.749 158130 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.749 158130 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.749 158130 DEBUG neutron.agent.ovn.metadata_agent [-] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.750 158130 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.750 158130 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.750 158130 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.750 158130 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.750 158130 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.750 158130 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.750 158130 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.750 158130 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.750 158130 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.750 158130 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.751 158130 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.751 158130 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.751 158130 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.751 158130 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.751 158130 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.751 158130 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.751 158130 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.751 158130 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.751 158130 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.751 158130 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.752 158130 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.752 158130 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.752 158130 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.752 158130 DEBUG
neutron.agent.ovn.metadata_agent [-] nova_client_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.752 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.752 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.752 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.752 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.752 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.752 158130 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.753 158130 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.753 158130 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval = 40 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.753 158130 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.753 158130 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.753 158130 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.753 158130 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.753 158130 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.753 158130 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.753 158130 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.753 158130 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost 
ovn_metadata_agent[158125]: 2025-11-28 09:23:50.754 158130 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.754 158130 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.754 158130 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.754 158130 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.754 158130 DEBUG neutron.agent.ovn.metadata_agent [-] state_path = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.754 158130 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.754 158130 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.754 158130 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.754 158130 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.754 158130 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.754 158130 DEBUG neutron.agent.ovn.metadata_agent [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.755 158130 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.755 158130 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.755 158130 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.755 158130 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.755 158130 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.755 158130 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.755 158130 DEBUG 
neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.755 158130 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.755 158130 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.755 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.756 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.756 158130 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.756 158130 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.756 158130 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 
09:23:50.756 158130 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.756 158130 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.756 158130 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.756 158130 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.756 158130 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.756 158130 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.757 158130 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.757 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.757 158130 DEBUG neutron.agent.ovn.metadata_agent 
[-] oslo_policy.enforce_scope = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.757 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.757 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.757 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.757 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.757 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.757 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.758 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.758 158130 DEBUG 
neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.758 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.758 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.758 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.758 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.758 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.758 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.758 158130 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 
28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.758 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.759 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.759 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.759 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.759 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.759 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.759 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.759 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 
09:23:50.759 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.759 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.759 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.760 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.760 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.760 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.760 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.760 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 
2025-11-28 09:23:50.760 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.760 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.760 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.760 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.760 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.761 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.761 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.761 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.761 158130 DEBUG 
neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.761 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.761 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.761 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.761 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.761 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.761 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.762 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.762 158130 DEBUG neutron.agent.ovn.metadata_agent [-] 
privsep_link.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.762 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.762 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.762 158130 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.762 158130 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.762 158130 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.762 158130 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.762 158130 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.762 158130 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path = 
/etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.763 158130 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.763 158130 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.763 158130 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.763 158130 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.763 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.763 158130 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.763 158130 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.763 158130 DEBUG 
neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.763 158130 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.763 158130 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.764 158130 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.764 158130 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.764 158130 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.764 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.764 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.764 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.764 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.764 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.764 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.764 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.764 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.765 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.765 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.765 158130 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.765 
158130 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.765 158130 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.765 158130 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.765 158130 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.765 158130 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.765 158130 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.765 158130 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.766 158130 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.766 158130 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.766 158130 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.766 158130 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.766 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.766 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.766 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.766 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.766 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.766 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 
2025-11-28 09:23:50.767 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.767 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.767 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.767 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.767 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.767 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.767 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.767 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.767 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.767 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.768 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.768 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.768 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.768 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.768 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.768 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.768 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost 
ovn_metadata_agent[158125]: 2025-11-28 09:23:50.768 158130 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.768 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.768 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.769 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.769 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.769 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.769 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.769 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.769 
158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.769 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.769 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.769 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.769 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.770 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.770 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.770 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.770 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key = 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.770 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.770 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.770 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.770 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.770 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.770 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.771 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.771 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval = 180 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.771 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.771 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.771 158130 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.771 158130 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.771 158130 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.771 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.771 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.771 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.772 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.772 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.772 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.772 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.772 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.772 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.772 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.772 158130 DEBUG neutron.agent.ovn.metadata_agent [-] 
oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.772 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.772 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.773 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.773 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.773 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.773 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.773 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost 
ovn_metadata_agent[158125]: 2025-11-28 09:23:50.773 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.773 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.773 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.773 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.774 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.774 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.774 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.774 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.774 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.774 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.774 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.774 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.774 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.774 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.775 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.775 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.775 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.775 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.775 158130 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.775 158130 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.826 158130 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.826 158130 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.826 158130 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.826 158130 INFO 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.826 158130 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.842 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name c85299c6-8e38-42c8-8509-2eaaf15c050c (UUID: c85299c6-8e38-42c8-8509-2eaaf15c050c) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.856 158130 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.857 158130 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.857 158130 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.857 158130 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.859 158130 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.862 158130 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 
09:23:50.874 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: PortBindingCreateWithChassis(events=('create',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:fc:6c 192.168.0.142'], port_security=['fa:16:3e:f4:fc:6c 192.168.0.142'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.142/24', 'neutron:device_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005538513.localdomain', 'neutron:mtu': '', 'neutron:network_name': 'neutron-40d5da59-6201-424a-8380-80ecc3d67c7e', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '9dda653c53224db086060962b0702694', 'neutron:revision_number': '7', 'neutron:security_group_ids': '6d2c5a31-c9e5-413a-bccf-f97c7687bd94 b3c60f08-3369-426b-b744-9cef04caaa7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f3122580-f73f-40fa-a838-6bad2ff9da2f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=09612b07-5142-4b0f-9dab-74bf4403f69f) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.874 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'c85299c6-8e38-42c8-8509-2eaaf15c050c'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[], external_ids={'neutron:ovn-metadata-id': '68f92086-2b44-5496-b923-e898b18e44d4', 'neutron:ovn-metadata-sb-cfg': '1'}, 
name=c85299c6-8e38-42c8-8509-2eaaf15c050c, nb_cfg_timestamp=1764321773767, nb_cfg=4) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.875 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 09612b07-5142-4b0f-9dab-74bf4403f69f in datapath 40d5da59-6201-424a-8380-80ecc3d67c7e bound to our chassis on insert#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.875 158130 DEBUG neutron_lib.callbacks.manager [-] Subscribe: > process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.876 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.876 158130 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.876 158130 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.876 158130 INFO oslo_service.service [-] Starting 1 workers#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.879 158130 DEBUG oslo_service.service [-] Started child 158228 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.881 158130 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 40d5da59-6201-424a-8380-80ecc3d67c7e#033[00m Nov 28 04:23:50 localhost 
ovn_metadata_agent[158125]: 2025-11-28 09:23:50.883 158130 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmp5o39znh6/privsep.sock']#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.883 158228 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-938909'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.914 158228 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.914 158228 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.914 158228 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.917 158228 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.919 158228 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected#033[00m Nov 28 04:23:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:50.934 158228 INFO eventlet.wsgi.server [-] (158228) wsgi starting up on 
http:/var/lib/neutron/metadata_proxy#033[00m Nov 28 04:23:51 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41543 DF PROTO=TCP SPT=47128 DPT=9105 SEQ=1499306541 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1556C20000000001030307) Nov 28 04:23:51 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:51.522 158130 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m Nov 28 04:23:51 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:51.523 158130 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp5o39znh6/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m Nov 28 04:23:51 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:51.420 158233 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Nov 28 04:23:51 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:51.426 158233 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Nov 28 04:23:51 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:51.430 158233 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m Nov 28 04:23:51 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:51.430 158233 INFO oslo.privsep.daemon [-] privsep daemon running as pid 158233#033[00m Nov 28 04:23:51 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:51.527 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[fc30ef09-60d6-4096-93ef-f4f2e7dcfede]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:23:51 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:51.939 158233 DEBUG oslo_concurrency.lockutils [-] Acquiring lock 
"context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:23:51 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:51.939 158233 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:23:51 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:51.939 158233 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:23:52 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:52.386 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[5834f15f-82c4-4260-a200-4368ee9e96ce]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:23:52 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:52.388 158130 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpw1frvsfb/privsep.sock']#033[00m Nov 28 04:23:52 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:52.958 158130 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m Nov 28 04:23:52 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:52.959 158130 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpw1frvsfb/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m Nov 28 04:23:52 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:52.852 158244 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Nov 28 04:23:52 
localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:52.859 158244 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Nov 28 04:23:52 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:52.863 158244 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m Nov 28 04:23:52 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:52.863 158244 INFO oslo.privsep.daemon [-] privsep daemon running as pid 158244#033[00m Nov 28 04:23:52 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:52.962 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[ab96dbf2-acfe-432c-abee-ed751524657a]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:23:53 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:53.369 158244 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:23:53 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:53.369 158244 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:23:53 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:53.369 158244 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:23:53 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:53.824 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[4bca20db-92d9-4951-ac75-93c5bcdc5bf2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:23:53 localhost ovn_metadata_agent[158125]: 
2025-11-28 09:23:53.827 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[5a6d563e-0729-423e-8c23-5929ca2e50ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:23:53 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:53.847 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[d31dfd4b-d810-4361-880d-827cd6d6bab9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:23:53 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:53.864 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[b8d32f5d-eab5-4e5c-b4c5-1c99b295e0b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap40d5da59-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:28:4d:05'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 68, 'rx_bytes': 8926, 'tx_bytes': 7143, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 
104, 'tx_packets': 68, 'rx_bytes': 8926, 'tx_bytes': 7143, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483664], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662048, 'reachable_time': 36277, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 
'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 17, 'outoctets': 1164, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 17, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 1164, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 17, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 158254, 'error': None, 'target': 'ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:23:53 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:53.881 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[3b2c7a7f-a7b4-41b7-8415-5f2b42567c76]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap40d5da59-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 662058, 'tstamp': 662058}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 158255, 'error': None, 'target': 'ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e', 'stats': (0, 0, 
0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '192.168.0.2'], ['IFA_LOCAL', '192.168.0.2'], ['IFA_BROADCAST', '192.168.0.255'], ['IFA_LABEL', 'tap40d5da59-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 662060, 'tstamp': 662060}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 158255, 'error': None, 'target': 'ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::a9fe:a9fe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 662064, 'tstamp': 662064}], ['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 158255, 'error': None, 'target': 'ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe28:4d05'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 662048, 'tstamp': 662048}], ['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 158255, 'error': None, 'target': 'ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:23:53 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:53.940 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[73dbde93-01db-4b03-9d28-bbec95e16c00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:23:53 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:53.941 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 
command(idx=0): DelPortCommand(_result=None, port=tap40d5da59-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 04:23:53 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:53.947 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap40d5da59-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 04:23:53 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:53.947 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 28 04:23:53 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:53.948 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap40d5da59-60, col_values=(('external_ids', {'iface-id': '3ff57c88-06c6-4894-984a-80ce116d1456'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 04:23:53 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:53.948 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 28 04:23:53 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:53.952 158130 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp42n1wdi2/privsep.sock']#033[00m Nov 28 04:23:54 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:54.534 158130 INFO oslo.privsep.daemon [-] Spawned new 
privsep daemon via rootwrap#033[00m Nov 28 04:23:54 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:54.535 158130 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp42n1wdi2/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m Nov 28 04:23:54 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:54.442 158264 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Nov 28 04:23:54 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:54.447 158264 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Nov 28 04:23:54 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:54.450 158264 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m Nov 28 04:23:54 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:54.451 158264 INFO oslo.privsep.daemon [-] privsep daemon running as pid 158264#033[00m Nov 28 04:23:54 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:54.537 158264 DEBUG oslo.privsep.daemon [-] privsep: reply[76a75132-fcb3-41f8-b8fa-318b80958263]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:23:54 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:54.974 158264 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:23:54 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:54.974 158264 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:23:54 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:54.974 158264 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by 
"neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:23:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41544 DF PROTO=TCP SPT=47128 DPT=9105 SEQ=1499306541 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1566830000000001030307) Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.409 158264 DEBUG oslo.privsep.daemon [-] privsep: reply[8ed73b80-1d33-48af-bf4a-d01dc14a1d28]: (4, ['ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e']) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.412 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, column=external_ids, values=({'neutron:ovn-metadata-id': '68f92086-2b44-5496-b923-e898b18e44d4'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.413 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.414 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 
09:23:55.430 158130 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.430 158130 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.430 158130 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.430 158130 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.430 158130 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.430 158130 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.431 158130 DEBUG oslo_service.service [-] agent_down_time = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.431 158130 DEBUG oslo_service.service [-] allow_bulk = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.431 158130 DEBUG oslo_service.service [-] api_extensions_path = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.432 158130 DEBUG oslo_service.service [-] api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.432 158130 DEBUG oslo_service.service [-] api_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.432 158130 DEBUG oslo_service.service [-] auth_ca_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.432 158130 DEBUG oslo_service.service [-] auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.432 158130 DEBUG oslo_service.service [-] backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.433 158130 DEBUG oslo_service.service [-] base_mac = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.433 158130 DEBUG oslo_service.service [-] bind_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.433 158130 DEBUG oslo_service.service [-] bind_port = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.434 158130 DEBUG oslo_service.service [-] client_socket_timeout = 900 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.434 158130 DEBUG oslo_service.service [-] config_dir = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.434 158130 DEBUG oslo_service.service [-] config_file = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.434 158130 DEBUG oslo_service.service [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.435 158130 DEBUG oslo_service.service [-] control_exchange = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.435 158130 DEBUG oslo_service.service [-] core_plugin = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.435 158130 DEBUG oslo_service.service [-] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.435 158130 DEBUG oslo_service.service [-] default_availability_zones = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.436 158130 DEBUG oslo_service.service [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 
'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.436 158130 DEBUG oslo_service.service [-] dhcp_agent_notification = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.436 158130 DEBUG oslo_service.service [-] dhcp_lease_duration = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.436 158130 DEBUG oslo_service.service [-] dhcp_load_type = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.437 158130 DEBUG oslo_service.service [-] dns_domain = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.437 158130 DEBUG oslo_service.service [-] enable_new_agents = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.437 158130 DEBUG oslo_service.service [-] enable_traditional_dhcp = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.437 158130 DEBUG oslo_service.service [-] external_dns_driver = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.438 158130 DEBUG oslo_service.service [-] external_pids = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.438 158130 DEBUG oslo_service.service [-] filter_validation = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.438 158130 DEBUG oslo_service.service [-] global_physnet_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.438 158130 DEBUG oslo_service.service [-] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.439 158130 DEBUG oslo_service.service [-] host = np0005538513.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.439 158130 DEBUG oslo_service.service [-] http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.439 158130 DEBUG oslo_service.service [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.439 158130 DEBUG oslo_service.service [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.439 158130 DEBUG 
oslo_service.service [-] ipam_driver = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.440 158130 DEBUG oslo_service.service [-] ipv6_pd_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.440 158130 DEBUG oslo_service.service [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.440 158130 DEBUG oslo_service.service [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.440 158130 DEBUG oslo_service.service [-] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.440 158130 DEBUG oslo_service.service [-] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.441 158130 DEBUG oslo_service.service [-] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.441 158130 DEBUG oslo_service.service [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.441 158130 DEBUG oslo_service.service [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.441 158130 DEBUG oslo_service.service [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.442 158130 DEBUG oslo_service.service [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.442 158130 DEBUG oslo_service.service [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.442 158130 DEBUG oslo_service.service [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.442 158130 DEBUG oslo_service.service [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.442 158130 DEBUG oslo_service.service [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.443 158130 DEBUG oslo_service.service [-] max_dns_nameservers = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.443 158130 DEBUG oslo_service.service [-] max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.443 158130 DEBUG oslo_service.service [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.443 158130 DEBUG oslo_service.service [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.443 158130 DEBUG oslo_service.service [-] max_subnet_host_routes = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.444 158130 DEBUG oslo_service.service [-] metadata_backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.444 158130 DEBUG oslo_service.service [-] metadata_proxy_group = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.444 158130 DEBUG oslo_service.service [-] metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.444 158130 DEBUG oslo_service.service [-] metadata_proxy_socket = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.445 158130 DEBUG oslo_service.service [-] metadata_proxy_socket_mode = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.445 158130 DEBUG oslo_service.service [-] metadata_proxy_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.445 158130 DEBUG oslo_service.service [-] metadata_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.445 158130 DEBUG oslo_service.service [-] network_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.446 158130 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.446 158130 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.446 158130 DEBUG oslo_service.service [-] nova_client_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.446 158130 DEBUG oslo_service.service [-] nova_client_priv_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.446 158130 DEBUG oslo_service.service [-] nova_metadata_host = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.447 158130 DEBUG oslo_service.service [-] nova_metadata_insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.447 158130 DEBUG oslo_service.service [-] nova_metadata_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.447 158130 DEBUG oslo_service.service [-] nova_metadata_protocol = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.447 158130 DEBUG oslo_service.service [-] pagination_max_limit = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.448 158130 DEBUG oslo_service.service [-] periodic_fuzzy_delay = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.448 158130 DEBUG oslo_service.service [-] periodic_interval = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.448 158130 DEBUG oslo_service.service [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.448 158130 DEBUG oslo_service.service [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.448 158130 DEBUG oslo_service.service [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.449 158130 DEBUG oslo_service.service [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.449 158130 DEBUG oslo_service.service [-] retry_until_window = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.449 158130 DEBUG oslo_service.service [-] rpc_resources_processing_step = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.449 158130 DEBUG oslo_service.service [-] rpc_response_max_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.449 158130 DEBUG oslo_service.service [-] rpc_state_report_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.450 158130 DEBUG oslo_service.service [-] rpc_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.450 158130 DEBUG oslo_service.service [-] send_events_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.450 158130 DEBUG oslo_service.service [-] service_plugins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.450 158130 DEBUG oslo_service.service [-] setproctitle = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.451 158130 DEBUG oslo_service.service [-] state_path = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.451 158130 DEBUG oslo_service.service [-] syslog_log_facility = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.451 158130 DEBUG oslo_service.service [-] tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.451 158130 DEBUG oslo_service.service [-] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.451 158130 DEBUG oslo_service.service [-] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.452 158130 DEBUG oslo_service.service [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.452 158130 DEBUG oslo_service.service [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.452 158130 DEBUG oslo_service.service [-] use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.452 158130 DEBUG oslo_service.service [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.452 158130 DEBUG oslo_service.service [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.453 158130 DEBUG oslo_service.service [-] vlan_transparent = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.453 158130 DEBUG oslo_service.service [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.453 158130 DEBUG oslo_service.service [-] wsgi_default_pool_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.453 158130 DEBUG oslo_service.service [-] wsgi_keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.453 158130 DEBUG oslo_service.service [-] wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.454 158130 DEBUG oslo_service.service [-] wsgi_server_debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.454 158130 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.454 158130 DEBUG oslo_service.service [-] oslo_concurrency.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.454 158130 DEBUG oslo_service.service [-] profiler.connection_string = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.455 158130 DEBUG oslo_service.service [-] profiler.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.455 158130 DEBUG oslo_service.service [-] profiler.es_doc_type = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.455 158130 DEBUG oslo_service.service [-] profiler.es_scroll_size = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.455 158130 DEBUG oslo_service.service [-] profiler.es_scroll_time = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.456 158130 DEBUG oslo_service.service [-] profiler.filter_error_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.456 158130 DEBUG oslo_service.service [-] profiler.hmac_keys = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.456 158130 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.456 158130 DEBUG oslo_service.service [-] profiler.socket_timeout = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.457 158130 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.457 158130 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.457 158130 DEBUG oslo_service.service [-] oslo_policy.enforce_scope = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.457 158130 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.458 158130 DEBUG oslo_service.service [-] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.458 158130 DEBUG oslo_service.service [-] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.458 158130 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.458 158130 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.458 158130 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.459 158130 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.459 158130 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.459 158130 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.460 158130 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.460 158130 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.460 158130 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.460 158130 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.461 158130 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.461 158130 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.461 158130 DEBUG oslo_service.service [-] privsep.capabilities = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.461 158130 DEBUG oslo_service.service [-] privsep.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.462 158130 DEBUG oslo_service.service [-] privsep.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.462 158130 DEBUG oslo_service.service [-] privsep.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.462 158130 DEBUG oslo_service.service [-] privsep.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.462 158130 DEBUG oslo_service.service [-] privsep.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.462 158130 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.463 158130 DEBUG oslo_service.service [-] privsep_dhcp_release.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.463 158130 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.463 158130 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.463 158130 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.463 158130 DEBUG oslo_service.service [-] privsep_dhcp_release.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.464 158130 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.464 158130 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.464 158130 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.464 158130 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.464 158130 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.465 158130 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.465 158130 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.465 158130 DEBUG oslo_service.service [-] privsep_namespace.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.465 158130 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.465 158130 DEBUG oslo_service.service [-] privsep_namespace.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.466 158130 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.466 158130 DEBUG oslo_service.service [-] privsep_namespace.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.466 158130 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.466 158130 DEBUG oslo_service.service [-] privsep_conntrack.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.467 158130 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.467 158130 DEBUG oslo_service.service [-] privsep_conntrack.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.467 158130 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.467 158130 DEBUG oslo_service.service [-] privsep_conntrack.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.467 158130 DEBUG oslo_service.service [-] privsep_link.capabilities = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.468 158130 DEBUG oslo_service.service [-] privsep_link.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.468 158130 DEBUG oslo_service.service [-] privsep_link.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.468 158130 DEBUG oslo_service.service [-] privsep_link.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.468 158130 DEBUG oslo_service.service [-] privsep_link.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.468 158130 DEBUG oslo_service.service [-] privsep_link.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.469 158130 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.469 158130 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.469 158130 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.469 158130 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.470 158130 DEBUG oslo_service.service [-] AGENT.kill_scripts_path = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.470 158130 DEBUG oslo_service.service [-] AGENT.root_helper = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.470 158130 DEBUG oslo_service.service [-] AGENT.root_helper_daemon = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.470 158130 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.470 158130 DEBUG oslo_service.service [-] AGENT.use_random_fully = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.471 158130 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.471 158130 DEBUG oslo_service.service [-] QUOTAS.default_quota = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.471 158130 DEBUG oslo_service.service [-] QUOTAS.quota_driver = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.471 158130 DEBUG oslo_service.service [-] QUOTAS.quota_network = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.471 158130 DEBUG oslo_service.service [-] QUOTAS.quota_port = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.472 158130 DEBUG oslo_service.service [-] QUOTAS.quota_security_group = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.472 158130 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.472 158130 DEBUG oslo_service.service [-] QUOTAS.quota_subnet = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.472 158130 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.473 158130 DEBUG oslo_service.service [-] nova.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.473 158130 DEBUG oslo_service.service [-] nova.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.473 158130 DEBUG oslo_service.service [-] nova.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.473 158130 DEBUG oslo_service.service [-] nova.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.474 158130 DEBUG oslo_service.service [-] nova.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.474 158130 DEBUG oslo_service.service [-] nova.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.474 158130 DEBUG oslo_service.service [-] nova.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.474 158130 DEBUG oslo_service.service [-] nova.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.474 158130 DEBUG oslo_service.service [-] nova.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.475 158130 DEBUG oslo_service.service [-] nova.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.475 158130 DEBUG oslo_service.service [-] nova.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.475 158130 DEBUG oslo_service.service [-] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.475 158130 DEBUG oslo_service.service [-] placement.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.475 158130 DEBUG oslo_service.service [-] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.476 158130 DEBUG oslo_service.service [-] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.476 158130 DEBUG oslo_service.service [-] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.476 158130 DEBUG oslo_service.service [-] placement.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.476 158130 DEBUG oslo_service.service [-] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.476 158130 DEBUG oslo_service.service [-] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.477 158130 DEBUG oslo_service.service [-] placement.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.477 158130 DEBUG oslo_service.service [-] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.477 158130 DEBUG oslo_service.service [-] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28
09:23:55.477 158130 DEBUG oslo_service.service [-] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.478 158130 DEBUG oslo_service.service [-] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.478 158130 DEBUG oslo_service.service [-] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.478 158130 DEBUG oslo_service.service [-] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.478 158130 DEBUG oslo_service.service [-] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.478 158130 DEBUG oslo_service.service [-] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.478 158130 DEBUG oslo_service.service [-] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.479 158130 DEBUG oslo_service.service [-] ironic.enable_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.479 158130 DEBUG oslo_service.service [-] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost 
ovn_metadata_agent[158125]: 2025-11-28 09:23:55.479 158130 DEBUG oslo_service.service [-] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.479 158130 DEBUG oslo_service.service [-] ironic.interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.479 158130 DEBUG oslo_service.service [-] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.479 158130 DEBUG oslo_service.service [-] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.479 158130 DEBUG oslo_service.service [-] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.479 158130 DEBUG oslo_service.service [-] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.480 158130 DEBUG oslo_service.service [-] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.480 158130 DEBUG oslo_service.service [-] ironic.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.480 158130 DEBUG oslo_service.service [-] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost 
ovn_metadata_agent[158125]: 2025-11-28 09:23:55.480 158130 DEBUG oslo_service.service [-] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.480 158130 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.480 158130 DEBUG oslo_service.service [-] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.480 158130 DEBUG oslo_service.service [-] ironic.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.480 158130 DEBUG oslo_service.service [-] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.481 158130 DEBUG oslo_service.service [-] cli_script.dry_run = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.481 158130 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.481 158130 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.481 158130 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.481 158130 DEBUG oslo_service.service [-] ovn.dns_servers = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.481 158130 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.481 158130 DEBUG oslo_service.service [-] ovn.neutron_sync_mode = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.482 158130 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.482 158130 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.482 158130 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.482 158130 DEBUG oslo_service.service [-] ovn.ovn_l3_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.482 158130 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.482 158130 DEBUG oslo_service.service 
[-] ovn.ovn_metadata_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.482 158130 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.482 158130 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.483 158130 DEBUG oslo_service.service [-] ovn.ovn_nb_connection = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.483 158130 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.483 158130 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.483 158130 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.483 158130 DEBUG oslo_service.service [-] ovn.ovn_sb_connection = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.483 158130 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 
09:23:55.483 158130 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.483 158130 DEBUG oslo_service.service [-] ovn.ovsdb_log_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.484 158130 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.484 158130 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.484 158130 DEBUG oslo_service.service [-] ovn.vhost_sock_dir = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.484 158130 DEBUG oslo_service.service [-] ovn.vif_type = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.484 158130 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.484 158130 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.484 158130 DEBUG oslo_service.service [-] OVS.ovsdb_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 
localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.485 158130 DEBUG oslo_service.service [-] ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.485 158130 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.485 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.485 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.485 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.485 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.485 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.486 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.486 
158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.486 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.486 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.486 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.486 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.486 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.487 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.487 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.487 158130 DEBUG 
oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.487 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.487 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.487 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.487 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.488 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.488 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.488 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.488 158130 DEBUG 
oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.488 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.488 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.488 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.488 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.489 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.489 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.489 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.489 158130 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.489 158130 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.489 158130 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.489 158130 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.490 158130 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:23:55 localhost ovn_metadata_agent[158125]: 2025-11-28 09:23:55.490 158130 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Nov 28 04:23:56 localhost sshd[158270]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:23:56 localhost systemd-logind[764]: New session 52 of user zuul. Nov 28 04:23:57 localhost systemd[1]: Started Session 52 of User zuul. 
Nov 28 04:23:57 localhost python3.9[158363]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 28 04:23:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17377 DF PROTO=TCP SPT=34796 DPT=9101 SEQ=678617077 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1571890000000001030307) Nov 28 04:23:59 localhost python3.9[158459]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:24:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17379 DF PROTO=TCP SPT=34796 DPT=9101 SEQ=678617077 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB157D830000000001030307) Nov 28 04:24:01 localhost python3.9[158564]: ansible-ansible.legacy.command Invoked with _raw_params=podman stop nova_virtlogd _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:24:01 localhost systemd[1]: libpod-4ec37aacc81c30d8f8c1872da98bbae1184ac4d79565b18c976c336f5619bd9a.scope: Deactivated successfully. 
Nov 28 04:24:01 localhost podman[158565]: 2025-11-28 09:24:01.566576115 +0000 UTC m=+0.070989906 container died 4ec37aacc81c30d8f8c1872da98bbae1184ac4d79565b18c976c336f5619bd9a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64) Nov 28 04:24:01 localhost podman[158565]: 2025-11-28 09:24:01.612101348 +0000 UTC m=+0.116515129 container cleanup 4ec37aacc81c30d8f8c1872da98bbae1184ac4d79565b18c976c336f5619bd9a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, architecture=x86_64, 
com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-19T00:35:22Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044) Nov 28 04:24:01 localhost podman[158580]: 2025-11-28 09:24:01.65554264 +0000 UTC m=+0.083906459 container remove 4ec37aacc81c30d8f8c1872da98bbae1184ac4d79565b18c976c336f5619bd9a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, build-date=2025-11-19T00:35:22Z, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt) Nov 28 04:24:01 localhost systemd[1]: libpod-conmon-4ec37aacc81c30d8f8c1872da98bbae1184ac4d79565b18c976c336f5619bd9a.scope: Deactivated successfully. Nov 28 04:24:02 localhost systemd[1]: var-lib-containers-storage-overlay-2373ee3a3140a9b51206ab4b5b4d03e16415bec87f47ab14239ebc6234afc02f-merged.mount: Deactivated successfully. Nov 28 04:24:02 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4ec37aacc81c30d8f8c1872da98bbae1184ac4d79565b18c976c336f5619bd9a-userdata-shm.mount: Deactivated successfully. Nov 28 04:24:02 localhost python3.9[158687]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 28 04:24:02 localhost systemd[1]: Reloading. Nov 28 04:24:03 localhost systemd-sysv-generator[158718]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:24:03 localhost systemd-rc-local-generator[158715]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:24:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 28 04:24:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55235 DF PROTO=TCP SPT=43058 DPT=9102 SEQ=4203539387 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1586830000000001030307) Nov 28 04:24:04 localhost python3.9[158813]: ansible-ansible.builtin.service_facts Invoked Nov 28 04:24:04 localhost network[158830]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 28 04:24:04 localhost network[158831]: 'network-scripts' will be removed from distribution in near future. Nov 28 04:24:04 localhost network[158832]: It is advised to switch to 'NetworkManager' instead for network management. Nov 28 04:24:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36086 DF PROTO=TCP SPT=50676 DPT=9102 SEQ=2753317606 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1591830000000001030307) Nov 28 04:24:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:24:08 localhost python3.9[159033]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:24:08 localhost systemd[1]: Reloading. Nov 28 04:24:08 localhost systemd-sysv-generator[159067]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 28 04:24:08 localhost systemd-rc-local-generator[159064]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:24:08 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:24:08 localhost systemd[1]: Stopped target tripleo_nova_libvirt.target. Nov 28 04:24:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19400 DF PROTO=TCP SPT=57738 DPT=9100 SEQ=970466610 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB159D820000000001030307) Nov 28 04:24:10 localhost python3.9[159164]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:24:11 localhost python3.9[159257]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:24:11 localhost python3.9[159350]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:24:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63320 DF PROTO=TCP SPT=46898 DPT=9100 SEQ=3905351516 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB15A9C20000000001030307) Nov 28 04:24:12 localhost python3.9[159443]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped 
daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:24:13 localhost python3.9[159536]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:24:14 localhost python3.9[159629]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:24:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:24:15 localhost podman[159645]: 2025-11-28 09:24:15.858635433 +0000 UTC m=+0.089528935 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true) Nov 28 04:24:15 localhost podman[159645]: 2025-11-28 09:24:15.943329408 +0000 UTC m=+0.174222880 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible) Nov 28 04:24:15 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 04:24:17 localhost python3.9[159747]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:24:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32600 DF PROTO=TCP SPT=56834 DPT=9882 SEQ=1286129200 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB15BC420000000001030307) Nov 28 04:24:17 localhost python3.9[159839]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:24:18 localhost python3.9[159931]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:24:18 localhost python3.9[160023]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:24:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64366 DF PROTO=TCP SPT=37896 DPT=9105 SEQ=1913146142 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB15C3C30000000001030307) Nov 28 04:24:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:24:19 localhost systemd[1]: tmp-crun.C3no49.mount: Deactivated successfully. Nov 28 04:24:19 localhost podman[160116]: 2025-11-28 09:24:19.493044142 +0000 UTC m=+0.087090140 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 
'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent) Nov 28 04:24:19 localhost podman[160116]: 2025-11-28 09:24:19.501410585 +0000 UTC m=+0.095456583 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', 
'/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Nov 28 04:24:19 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:24:19 localhost python3.9[160115]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:24:20 localhost python3.9[160224]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:24:20 localhost python3.9[160316]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:24:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64367 DF PROTO=TCP SPT=37896 DPT=9105 SEQ=1913146142 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB15CBC20000000001030307) Nov 28 04:24:21 localhost python3.9[160408]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:24:22 localhost python3.9[160500]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:24:22 localhost python3.9[160592]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None 
Nov 28 04:24:23 localhost python3.9[160684]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:24:23 localhost python3.9[160776]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:24:24 localhost python3.9[160868]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:24:25 localhost python3.9[160960]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:24:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c 
MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64368 DF PROTO=TCP SPT=37896 DPT=9105 SEQ=1913146142 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB15DB830000000001030307) Nov 28 04:24:25 localhost python3.9[161052]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:24:26 localhost python3.9[161144]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Nov 28 04:24:27 localhost python3.9[161236]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 28 04:24:27 localhost systemd[1]: Reloading. Nov 28 04:24:27 localhost systemd-rc-local-generator[161263]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:24:27 localhost systemd-sysv-generator[161266]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:24:27 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 28 04:24:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58530 DF PROTO=TCP SPT=33760 DPT=9101 SEQ=3918545985 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB15E6B90000000001030307) Nov 28 04:24:28 localhost python3.9[161364]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:24:29 localhost python3.9[161457]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:24:30 localhost python3.9[161550]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:24:30 localhost python3.9[161643]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:24:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58532 DF PROTO=TCP SPT=33760 DPT=9101 SEQ=3918545985 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080ACB15F2C20000000001030307) Nov 28 04:24:31 localhost python3.9[161736]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:24:31 localhost python3.9[161829]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:24:32 localhost python3.9[161953]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:24:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64369 DF PROTO=TCP SPT=37896 DPT=9105 SEQ=1913146142 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB15FB820000000001030307) Nov 28 04:24:36 localhost python3.9[162092]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None Nov 28 04:24:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37519 DF PROTO=TCP SPT=59926 DPT=9100 SEQ=490293883 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1607420000000001030307) Nov 28 04:24:36 localhost python3.9[162185]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt 
state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None Nov 28 04:24:38 localhost python3.9[162283]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005538513.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None Nov 28 04:24:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61335 DF PROTO=TCP SPT=47258 DPT=9100 SEQ=1542535815 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1613820000000001030307) Nov 28 04:24:39 localhost python3.9[162383]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 28 04:24:40 localhost python3.9[162437]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False 
skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 28 04:24:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37521 DF PROTO=TCP SPT=59926 DPT=9100 SEQ=490293883 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB161F020000000001030307) Nov 28 04:24:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:24:46 localhost podman[162507]: 2025-11-28 09:24:46.859427347 +0000 UTC m=+0.092609495 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, 
container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:24:46 localhost podman[162507]: 2025-11-28 09:24:46.898105464 +0000 UTC m=+0.131287662 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 28 04:24:46 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 04:24:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36853 DF PROTO=TCP SPT=49260 DPT=9882 SEQ=3662475890 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1631820000000001030307) Nov 28 04:24:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59358 DF PROTO=TCP SPT=54070 DPT=9105 SEQ=75823109 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1639020000000001030307) Nov 28 04:24:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:24:49 localhost systemd[1]: tmp-crun.OvGzIY.mount: Deactivated successfully. Nov 28 04:24:49 localhost podman[162537]: 2025-11-28 09:24:49.870511609 +0000 UTC m=+0.107849058 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true) Nov 28 04:24:49 localhost podman[162537]: 2025-11-28 09:24:49.880360124 +0000 UTC m=+0.117697603 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', 
'/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 04:24:49 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:24:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:24:50.776 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:24:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:24:50.777 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:24:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:24:50.778 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:24:51 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 
MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59359 DF PROTO=TCP SPT=54070 DPT=9105 SEQ=75823109 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1641020000000001030307) Nov 28 04:24:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59360 DF PROTO=TCP SPT=54070 DPT=9105 SEQ=75823109 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1650C20000000001030307) Nov 28 04:24:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28509 DF PROTO=TCP SPT=36002 DPT=9101 SEQ=938269270 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB165BEA0000000001030307) Nov 28 04:25:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28511 DF PROTO=TCP SPT=36002 DPT=9101 SEQ=938269270 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1668020000000001030307) Nov 28 04:25:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59478 DF PROTO=TCP SPT=37558 DPT=9102 SEQ=1286343932 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1670C20000000001030307) Nov 28 04:25:05 localhost kernel: SELinux: Converting 2759 SID table entries... Nov 28 04:25:05 localhost kernel: SELinux: Context system_u:object_r:insights_client_cache_t:s0 became invalid (unmapped). 
Nov 28 04:25:05 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 28 04:25:05 localhost kernel: SELinux: policy capability open_perms=1 Nov 28 04:25:05 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 28 04:25:05 localhost kernel: SELinux: policy capability always_check_network=0 Nov 28 04:25:05 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 28 04:25:05 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 28 04:25:05 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 28 04:25:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52815 DF PROTO=TCP SPT=37692 DPT=9100 SEQ=1592430692 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB167C820000000001030307) Nov 28 04:25:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63323 DF PROTO=TCP SPT=46898 DPT=9100 SEQ=3905351516 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1687830000000001030307) Nov 28 04:25:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52817 DF PROTO=TCP SPT=37692 DPT=9100 SEQ=1592430692 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1694420000000001030307) Nov 28 04:25:16 localhost kernel: SELinux: Converting 2762 SID table entries... 
Nov 28 04:25:16 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 28 04:25:16 localhost kernel: SELinux: policy capability open_perms=1 Nov 28 04:25:16 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 28 04:25:16 localhost kernel: SELinux: policy capability always_check_network=0 Nov 28 04:25:16 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 28 04:25:16 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 28 04:25:16 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 28 04:25:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29416 DF PROTO=TCP SPT=43222 DPT=9882 SEQ=4111102986 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB16A6C30000000001030307) Nov 28 04:25:17 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=20 res=1 Nov 28 04:25:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. 
Nov 28 04:25:17 localhost podman[163610]: 2025-11-28 09:25:17.85744877 +0000 UTC m=+0.082188948 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible) Nov 28 04:25:17 localhost podman[163610]: 2025-11-28 09:25:17.941868112 +0000 UTC m=+0.166608310 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 04:25:17 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. Nov 28 04:25:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46223 DF PROTO=TCP SPT=46288 DPT=9105 SEQ=3972451082 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB16AE420000000001030307) Nov 28 04:25:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:25:20 localhost systemd[1]: tmp-crun.95slex.mount: Deactivated successfully. 
Nov 28 04:25:20 localhost podman[163635]: 2025-11-28 09:25:20.84336367 +0000 UTC m=+0.084760606 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Nov 28 04:25:20 localhost podman[163635]: 2025-11-28 09:25:20.879581963 +0000 UTC 
m=+0.120978929 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent) Nov 28 04:25:20 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 04:25:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46224 DF PROTO=TCP SPT=46288 DPT=9105 SEQ=3972451082 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB16B6430000000001030307) Nov 28 04:25:24 localhost kernel: SELinux: Converting 2762 SID table entries... Nov 28 04:25:24 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 28 04:25:24 localhost kernel: SELinux: policy capability open_perms=1 Nov 28 04:25:24 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 28 04:25:24 localhost kernel: SELinux: policy capability always_check_network=0 Nov 28 04:25:24 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 28 04:25:24 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 28 04:25:24 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 28 04:25:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46225 DF PROTO=TCP SPT=46288 DPT=9105 SEQ=3972451082 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB16C6030000000001030307) Nov 28 04:25:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11502 DF PROTO=TCP SPT=50306 DPT=9101 SEQ=2286596531 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB16D1190000000001030307) Nov 28 04:25:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11504 DF PROTO=TCP SPT=50306 DPT=9101 SEQ=2286596531 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080ACB16DD020000000001030307) Nov 28 04:25:32 localhost kernel: SELinux: Converting 2762 SID table entries... Nov 28 04:25:32 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 28 04:25:32 localhost kernel: SELinux: policy capability open_perms=1 Nov 28 04:25:32 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 28 04:25:32 localhost kernel: SELinux: policy capability always_check_network=0 Nov 28 04:25:32 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 28 04:25:32 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 28 04:25:32 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 28 04:25:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46226 DF PROTO=TCP SPT=46288 DPT=9105 SEQ=3972451082 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB16E5830000000001030307) Nov 28 04:25:33 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=22 res=1 Nov 28 04:25:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41817 DF PROTO=TCP SPT=42968 DPT=9100 SEQ=483642328 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB16F1820000000001030307) Nov 28 04:25:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37524 DF PROTO=TCP SPT=59926 DPT=9100 SEQ=490293883 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB16FD820000000001030307) Nov 28 04:25:41 localhost kernel: SELinux: Converting 2762 SID table entries... 
Nov 28 04:25:41 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 28 04:25:41 localhost kernel: SELinux: policy capability open_perms=1 Nov 28 04:25:41 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 28 04:25:41 localhost kernel: SELinux: policy capability always_check_network=0 Nov 28 04:25:41 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 28 04:25:41 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 28 04:25:41 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 28 04:25:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41819 DF PROTO=TCP SPT=42968 DPT=9100 SEQ=483642328 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1709420000000001030307) Nov 28 04:25:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5423 DF PROTO=TCP SPT=37314 DPT=9882 SEQ=1941038903 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB171BC20000000001030307) Nov 28 04:25:48 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=23 res=1 Nov 28 04:25:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:25:48 localhost systemd[1]: tmp-crun.bMFeha.mount: Deactivated successfully. 
Nov 28 04:25:48 localhost podman[163766]: 2025-11-28 09:25:48.853275703 +0000 UTC m=+0.089103333 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:25:48 localhost podman[163766]: 2025-11-28 09:25:48.930430265 +0000 UTC m=+0.166257925 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller) Nov 28 04:25:48 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. Nov 28 04:25:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18234 DF PROTO=TCP SPT=54654 DPT=9105 SEQ=947938433 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1723820000000001030307) Nov 28 04:25:49 localhost kernel: SELinux: Converting 2762 SID table entries... 
Nov 28 04:25:49 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 28 04:25:49 localhost kernel: SELinux: policy capability open_perms=1 Nov 28 04:25:49 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 28 04:25:49 localhost kernel: SELinux: policy capability always_check_network=0 Nov 28 04:25:49 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 28 04:25:49 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 28 04:25:49 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 28 04:25:50 localhost systemd[1]: Reloading. Nov 28 04:25:50 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=24 res=1 Nov 28 04:25:50 localhost systemd-rc-local-generator[163819]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:25:50 localhost systemd-sysv-generator[163823]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:25:50 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 28 04:25:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:25:50.777 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:25:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:25:50.779 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:25:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:25:50.781 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:25:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:25:50 localhost systemd[1]: Reloading. Nov 28 04:25:51 localhost systemd-rc-local-generator[163875]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:25:51 localhost systemd-sysv-generator[163878]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 28 04:25:51 localhost podman[163838]: 2025-11-28 09:25:51.035091492 +0000 UTC m=+0.094158080 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:25:51 localhost podman[163838]: 2025-11-28 09:25:51.075406563 +0000 UTC 
m=+0.134473111 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible) Nov 28 04:25:51 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Nov 28 04:25:51 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18235 DF PROTO=TCP SPT=54654 DPT=9105 SEQ=947938433 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB172B820000000001030307) Nov 28 04:25:51 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:25:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18236 DF PROTO=TCP SPT=54654 DPT=9105 SEQ=947938433 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB173B420000000001030307) Nov 28 04:25:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2146 DF PROTO=TCP SPT=44362 DPT=9101 SEQ=87046848 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB17464A0000000001030307) Nov 28 04:25:59 localhost kernel: SELinux: Converting 2763 SID table entries... 
Nov 28 04:25:59 localhost kernel: SELinux: policy capability network_peer_controls=1 Nov 28 04:25:59 localhost kernel: SELinux: policy capability open_perms=1 Nov 28 04:25:59 localhost kernel: SELinux: policy capability extended_socket_class=1 Nov 28 04:25:59 localhost kernel: SELinux: policy capability always_check_network=0 Nov 28 04:25:59 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Nov 28 04:25:59 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Nov 28 04:25:59 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Nov 28 04:26:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2148 DF PROTO=TCP SPT=44362 DPT=9101 SEQ=87046848 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1752420000000001030307) Nov 28 04:26:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58586 DF PROTO=TCP SPT=53808 DPT=9102 SEQ=2849910390 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB175B420000000001030307) Nov 28 04:26:03 localhost dbus-broker-launch[750]: Noticed file-system modification, trigger reload. 
Nov 28 04:26:03 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=25 res=1 Nov 28 04:26:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16205 DF PROTO=TCP SPT=59800 DPT=9100 SEQ=1526083829 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1766C20000000001030307) Nov 28 04:26:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58588 DF PROTO=TCP SPT=53808 DPT=9102 SEQ=2849910390 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1773020000000001030307) Nov 28 04:26:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16207 DF PROTO=TCP SPT=59800 DPT=9100 SEQ=1526083829 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB177E830000000001030307) Nov 28 04:26:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57711 DF PROTO=TCP SPT=47390 DPT=9882 SEQ=4214922872 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1791020000000001030307) Nov 28 04:26:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6058 DF PROTO=TCP SPT=43982 DPT=9105 SEQ=1558679513 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1798820000000001030307) Nov 28 04:26:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. 
Nov 28 04:26:19 localhost systemd[1]: tmp-crun.FO9pth.mount: Deactivated successfully. Nov 28 04:26:19 localhost podman[166410]: 2025-11-28 09:26:19.866513566 +0000 UTC m=+0.091962013 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 28 04:26:19 localhost podman[166410]: 2025-11-28 09:26:19.932707538 +0000 UTC m=+0.158155975 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible) Nov 28 04:26:19 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. Nov 28 04:26:21 localhost sshd[167326]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:26:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6059 DF PROTO=TCP SPT=43982 DPT=9105 SEQ=1558679513 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB17A0830000000001030307) Nov 28 04:26:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. 
Nov 28 04:26:21 localhost podman[167796]: 2025-11-28 09:26:21.625074852 +0000 UTC m=+0.081332733 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent) Nov 28 04:26:21 localhost podman[167796]: 2025-11-28 09:26:21.660352255 +0000 UTC 
m=+0.116610126 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 28 04:26:21 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 04:26:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6060 DF PROTO=TCP SPT=43982 DPT=9105 SEQ=1558679513 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB17B0420000000001030307) Nov 28 04:26:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3556 DF PROTO=TCP SPT=35458 DPT=9101 SEQ=665330896 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB17BB7A0000000001030307) Nov 28 04:26:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3558 DF PROTO=TCP SPT=35458 DPT=9101 SEQ=665330896 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB17C7830000000001030307) Nov 28 04:26:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27991 DF PROTO=TCP SPT=35778 DPT=9102 SEQ=1355455547 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB17D0820000000001030307) Nov 28 04:26:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36223 DF PROTO=TCP SPT=33558 DPT=9102 SEQ=1782435790 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB17DB830000000001030307) Nov 28 04:26:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41822 DF PROTO=TCP SPT=42968 DPT=9100 SEQ=483642328 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080ACB17E7830000000001030307) Nov 28 04:26:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48818 DF PROTO=TCP SPT=47006 DPT=9100 SEQ=2140497574 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB17F3C20000000001030307) Nov 28 04:26:45 localhost systemd[1]: Stopping OpenSSH server daemon... Nov 28 04:26:45 localhost systemd[1]: sshd.service: Deactivated successfully. Nov 28 04:26:45 localhost systemd[1]: Stopped OpenSSH server daemon. Nov 28 04:26:45 localhost systemd[1]: sshd.service: Consumed 1.138s CPU time, read 32.0K from disk, written 0B to disk. Nov 28 04:26:45 localhost systemd[1]: Stopped target sshd-keygen.target. Nov 28 04:26:45 localhost systemd[1]: Stopping sshd-keygen.target... Nov 28 04:26:45 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Nov 28 04:26:45 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Nov 28 04:26:45 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Nov 28 04:26:45 localhost systemd[1]: Reached target sshd-keygen.target. Nov 28 04:26:45 localhost systemd[1]: Starting OpenSSH server daemon... Nov 28 04:26:45 localhost sshd[181730]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:26:45 localhost systemd[1]: Started OpenSSH server daemon. 
Nov 28 04:26:46 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:46 localhost systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:46 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:46 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:46 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:46 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:46 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:46 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:46 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:46 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:46 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:46 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:47 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:47 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:47 localhost 
kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50588 DF PROTO=TCP SPT=50622 DPT=9882 SEQ=2933895119 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1806420000000001030307) Nov 28 04:26:47 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:47 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:47 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Nov 28 04:26:47 localhost systemd[1]: Starting man-db-cache-update.service... Nov 28 04:26:47 localhost systemd[1]: Reloading. Nov 28 04:26:48 localhost systemd-rc-local-generator[181960]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:26:48 localhost systemd-sysv-generator[181963]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:26:48 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:48 localhost systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:48 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 28 04:26:48 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:48 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:48 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:48 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:48 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:48 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:48 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:48 localhost systemd[1]: Queuing reload/restart jobs for marked units… Nov 28 04:26:48 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Nov 28 04:26:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25237 DF PROTO=TCP SPT=53944 DPT=9105 SEQ=2637639503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB180DC20000000001030307) Nov 28 04:26:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. 
Nov 28 04:26:50 localhost podman[185156]: 2025-11-28 09:26:50.702326261 +0000 UTC m=+0.176167188 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.schema-version=1.0) Nov 28 04:26:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:26:50.778 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:26:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:26:50.778 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by 
"neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:26:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:26:50.779 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:26:50 localhost podman[185156]: 2025-11-28 09:26:50.790451621 +0000 UTC m=+0.264292548 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Nov 28 04:26:50 localhost 
systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. Nov 28 04:26:51 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25238 DF PROTO=TCP SPT=53944 DPT=9105 SEQ=2637639503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1815C20000000001030307) Nov 28 04:26:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:26:51 localhost podman[186458]: 2025-11-28 09:26:51.843012487 +0000 UTC m=+0.080521781 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Nov 28 04:26:51 localhost podman[186458]: 2025-11-28 09:26:51.879417995 +0000 UTC m=+0.116927339 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 04:26:51 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:26:54 localhost python3.9[188170]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Nov 28 04:26:54 localhost systemd[1]: Reloading. Nov 28 04:26:54 localhost systemd-sysv-generator[188427]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:26:54 localhost systemd-rc-local-generator[188420]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:26:54 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:54 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 28 04:26:54 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:54 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:54 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:54 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:54 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:54 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:54 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:26:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25239 DF PROTO=TCP SPT=53944 DPT=9105 SEQ=2637639503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1825820000000001030307) Nov 28 04:26:55 localhost python3.9[188895]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Nov 28 04:26:55 localhost systemd[1]: Reloading. Nov 28 04:26:55 localhost systemd-sysv-generator[189125]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 28 04:26:55 localhost systemd-rc-local-generator[189122]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 04:26:55 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 04:26:55 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 04:26:55 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 04:26:55 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:26:55 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:26:55 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:26:55 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:26:55 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:26:55 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:26:56 localhost python3.9[189584]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 04:26:56 localhost systemd[1]: Reloading.
Nov 28 04:26:56 localhost systemd-rc-local-generator[189828]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 04:26:56 localhost systemd-sysv-generator[189832]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 04:26:56 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 04:26:56 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 04:26:56 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 04:26:56 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:26:56 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:26:56 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:26:56 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:26:56 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:26:56 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:26:57 localhost python3.9[190256]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 04:26:57 localhost systemd[1]: Reloading.
Nov 28 04:26:57 localhost systemd-sysv-generator[190487]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 04:26:57 localhost systemd-rc-local-generator[190482]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 04:26:57 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 04:26:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 04:26:57 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 04:26:57 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:26:57 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:26:57 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:26:57 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:26:57 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:26:57 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:26:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4815 DF PROTO=TCP SPT=38690 DPT=9101 SEQ=3405516526 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1830A90000000001030307)
Nov 28 04:26:59 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 04:26:59 localhost systemd[1]: Finished man-db-cache-update.service.
Nov 28 04:26:59 localhost systemd[1]: man-db-cache-update.service: Consumed 14.292s CPU time.
Nov 28 04:26:59 localhost systemd[1]: run-r13ed906980034e2db02f77ee5be26132.service: Deactivated successfully.
Nov 28 04:26:59 localhost systemd[1]: run-r123d8513e85e4ee8a996436ef10f2026.service: Deactivated successfully.
Nov 28 04:27:00 localhost python3.9[191418]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 04:27:00 localhost systemd[1]: Reloading.
Nov 28 04:27:00 localhost systemd-sysv-generator[191460]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 04:27:00 localhost systemd-rc-local-generator[191456]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 04:27:00 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 04:27:00 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:00 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:00 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:00 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:00 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:00 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:00 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4817 DF PROTO=TCP SPT=38690 DPT=9101 SEQ=3405516526 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB183CC20000000001030307)
Nov 28 04:27:01 localhost python3.9[191579]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 04:27:01 localhost systemd[1]: Reloading.
Nov 28 04:27:02 localhost systemd-sysv-generator[191613]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 04:27:02 localhost systemd-rc-local-generator[191606]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 04:27:02 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:02 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 04:27:02 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:02 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:02 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:02 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:02 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:02 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:03 localhost python3.9[191728]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 04:27:03 localhost systemd[1]: Reloading.
Nov 28 04:27:03 localhost systemd-rc-local-generator[191757]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 04:27:03 localhost systemd-sysv-generator[191762]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 04:27:03 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:03 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:03 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 04:27:03 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:03 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:03 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:03 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:03 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25240 DF PROTO=TCP SPT=53944 DPT=9105 SEQ=2637639503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1845820000000001030307)
Nov 28 04:27:04 localhost python3.9[191877]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 04:27:05 localhost python3.9[191990]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 04:27:05 localhost systemd[1]: Reloading.
Nov 28 04:27:05 localhost systemd-sysv-generator[192020]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 04:27:05 localhost systemd-rc-local-generator[192017]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 04:27:05 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:05 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:05 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:05 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:05 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 04:27:05 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:05 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:05 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:05 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33036 DF PROTO=TCP SPT=37890 DPT=9100 SEQ=2342632123 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1851430000000001030307)
Nov 28 04:27:09 localhost python3.9[192139]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 04:27:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13890 DF PROTO=TCP SPT=41638 DPT=9102 SEQ=1384361588 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB185D420000000001030307)
Nov 28 04:27:09 localhost systemd[1]: Reloading.
Nov 28 04:27:09 localhost systemd-sysv-generator[192167]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 04:27:09 localhost systemd-rc-local-generator[192163]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:27:10 localhost python3.9[192287]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 04:27:11 localhost python3.9[192400]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 04:27:12 localhost python3.9[192513]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 04:27:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33038 DF PROTO=TCP SPT=37890 DPT=9100 SEQ=2342632123 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1869030000000001030307)
Nov 28 04:27:14 localhost python3.9[192626]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 04:27:15 localhost python3.9[192739]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 04:27:16 localhost python3.9[192852]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 04:27:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43654 DF PROTO=TCP SPT=33186 DPT=9882 SEQ=1702076260 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB187B830000000001030307)
Nov 28 04:27:17 localhost python3.9[192965]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 04:27:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31733 DF PROTO=TCP SPT=60190 DPT=9105 SEQ=955319082 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1883020000000001030307)
Nov 28 04:27:19 localhost python3.9[193078]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 04:27:20 localhost python3.9[193191]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 04:27:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31734 DF PROTO=TCP SPT=60190 DPT=9105 SEQ=955319082 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB188B030000000001030307)
Nov 28 04:27:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 04:27:21 localhost podman[193195]: 2025-11-28 09:27:21.585179181 +0000 UTC m=+0.093791437 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 04:27:21 localhost podman[193195]: 2025-11-28 09:27:21.650516962 +0000 UTC m=+0.159129208 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 28 04:27:21 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 04:27:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 04:27:22 localhost podman[193328]: 2025-11-28 09:27:22.047486134 +0000 UTC m=+0.078690121 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 28 04:27:22 localhost podman[193328]: 2025-11-28 09:27:22.08747199 +0000 UTC m=+0.118675927 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 04:27:22 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 04:27:22 localhost python3.9[193329]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 04:27:23 localhost python3.9[193458]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 04:27:24 localhost python3.9[193571]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 04:27:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31735 DF PROTO=TCP SPT=60190 DPT=9105 SEQ=955319082 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB189AC20000000001030307)
Nov 28 04:27:25 localhost python3.9[193684]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 04:27:26 localhost python3.9[193797]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 04:27:27 localhost python3.9[193910]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 04:27:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9310 DF PROTO=TCP SPT=39628 DPT=9101 SEQ=1021539114 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB18A5DA0000000001030307)
Nov 28 04:27:28 localhost python3.9[194020]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 04:27:29 localhost python3.9[194130]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 04:27:29 localhost python3.9[194240]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 04:27:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9312 DF PROTO=TCP SPT=39628 DPT=9101 SEQ=1021539114 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB18B2030000000001030307)
Nov 28 04:27:31 localhost python3.9[194350]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 04:27:31 localhost python3.9[194460]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 04:27:32 localhost python3.9[194570]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:27:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61812 DF PROTO=TCP SPT=44796 DPT=9102 SEQ=3347864527 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB18BAC30000000001030307)
Nov 28 04:27:33 localhost python3.9[194660]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764322052.1793644-1643-204975312148841/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:27:34 localhost python3.9[194770]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:27:34 localhost python3.9[194860]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764322053.6997561-1643-215479033003466/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:27:35 localhost python3.9[194970]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:27:36 localhost python3.9[195060]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764322055.1242456-1643-37319029699801/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
attributes=None Nov 28 04:27:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1155 DF PROTO=TCP SPT=37064 DPT=9100 SEQ=1146749385 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB18C6420000000001030307) Nov 28 04:27:36 localhost python3.9[195170]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:27:37 localhost python3.9[195260]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764322056.2293422-1643-148685320685147/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:27:37 localhost python3.9[195406]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:27:38 localhost python3.9[195528]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764322057.3880415-1643-30885881471837/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=8d9b2057482987a531d808ceb2ac4bc7d43bf17c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:27:39 localhost python3.9[195656]: 
ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:27:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48821 DF PROTO=TCP SPT=47006 DPT=9100 SEQ=2140497574 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB18D1820000000001030307) Nov 28 04:27:39 localhost python3.9[195746]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764322058.5308483-1643-40897026161398/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:27:40 localhost python3.9[195856]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:27:41 localhost python3.9[195944]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764322059.7896645-1643-25617959246943/.source.conf follow=False _original_basename=auth.conf checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:27:42 localhost python3.9[196054]: ansible-ansible.legacy.stat Invoked with 
path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:27:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1157 DF PROTO=TCP SPT=37064 DPT=9100 SEQ=1146749385 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB18DE020000000001030307) Nov 28 04:27:42 localhost python3.9[196144]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764322061.8411763-1643-275230523206812/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:27:43 localhost python3.9[196254]: ansible-ansible.builtin.file Invoked with path=/etc/libvirt/passwd.db state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:27:44 localhost python3.9[196364]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:27:45 localhost python3.9[196474]: 
ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:27:45 localhost python3.9[196584]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:27:46 localhost python3.9[196694]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:27:47 localhost python3.9[196804]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:27:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 
TOS=0x00 PREC=0x00 TTL=62 ID=21379 DF PROTO=TCP SPT=47250 DPT=9882 SEQ=3527134190 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB18F0820000000001030307) Nov 28 04:27:47 localhost python3.9[196914]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:27:48 localhost python3.9[197024]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:27:48 localhost python3.9[197134]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:27:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59164 DF PROTO=TCP SPT=35670 DPT=9105 SEQ=4024786643 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB18F8420000000001030307) Nov 28 04:27:49 localhost python3.9[197244]: ansible-ansible.builtin.file Invoked with 
group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:27:50 localhost python3.9[197354]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:27:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:27:50.779 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:27:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:27:50.780 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:27:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:27:50.782 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:27:50 localhost python3.9[197464]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root 
path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:27:51 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59165 DF PROTO=TCP SPT=35670 DPT=9105 SEQ=4024786643 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1900420000000001030307) Nov 28 04:27:51 localhost python3.9[197574]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:27:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. 
Nov 28 04:27:51 localhost podman[197617]: 2025-11-28 09:27:51.853324343 +0000 UTC m=+0.087123405 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 04:27:51 localhost podman[197617]: 2025-11-28 09:27:51.923908838 +0000 UTC m=+0.157707940 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 04:27:51 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 04:27:52 localhost python3.9[197709]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:27:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 04:27:52 localhost podman[197820]: 2025-11-28 09:27:52.807753783 +0000 UTC m=+0.072140814 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 04:27:52 localhost podman[197820]: 2025-11-28 09:27:52.843310198 +0000 UTC m=+0.107697149 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 04:27:52 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 04:27:52 localhost python3.9[197819]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:27:53 localhost python3.9[197947]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:27:55 localhost python3.9[198035]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322073.3285704-2306-277679658601667/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:27:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59166 DF PROTO=TCP SPT=35670 DPT=9105 SEQ=4024786643 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1910020000000001030307)
Nov 28 04:27:55 localhost python3.9[198145]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:27:57 localhost python3.9[198233]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322075.3073146-2306-273706696229909/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:27:57 localhost python3.9[198343]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:27:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8493 DF PROTO=TCP SPT=45770 DPT=9101 SEQ=2243947299 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB191B090000000001030307)
Nov 28 04:27:58 localhost python3.9[198431]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322077.3913124-2306-269691063136303/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:27:59 localhost python3.9[198541]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:27:59 localhost python3.9[198629]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322078.6479716-2306-144957297635360/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:28:00 localhost python3.9[198739]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:28:00 localhost python3.9[198827]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322079.8203244-2306-92788893361626/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:28:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8495 DF PROTO=TCP SPT=45770 DPT=9101 SEQ=2243947299 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1927020000000001030307)
Nov 28 04:28:01 localhost python3.9[198937]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:28:02 localhost python3.9[199025]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322081.0852919-2306-229161486077059/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:28:02 localhost python3.9[199135]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:28:03 localhost python3.9[199223]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322082.2459536-2306-41145762995919/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:28:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59167 DF PROTO=TCP SPT=35670 DPT=9105 SEQ=4024786643 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB192F820000000001030307)
Nov 28 04:28:03 localhost python3.9[199333]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:28:04 localhost python3.9[199421]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322083.4281278-2306-30012057901279/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:28:05 localhost python3.9[199531]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:28:05 localhost python3.9[199619]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322084.7025983-2306-39272947968528/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:28:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13893 DF PROTO=TCP SPT=41638 DPT=9102 SEQ=1384361588 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB193B820000000001030307)
Nov 28 04:28:06 localhost python3.9[199729]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:28:06 localhost python3.9[199817]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322085.9570284-2306-52237337935485/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:28:07 localhost python3.9[199927]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:28:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33041 DF PROTO=TCP SPT=37890 DPT=9100 SEQ=2342632123 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1947820000000001030307)
Nov 28 04:28:09 localhost python3.9[200015]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322087.122525-2306-32809047670558/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:28:10 localhost python3.9[200125]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:28:10 localhost python3.9[200213]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322089.681376-2306-266653880391226/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:28:11 localhost python3.9[200323]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:28:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45370 DF PROTO=TCP SPT=43214 DPT=9100 SEQ=785439338 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1953420000000001030307)
Nov 28 04:28:12 localhost python3.9[200411]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322091.4093482-2306-220572158321127/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None
directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:28:13 localhost python3.9[200521]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:28:13 localhost python3.9[200609]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322092.648587-2306-23094986687716/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:28:14 localhost python3.9[200717]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:28:15 localhost python3.9[200830]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False Nov 28 04:28:16 localhost python3.9[200940]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 28 04:28:16 localhost systemd[1]: Reloading. Nov 28 04:28:16 localhost systemd-rc-local-generator[200964]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 28 04:28:16 localhost systemd-sysv-generator[200969]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:28:16 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:16 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:16 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:16 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:16 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:28:16 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:16 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:16 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:16 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:16 localhost systemd[1]: Starting libvirt logging daemon socket... Nov 28 04:28:16 localhost systemd[1]: Listening on libvirt logging daemon socket. Nov 28 04:28:16 localhost systemd[1]: Starting libvirt logging daemon admin socket... Nov 28 04:28:16 localhost systemd[1]: Listening on libvirt logging daemon admin socket. 
Nov 28 04:28:16 localhost systemd[1]: Starting libvirt logging daemon... Nov 28 04:28:16 localhost systemd[1]: Started libvirt logging daemon. Nov 28 04:28:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=652 DF PROTO=TCP SPT=60246 DPT=9882 SEQ=806108939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1965C20000000001030307) Nov 28 04:28:17 localhost python3.9[201092]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 28 04:28:17 localhost systemd[1]: Reloading. Nov 28 04:28:17 localhost systemd-sysv-generator[201120]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:28:17 localhost systemd-rc-local-generator[201115]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:28:17 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:17 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:17 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:17 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:17 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 28 04:28:17 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:17 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:17 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:17 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:18 localhost systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs... Nov 28 04:28:18 localhost systemd[1]: Starting libvirt nodedev daemon socket... Nov 28 04:28:18 localhost systemd[1]: Listening on libvirt nodedev daemon socket. Nov 28 04:28:18 localhost systemd[1]: Starting libvirt nodedev daemon admin socket... Nov 28 04:28:18 localhost systemd[1]: Starting libvirt nodedev daemon read-only socket... Nov 28 04:28:18 localhost systemd[1]: Listening on libvirt nodedev daemon admin socket. Nov 28 04:28:18 localhost systemd[1]: Listening on libvirt nodedev daemon read-only socket. Nov 28 04:28:18 localhost systemd[1]: Started libvirt nodedev daemon. Nov 28 04:28:18 localhost systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs. Nov 28 04:28:18 localhost setroubleshoot[201129]: Deleting alert 96d97920-1546-4f45-b9c9-d0d51c7a6a1d, it is allowed in current policy Nov 28 04:28:18 localhost systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service. Nov 28 04:28:18 localhost python3.9[201274]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 28 04:28:18 localhost systemd[1]: Reloading. 
Nov 28 04:28:19 localhost systemd-rc-local-generator[201300]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:28:19 localhost systemd-sysv-generator[201305]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:28:19 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:19 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:19 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:19 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:19 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 28 04:28:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48422 DF PROTO=TCP SPT=44766 DPT=9105 SEQ=3785523750 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB196D420000000001030307) Nov 28 04:28:19 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:19 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:19 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:19 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:19 localhost systemd[1]: Starting libvirt proxy daemon socket... Nov 28 04:28:19 localhost systemd[1]: Listening on libvirt proxy daemon socket. Nov 28 04:28:19 localhost systemd[1]: Starting libvirt proxy daemon admin socket... Nov 28 04:28:19 localhost systemd[1]: Starting libvirt proxy daemon read-only socket... Nov 28 04:28:19 localhost systemd[1]: Listening on libvirt proxy daemon admin socket. Nov 28 04:28:19 localhost systemd[1]: Listening on libvirt proxy daemon read-only socket. Nov 28 04:28:19 localhost systemd[1]: Started libvirt proxy daemon. Nov 28 04:28:19 localhost setroubleshoot[201129]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. 
For complete SELinux messages run: sealert -l 7622d5e9-f3a5-42df-957c-0a069946da20 Nov 28 04:28:19 localhost setroubleshoot[201129]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012***** Plugin dac_override (91.4 confidence) suggests **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012***** Plugin catchall (9.59 confidence) suggests **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012 Nov 28 04:28:19 localhost setroubleshoot[201129]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 7622d5e9-f3a5-42df-957c-0a069946da20 Nov 28 04:28:19 localhost setroubleshoot[201129]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012***** Plugin dac_override (91.4 confidence) suggests **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. 
Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012***** Plugin catchall (9.59 confidence) suggests **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012 Nov 28 04:28:19 localhost python3.9[201448]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 28 04:28:19 localhost systemd[1]: Reloading. Nov 28 04:28:20 localhost systemd-rc-local-generator[201469]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:28:20 localhost systemd-sysv-generator[201474]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:28:20 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:20 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:20 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:20 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:20 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Nov 28 04:28:20 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:20 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:20 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:20 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:20 localhost systemd[1]: Listening on libvirt locking daemon socket. Nov 28 04:28:20 localhost systemd[1]: Starting libvirt QEMU daemon socket... Nov 28 04:28:20 localhost systemd[1]: Listening on libvirt QEMU daemon socket. Nov 28 04:28:20 localhost systemd[1]: Starting libvirt QEMU daemon admin socket... Nov 28 04:28:20 localhost systemd[1]: Starting libvirt QEMU daemon read-only socket... Nov 28 04:28:20 localhost systemd[1]: Listening on libvirt QEMU daemon admin socket. Nov 28 04:28:20 localhost systemd[1]: Listening on libvirt QEMU daemon read-only socket. Nov 28 04:28:20 localhost systemd[1]: Started libvirt QEMU daemon. Nov 28 04:28:21 localhost python3.9[201629]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 28 04:28:21 localhost systemd[1]: Reloading. Nov 28 04:28:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48423 DF PROTO=TCP SPT=44766 DPT=9105 SEQ=3785523750 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1975420000000001030307) Nov 28 04:28:21 localhost systemd-rc-local-generator[201658]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 28 04:28:21 localhost systemd-sysv-generator[201662]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:28:21 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:21 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:21 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:21 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:28:21 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:21 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:21 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:21 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:21 localhost systemd[1]: Starting libvirt secret daemon socket... Nov 28 04:28:21 localhost systemd[1]: Listening on libvirt secret daemon socket. Nov 28 04:28:21 localhost systemd[1]: Starting libvirt secret daemon admin socket... Nov 28 04:28:21 localhost systemd[1]: Starting libvirt secret daemon read-only socket... 
Nov 28 04:28:21 localhost systemd[1]: Listening on libvirt secret daemon admin socket. Nov 28 04:28:21 localhost systemd[1]: Listening on libvirt secret daemon read-only socket. Nov 28 04:28:21 localhost systemd[1]: Started libvirt secret daemon. Nov 28 04:28:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:28:22 localhost podman[201719]: 2025-11-28 09:28:22.851108702 +0000 UTC m=+0.086672131 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0) Nov 28 04:28:22 localhost podman[201719]: 2025-11-28 09:28:22.88573679 +0000 UTC m=+0.121300279 container exec_died 
9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:28:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:28:22 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. Nov 28 04:28:22 localhost systemd[1]: tmp-crun.MtYW9m.mount: Deactivated successfully. 
Nov 28 04:28:22 localhost podman[201744]: 2025-11-28 09:28:22.994395968 +0000 UTC m=+0.085211017 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125) Nov 28 04:28:23 localhost podman[201744]: 2025-11-28 09:28:23.024468594 +0000 UTC 
m=+0.115283633 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:28:23 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 04:28:23 localhost python3.9[201854]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:28:24 localhost python3.9[201964]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Nov 28 04:28:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48424 DF PROTO=TCP SPT=44766 DPT=9105 SEQ=3785523750 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1985030000000001030307) Nov 28 04:28:25 localhost python3.9[202074]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;#012echo ceph#012awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:28:26 localhost python3.9[202186]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None 
limit=None Nov 28 04:28:27 localhost python3.9[202294]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:28:28 localhost python3.9[202380]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322107.030235-3170-23267115983596/.source.xml follow=False _original_basename=secret.xml.j2 checksum=817431989b0a3ade349fa0105099056ad78b021d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:28:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17044 DF PROTO=TCP SPT=47494 DPT=9101 SEQ=3657177445 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB19903A0000000001030307) Nov 28 04:28:28 localhost python3.9[202490]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine 2c5417c9-00eb-57d5-a565-ddecbc7995c1#012virsh secret-define --file /tmp/secret.xml#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:28:29 localhost systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service: Deactivated successfully. Nov 28 04:28:29 localhost systemd[1]: setroubleshootd.service: Deactivated successfully. 
Nov 28 04:28:30 localhost python3.9[202611]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:28:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17046 DF PROTO=TCP SPT=47494 DPT=9101 SEQ=3657177445 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB199C420000000001030307) Nov 28 04:28:32 localhost python3.9[202948]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:28:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25871 DF PROTO=TCP SPT=52464 DPT=9102 SEQ=3433454233 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB19A5420000000001030307) Nov 28 04:28:33 localhost python3.9[203058]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:28:34 localhost python3.9[203146]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 
src=/home/zuul/.ansible/tmp/ansible-tmp-1764322113.0027072-3335-188960797998421/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=dc5ee7162311c27a6084cbee4052b901d56cb1ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:28:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 28 04:28:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6000.1 total, 600.0 interval#012Cumulative writes: 4899 writes, 22K keys, 4899 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4899 writes, 608 syncs, 8.06 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.005 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.005 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) 
Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.005 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55ff50e0b610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) 
Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_sl Nov 28 04:28:34 localhost python3.9[203256]: ansible-ansible.builtin.file Invoked with 
group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:28:36 localhost sshd[203367]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:28:36 localhost python3.9[203366]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:28:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57212 DF PROTO=TCP SPT=33054 DPT=9100 SEQ=2612205495 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB19B0C30000000001030307) Nov 28 04:28:36 localhost python3.9[203425]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:28:37 localhost python3.9[203535]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:28:38 localhost python3.9[203592]: ansible-ansible.legacy.file Invoked with mode=0644 
dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.i0b36f21 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:28:38 localhost python3.9[203702]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:28:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1160 DF PROTO=TCP SPT=37064 DPT=9100 SEQ=1146749385 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB19BB820000000001030307) Nov 28 04:28:39 localhost python3.9[203795]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:28:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 28 04:28:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6000.3 total, 600.0 interval#012Cumulative writes: 5616 writes, 25K keys, 5616 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5616 writes, 758 syncs, 7.41 writes per sync, written: 0.02 GB, 0.00 
MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.008 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.008 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.1 0.01 0.00 1 0.008 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.3 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval 
compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.3 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): 
cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x558439149610#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.3 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_sl Nov 28 04:28:40 localhost python3.9[203936]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:28:41 localhost python3[204065]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall Nov 28 04:28:41 localhost python3.9[204175]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:28:42 localhost python3.9[204232]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:28:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57214 DF PROTO=TCP SPT=33054 DPT=9100 SEQ=2612205495 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB19C8830000000001030307) Nov 28 04:28:43 localhost python3.9[204342]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:28:43 localhost python3.9[204399]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:28:44 localhost python3.9[204509]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:28:44 localhost python3.9[204566]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None 
setype=None attributes=None Nov 28 04:28:45 localhost python3.9[204676]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:28:45 localhost python3.9[204733]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:28:46 localhost python3.9[204843]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:28:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24756 DF PROTO=TCP SPT=41768 DPT=9882 SEQ=3860430018 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB19DB020000000001030307) Nov 28 04:28:47 localhost python3.9[204933]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764322126.3687296-3710-113660550658151/.source.nft follow=False _original_basename=ruleset.j2 checksum=e2e2635f27347d386f310e86d2b40c40289835bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:28:48 localhost python3.9[205043]: ansible-ansible.builtin.file Invoked with group=root 
mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:28:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12056 DF PROTO=TCP SPT=60146 DPT=9105 SEQ=2127023387 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB19E2830000000001030307) Nov 28 04:28:49 localhost python3.9[205153]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:28:50 localhost python3.9[205266]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:28:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:28:50.781 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by 
"neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:28:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:28:50.782 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:28:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:28:50.783 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:28:51 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12057 DF PROTO=TCP SPT=60146 DPT=9105 SEQ=2127023387 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB19EA820000000001030307) Nov 28 04:28:51 localhost python3.9[205376]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:28:52 localhost python3.9[205487]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:28:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. 
Nov 28 04:28:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:28:53 localhost systemd[1]: tmp-crun.QgekrN.mount: Deactivated successfully. Nov 28 04:28:53 localhost podman[205600]: 2025-11-28 09:28:53.092697346 +0000 UTC m=+0.100160106 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:28:53 localhost podman[205600]: 2025-11-28 09:28:53.137867976 +0000 UTC m=+0.145330716 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, 
config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 28 04:28:53 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 04:28:53 localhost python3.9[205599]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:28:53 localhost podman[205616]: 2025-11-28 09:28:53.232412064 +0000 UTC m=+0.136067204 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 28 04:28:53 localhost podman[205616]: 2025-11-28 09:28:53.26249883 +0000 UTC m=+0.166153960 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, 
org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125) Nov 28 04:28:53 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:28:53 localhost python3.9[205755]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:28:54 localhost python3.9[205865]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:28:55 localhost python3.9[205953]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322134.1404881-3926-157309330453287/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:28:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12058 DF PROTO=TCP SPT=60146 DPT=9105 SEQ=2127023387 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080ACB19FA430000000001030307) Nov 28 04:28:56 localhost python3.9[206063]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:28:56 localhost python3.9[206151]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322135.803836-3971-237242952264560/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:28:57 localhost python3.9[206261]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:28:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46199 DF PROTO=TCP SPT=52532 DPT=9101 SEQ=2886924465 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1A05690000000001030307) Nov 28 04:28:58 localhost python3.9[206349]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322137.0955102-4016-177001413400419/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None 
owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:28:58 localhost python3.9[206459]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:28:58 localhost systemd[1]: Reloading. Nov 28 04:28:58 localhost systemd-sysv-generator[206485]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:28:59 localhost systemd-rc-local-generator[206480]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:28:59 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:59 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:59 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:59 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:59 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 28 04:28:59 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:59 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:59 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:59 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:28:59 localhost systemd[1]: Reached target edpm_libvirt.target. Nov 28 04:29:00 localhost python3.9[206608]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None Nov 28 04:29:00 localhost systemd[1]: Reloading. Nov 28 04:29:00 localhost systemd-sysv-generator[206636]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:29:00 localhost systemd-rc-local-generator[206632]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 28 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:29:01 localhost systemd[1]: Reloading. Nov 28 04:29:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46201 DF PROTO=TCP SPT=52532 DPT=9101 SEQ=2886924465 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1A11820000000001030307) Nov 28 04:29:01 localhost systemd-sysv-generator[206676]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 28 04:29:01 localhost systemd-rc-local-generator[206670]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:29:01 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:29:01 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:29:01 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:29:01 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:29:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:29:01 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:29:01 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:29:01 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:29:01 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:29:02 localhost systemd[1]: session-52.scope: Deactivated successfully. Nov 28 04:29:02 localhost systemd[1]: session-52.scope: Consumed 3min 37.274s CPU time. Nov 28 04:29:02 localhost systemd-logind[764]: Session 52 logged out. Waiting for processes to exit. Nov 28 04:29:02 localhost systemd-logind[764]: Removed session 52. 
Nov 28 04:29:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32858 DF PROTO=TCP SPT=37284 DPT=9102 SEQ=3188745367 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1A1A420000000001030307) Nov 28 04:29:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65121 DF PROTO=TCP SPT=50028 DPT=9102 SEQ=2430455319 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1A25820000000001030307) Nov 28 04:29:08 localhost sshd[206699]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:29:08 localhost systemd-logind[764]: New session 53 of user zuul. Nov 28 04:29:08 localhost systemd[1]: Started Session 53 of User zuul. Nov 28 04:29:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45373 DF PROTO=TCP SPT=43214 DPT=9100 SEQ=785439338 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1A31820000000001030307) Nov 28 04:29:09 localhost python3.9[206810]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 28 04:29:11 localhost python3.9[206922]: ansible-ansible.builtin.service_facts Invoked Nov 28 04:29:11 localhost network[206939]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 28 04:29:11 localhost network[206940]: 'network-scripts' will be removed from distribution in near future. Nov 28 04:29:11 localhost network[206941]: It is advised to switch to 'NetworkManager' instead for network management. 
Nov 28 04:29:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14522 DF PROTO=TCP SPT=49530 DPT=9100 SEQ=3191443101 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1A3DC20000000001030307) Nov 28 04:29:12 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:29:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7421 DF PROTO=TCP SPT=46362 DPT=9882 SEQ=2306459442 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1A50430000000001030307) Nov 28 04:29:17 localhost python3.9[207173]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 28 04:29:18 localhost python3.9[207236]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 28 04:29:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38734 DF PROTO=TCP SPT=50730 DPT=9105 SEQ=3294590754 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1A57C20000000001030307) 
Nov 28 04:29:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38735 DF PROTO=TCP SPT=50730 DPT=9105 SEQ=3294590754 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1A5FC20000000001030307) Nov 28 04:29:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:29:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:29:23 localhost systemd[1]: tmp-crun.KrZMuq.mount: Deactivated successfully. Nov 28 04:29:23 localhost podman[207239]: 2025-11-28 09:29:23.858262462 +0000 UTC m=+0.096063149 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2) Nov 28 04:29:23 localhost podman[207240]: 2025-11-28 09:29:23.90119606 +0000 UTC m=+0.138118929 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent) Nov 28 04:29:23 localhost podman[207240]: 2025-11-28 09:29:23.910361777 +0000 UTC m=+0.147284676 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:29:23 localhost podman[207239]: 2025-11-28 09:29:23.920060402 +0000 UTC m=+0.157861119 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller) Nov 28 04:29:23 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:29:23 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 04:29:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38736 DF PROTO=TCP SPT=50730 DPT=9105 SEQ=3294590754 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1A6F820000000001030307) Nov 28 04:29:26 localhost python3.9[207396]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:29:27 localhost python3.9[207508]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi mode=preserve remote_src=True src=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi/ backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:29:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48345 DF PROTO=TCP SPT=41214 DPT=9101 SEQ=1171061035 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1A7A9A0000000001030307) Nov 28 04:29:28 localhost python3.9[207618]: ansible-ansible.legacy.command Invoked with _raw_params=mv "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi" "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi.adopted"#012 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:29:29 localhost python3.9[207729]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True 
stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:29:29 localhost python3.9[207840]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -rF /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:29:30 localhost python3.9[207951]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:29:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48347 DF PROTO=TCP SPT=41214 DPT=9101 SEQ=1171061035 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1A86C20000000001030307) Nov 28 04:29:31 localhost python3.9[208063]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:29:33 localhost python3.9[208173]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:29:33 localhost systemd[1]: Listening on Open-iSCSI iscsid Socket. 
Nov 28 04:29:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56791 DF PROTO=TCP SPT=35450 DPT=9102 SEQ=980278398 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1A8F820000000001030307) Nov 28 04:29:34 localhost python3.9[208287]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:29:36 localhost systemd[1]: Reloading. Nov 28 04:29:36 localhost systemd-rc-local-generator[208310]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:29:36 localhost systemd-sysv-generator[208318]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:29:36 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:29:36 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:29:36 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:29:36 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:29:36 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 28 04:29:36 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 04:29:36 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:29:36 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:29:36 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:29:36 localhost systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 28 04:29:36 localhost systemd[1]: Starting Open-iSCSI...
Nov 28 04:29:36 localhost iscsid[208328]: iscsid: can't open InitiatorName configuration file /etc/iscsi/initiatorname.iscsi
Nov 28 04:29:36 localhost iscsid[208328]: iscsid: Warning: InitiatorName file /etc/iscsi/initiatorname.iscsi does not exist or does not contain a properly formatted InitiatorName. If using software iscsi (iscsi_tcp or ib_iser) or partial offload (bnx2i or cxgbi iscsi), you may not be able to log into or discover targets. Please create a file /etc/iscsi/initiatorname.iscsi that contains a sting with the format: InitiatorName=iqn.yyyy-mm.[:identifier].
Nov 28 04:29:36 localhost iscsid[208328]: Example: InitiatorName=iqn.2001-04.com.redhat:fc6.
Nov 28 04:29:36 localhost iscsid[208328]: If using hardware iscsi like qla4xxx this message can be ignored.
Nov 28 04:29:36 localhost iscsid[208328]: iscsid: can't open InitiatorAlias configuration file /etc/iscsi/initiatorname.iscsi
Nov 28 04:29:36 localhost iscsid[208328]: iscsid: can't open iscsid.safe_logout configuration file /etc/iscsi/iscsid.conf
Nov 28 04:29:36 localhost iscsid[208328]: iscsid: can't open iscsid.ipc_auth_uid configuration file /etc/iscsi/iscsid.conf
Nov 28 04:29:36 localhost systemd[1]: Started Open-iSCSI.
Nov 28 04:29:36 localhost systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Nov 28 04:29:36 localhost systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Nov 28 04:29:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54530 DF PROTO=TCP SPT=43538 DPT=9100 SEQ=1618416508 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1A9B020000000001030307)
Nov 28 04:29:37 localhost python3.9[208439]: ansible-ansible.builtin.service_facts Invoked
Nov 28 04:29:37 localhost network[208456]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 04:29:37 localhost network[208457]: 'network-scripts' will be removed from distribution in near future.
Nov 28 04:29:37 localhost network[208458]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 04:29:38 localhost systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Nov 28 04:29:38 localhost systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Nov 28 04:29:38 localhost systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@2.service.
Nov 28 04:29:38 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 04:29:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56793 DF PROTO=TCP SPT=35450 DPT=9102 SEQ=980278398 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1AA7420000000001030307)
Nov 28 04:29:39 localhost setroubleshoot[208474]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l 0a93104a-0288-491f-bdb6-6915108f9856
Nov 28 04:29:39 localhost setroubleshoot[208474]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.#012#012***** Plugin catchall (100. confidence) suggests **************************#012#012If you believe that iscsid should be allowed search access on the iscsi directory by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid#012# semodule -X 300 -i my-iscsid.pp#012
Nov 28 04:29:39 localhost setroubleshoot[208474]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l 0a93104a-0288-491f-bdb6-6915108f9856
Nov 28 04:29:39 localhost setroubleshoot[208474]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.#012#012***** Plugin catchall (100. confidence) suggests **************************#012#012If you believe that iscsid should be allowed search access on the iscsi directory by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid#012# semodule -X 300 -i my-iscsid.pp#012
Nov 28 04:29:39 localhost setroubleshoot[208474]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l 0a93104a-0288-491f-bdb6-6915108f9856
Nov 28 04:29:39 localhost setroubleshoot[208474]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.#012#012***** Plugin catchall (100. confidence) suggests **************************#012#012If you believe that iscsid should be allowed search access on the iscsi directory by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid#012# semodule -X 300 -i my-iscsid.pp#012
Nov 28 04:29:39 localhost setroubleshoot[208474]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l 0a93104a-0288-491f-bdb6-6915108f9856
Nov 28 04:29:39 localhost setroubleshoot[208474]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.#012#012***** Plugin catchall (100. confidence) suggests **************************#012#012If you believe that iscsid should be allowed search access on the iscsi directory by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid#012# semodule -X 300 -i my-iscsid.pp#012
Nov 28 04:29:39 localhost setroubleshoot[208474]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l 0a93104a-0288-491f-bdb6-6915108f9856
Nov 28 04:29:39 localhost setroubleshoot[208474]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.#012#012***** Plugin catchall (100. confidence) suggests **************************#012#012If you believe that iscsid should be allowed search access on the iscsi directory by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid#012# semodule -X 300 -i my-iscsid.pp#012
Nov 28 04:29:39 localhost setroubleshoot[208474]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l 0a93104a-0288-491f-bdb6-6915108f9856
Nov 28 04:29:39 localhost setroubleshoot[208474]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.#012#012***** Plugin catchall (100. confidence) suggests **************************#012#012If you believe that iscsid should be allowed search access on the iscsi directory by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid#012# semodule -X 300 -i my-iscsid.pp#012
Nov 28 04:29:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7423 DF PROTO=TCP SPT=46362 DPT=9882 SEQ=2306459442 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1AB1820000000001030307)
Nov 28 04:29:43 localhost python3.9[208793]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 04:29:44 localhost python3.9[208903]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Nov 28 04:29:45 localhost python3.9[209017]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:29:45 localhost python3.9[209105]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322184.7983687-455-247933568056599/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:29:46 localhost python3.9[209215]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:29:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11199 DF PROTO=TCP SPT=45946 DPT=9882 SEQ=968156560 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1AC5430000000001030307)
Nov 28 04:29:47 localhost python3.9[209325]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 04:29:47 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 28 04:29:47 localhost systemd[1]: Stopped Load Kernel Modules.
Nov 28 04:29:47 localhost systemd[1]: Stopping Load Kernel Modules...
Nov 28 04:29:47 localhost systemd[1]: Starting Load Kernel Modules...
Nov 28 04:29:47 localhost systemd-modules-load[209329]: Module 'msr' is built in
Nov 28 04:29:47 localhost systemd[1]: Finished Load Kernel Modules.
Nov 28 04:29:49 localhost python3.9[209439]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 04:29:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32545 DF PROTO=TCP SPT=38806 DPT=9105 SEQ=4287295748 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1ACD020000000001030307)
Nov 28 04:29:49 localhost systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@2.service: Deactivated successfully.
Nov 28 04:29:49 localhost systemd[1]: setroubleshootd.service: Deactivated successfully.
Nov 28 04:29:49 localhost python3.9[209549]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 04:29:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:29:50.783 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 04:29:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:29:50.784 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 04:29:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:29:50.785 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 04:29:51 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32546 DF PROTO=TCP SPT=38806 DPT=9105 SEQ=4287295748 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1AD5020000000001030307)
Nov 28 04:29:51 localhost python3.9[209659]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 04:29:52 localhost python3.9[209769]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:29:52 localhost python3.9[209857]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322191.888861-629-65031329828859/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:29:53 localhost python3.9[209967]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 04:29:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 04:29:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 04:29:54 localhost podman[210080]: 2025-11-28 09:29:54.294684594 +0000 UTC m=+0.088431778 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 04:29:54 localhost podman[210080]: 2025-11-28 09:29:54.329564112 +0000 UTC m=+0.123311306 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true)
Nov 28 04:29:54 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 04:29:54 localhost python3.9[210078]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:29:54 localhost podman[210079]: 2025-11-28 09:29:54.339420316 +0000 UTC m=+0.133671958 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 04:29:54 localhost podman[210079]: 2025-11-28 09:29:54.426454035 +0000 UTC m=+0.220705647 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 04:29:54 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 04:29:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32547 DF PROTO=TCP SPT=38806 DPT=9105 SEQ=4287295748 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1AE4C30000000001030307)
Nov 28 04:29:55 localhost python3.9[210231]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:29:56 localhost python3.9[210341]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:29:56 localhost python3.9[210451]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:29:57 localhost python3.9[210561]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:29:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61251 DF PROTO=TCP SPT=52818 DPT=9101 SEQ=222613300 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1AEFCA0000000001030307)
Nov 28 04:29:58 localhost python3.9[210671]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:29:58 localhost python3.9[210781]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:29:59 localhost python3.9[210891]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 04:30:00 localhost python3.9[211003]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:30:01 localhost python3.9[211113]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 04:30:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61253 DF PROTO=TCP SPT=52818 DPT=9101 SEQ=222613300 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1AFBC30000000001030307)
Nov 28 04:30:01 localhost python3.9[211223]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:30:03 localhost python3.9[211280]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 04:30:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14122 DF PROTO=TCP SPT=46484 DPT=9102 SEQ=2017328411 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1B04C30000000001030307)
Nov 28 04:30:03 localhost python3.9[211390]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:30:04 localhost python3.9[211447]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 04:30:06 localhost python3.9[211557]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:30:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32863 DF PROTO=TCP SPT=37284 DPT=9102 SEQ=3188745367 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1B0F820000000001030307)
Nov 28 04:30:06 localhost python3.9[211667]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:30:07 localhost python3.9[211724]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:30:07 localhost python3.9[211834]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:30:08 localhost python3.9[211891]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:30:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14525 DF PROTO=TCP SPT=49530 DPT=9100 SEQ=3191443101 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1B1B820000000001030307)
Nov 28 04:30:09 localhost python3.9[212001]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 04:30:09 localhost systemd[1]: Reloading.
Nov 28 04:30:09 localhost systemd-rc-local-generator[212024]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 04:30:09 localhost systemd-sysv-generator[212029]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 04:30:09 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:30:09 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 04:30:09 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:30:09 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:30:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 04:30:09 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:09 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:09 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:09 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:10 localhost python3.9[212149]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:30:10 localhost python3.9[212206]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:30:11 localhost python3.9[212316]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:30:12 localhost python3.9[212373]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:30:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40315 DF PROTO=TCP SPT=42796 DPT=9100 SEQ=2395171770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1B28020000000001030307) Nov 28 04:30:12 localhost python3.9[212483]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:30:12 localhost systemd[1]: Reloading. Nov 28 04:30:13 localhost systemd-rc-local-generator[212504]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:30:13 localhost systemd-sysv-generator[212509]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:30:13 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:13 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:13 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:13 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:13 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 28 04:30:13 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:13 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:13 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:13 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:13 localhost systemd[1]: Starting Create netns directory... Nov 28 04:30:13 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Nov 28 04:30:13 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Nov 28 04:30:13 localhost systemd[1]: Finished Create netns directory. Nov 28 04:30:14 localhost python3.9[212634]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:30:14 localhost python3.9[212744]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:30:15 localhost python3.9[212832]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322214.5086439-1250-82951937879058/.source _original_basename=healthcheck follow=False 
checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 28 04:30:16 localhost python3.9[212942]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:30:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59288 DF PROTO=TCP SPT=48552 DPT=9882 SEQ=2146318426 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1B3A830000000001030307) Nov 28 04:30:17 localhost python3.9[213052]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:30:18 localhost systemd[1]: virtnodedevd.service: Deactivated successfully. 
Nov 28 04:30:19 localhost python3.9[213141]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322217.1988795-1325-97036648301814/.source.json _original_basename=._qlnhfg7 follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:30:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60076 DF PROTO=TCP SPT=53610 DPT=9105 SEQ=2367419891 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1B42020000000001030307) Nov 28 04:30:19 localhost systemd[1]: virtproxyd.service: Deactivated successfully. Nov 28 04:30:19 localhost python3.9[213252]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:30:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60077 DF PROTO=TCP SPT=53610 DPT=9105 SEQ=2367419891 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1B4A020000000001030307) Nov 28 04:30:22 localhost python3.9[213560]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json 
debug=False Nov 28 04:30:23 localhost python3.9[213670]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 28 04:30:24 localhost python3.9[213780]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Nov 28 04:30:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:30:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:30:24 localhost podman[213809]: 2025-11-28 09:30:24.846507941 +0000 UTC m=+0.083662652 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller) Nov 28 04:30:24 localhost systemd[1]: tmp-crun.dgdKyr.mount: Deactivated successfully. Nov 28 04:30:24 localhost podman[213809]: 2025-11-28 09:30:24.909594024 +0000 UTC m=+0.146748735 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:30:24 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 04:30:24 localhost podman[213812]: 2025-11-28 09:30:24.91003562 +0000 UTC m=+0.146592240 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3) Nov 28 04:30:24 localhost podman[213812]: 2025-11-28 09:30:24.991826535 +0000 UTC 
m=+0.228383125 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible) Nov 28 04:30:25 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 04:30:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60078 DF PROTO=TCP SPT=53610 DPT=9105 SEQ=2367419891 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1B59C20000000001030307) Nov 28 04:30:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38726 DF PROTO=TCP SPT=58704 DPT=9101 SEQ=4233587836 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1B64F90000000001030307) Nov 28 04:30:28 localhost python3[213960]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False Nov 28 04:30:30 localhost podman[213975]: 2025-11-28 09:30:28.903235733 +0000 UTC m=+0.045067445 image pull quay.io/podified-antelope-centos9/openstack-multipathd:current-podified Nov 28 04:30:31 localhost podman[214022]: Nov 28 04:30:31 localhost podman[214022]: 2025-11-28 09:30:31.030770503 +0000 UTC m=+0.062994942 container create 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:30:31 localhost podman[214022]: 2025-11-28 09:30:31.002607219 +0000 UTC m=+0.034831708 image pull quay.io/podified-antelope-centos9/openstack-multipathd:current-podified Nov 28 04:30:31 localhost python3[213960]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified Nov 28 04:30:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38728 DF PROTO=TCP SPT=58704 DPT=9101 SEQ=4233587836 ACK=0 WINDOW=32640 
RES=0x00 SYN URGP=0 OPT (020405500402080ACB1B71020000000001030307) Nov 28 04:30:31 localhost python3.9[214170]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:30:32 localhost systemd[1]: virtsecretd.service: Deactivated successfully. Nov 28 04:30:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60079 DF PROTO=TCP SPT=53610 DPT=9105 SEQ=2367419891 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1B79820000000001030307) Nov 28 04:30:33 localhost python3.9[214283]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:30:35 localhost python3.9[214338]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:30:36 localhost python3.9[214447]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764322235.4866197-1589-1687990589434/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:30:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c 
MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15130 DF PROTO=TCP SPT=57402 DPT=9100 SEQ=617379117 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1B85820000000001030307) Nov 28 04:30:36 localhost python3.9[214502]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 28 04:30:36 localhost systemd[1]: Reloading. Nov 28 04:30:36 localhost systemd-sysv-generator[214528]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:30:36 localhost systemd-rc-local-generator[214524]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:30:36 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:36 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:36 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:36 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:36 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 28 04:30:36 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:36 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:36 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:36 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:37 localhost python3.9[214592]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:30:37 localhost systemd[1]: Reloading. Nov 28 04:30:37 localhost systemd-rc-local-generator[214619]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:30:37 localhost systemd-sysv-generator[214622]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:30:37 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:37 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:37 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:37 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:37 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Nov 28 04:30:37 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:37 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:37 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:37 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:37 localhost systemd[1]: Starting multipathd container... Nov 28 04:30:38 localhost systemd[1]: tmp-crun.5NbCSk.mount: Deactivated successfully. Nov 28 04:30:38 localhost systemd[1]: Started libcrun container. Nov 28 04:30:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b94389f32798135ccab4ef5d015a3beb8a228cdedf149e6a2665616bd7ec48ac/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Nov 28 04:30:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b94389f32798135ccab4ef5d015a3beb8a228cdedf149e6a2665616bd7ec48ac/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Nov 28 04:30:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. 
Nov 28 04:30:38 localhost podman[214633]: 2025-11-28 09:30:38.10729225 +0000 UTC m=+0.159318344 container init 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3) Nov 28 04:30:38 localhost multipathd[214646]: + sudo -E kolla_set_configs Nov 28 04:30:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 04:30:38 localhost podman[214633]: 2025-11-28 09:30:38.146183538 +0000 UTC m=+0.198209622 container start 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 28 04:30:38 localhost podman[214633]: multipathd Nov 28 04:30:38 localhost systemd[1]: Started multipathd container. 
Nov 28 04:30:38 localhost podman[214653]: 2025-11-28 09:30:38.21439741 +0000 UTC m=+0.063170876 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:30:38 localhost multipathd[214646]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Nov 28 04:30:38 localhost 
multipathd[214646]: INFO:__main__:Validating config file Nov 28 04:30:38 localhost multipathd[214646]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Nov 28 04:30:38 localhost multipathd[214646]: INFO:__main__:Writing out command to execute Nov 28 04:30:38 localhost multipathd[214646]: ++ cat /run_command Nov 28 04:30:38 localhost multipathd[214646]: + CMD='/usr/sbin/multipathd -d' Nov 28 04:30:38 localhost multipathd[214646]: + ARGS= Nov 28 04:30:38 localhost multipathd[214646]: + sudo kolla_copy_cacerts Nov 28 04:30:38 localhost podman[214653]: 2025-11-28 09:30:38.240432259 +0000 UTC m=+0.089205795 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd) Nov 28 04:30:38 localhost podman[214653]: unhealthy Nov 28 04:30:38 localhost multipathd[214646]: + [[ ! -n '' ]] Nov 28 04:30:38 localhost multipathd[214646]: + . kolla_extend_start Nov 28 04:30:38 localhost multipathd[214646]: Running command: '/usr/sbin/multipathd -d' Nov 28 04:30:38 localhost multipathd[214646]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\''' Nov 28 04:30:38 localhost multipathd[214646]: + umask 0022 Nov 28 04:30:38 localhost multipathd[214646]: + exec /usr/sbin/multipathd -d Nov 28 04:30:38 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:30:38 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Failed with result 'exit-code'. Nov 28 04:30:38 localhost multipathd[214646]: 10072.434832 | --------start up-------- Nov 28 04:30:38 localhost systemd-journald[47227]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation. Nov 28 04:30:38 localhost systemd-journald[47227]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating. Nov 28 04:30:38 localhost rsyslogd[759]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 28 04:30:38 localhost multipathd[214646]: 10072.434902 | read /etc/multipath.conf Nov 28 04:30:38 localhost multipathd[214646]: 10072.438780 | path checkers start up Nov 28 04:30:38 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 28 04:30:38 localhost python3.9[214795]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:30:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54535 DF PROTO=TCP SPT=43538 DPT=9100 SEQ=1618416508 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1B91820000000001030307) Nov 28 04:30:40 localhost python3.9[214907]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:30:40 localhost python3.9[215029]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 28 04:30:41 localhost systemd[1]: Stopping multipathd container... Nov 28 04:30:42 localhost multipathd[214646]: 10076.243414 | exit (signal) Nov 28 04:30:42 localhost multipathd[214646]: 10076.243814 | --------shut down------- Nov 28 04:30:42 localhost systemd[1]: tmp-crun.OAqVUv.mount: Deactivated successfully. Nov 28 04:30:42 localhost systemd[1]: libpod-7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.scope: Deactivated successfully. 
Nov 28 04:30:42 localhost podman[215033]: 2025-11-28 09:30:42.102911839 +0000 UTC m=+0.103043870 container died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true) Nov 28 04:30:42 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.timer: Deactivated successfully. 
Nov 28 04:30:42 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 04:30:42 localhost podman[215033]: 2025-11-28 09:30:42.278739338 +0000 UTC m=+0.278871379 container cleanup 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 28 04:30:42 localhost podman[215033]: 
multipathd Nov 28 04:30:42 localhost podman[215097]: 2025-11-28 09:30:42.348232695 +0000 UTC m=+0.044890078 container cleanup 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible) Nov 28 04:30:42 localhost podman[215097]: multipathd Nov 28 04:30:42 localhost systemd[1]: edpm_multipathd.service: Deactivated successfully. 
Nov 28 04:30:42 localhost systemd[1]: Stopped multipathd container. Nov 28 04:30:42 localhost systemd[1]: Starting multipathd container... Nov 28 04:30:42 localhost systemd[1]: Started libcrun container. Nov 28 04:30:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b94389f32798135ccab4ef5d015a3beb8a228cdedf149e6a2665616bd7ec48ac/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Nov 28 04:30:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b94389f32798135ccab4ef5d015a3beb8a228cdedf149e6a2665616bd7ec48ac/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Nov 28 04:30:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15132 DF PROTO=TCP SPT=57402 DPT=9100 SEQ=617379117 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1B9D420000000001030307) Nov 28 04:30:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. 
Nov 28 04:30:42 localhost podman[215109]: 2025-11-28 09:30:42.484801293 +0000 UTC m=+0.109427212 container init 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:30:42 localhost multipathd[215121]: + sudo -E kolla_set_configs Nov 28 04:30:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 04:30:42 localhost podman[215109]: 2025-11-28 09:30:42.524621344 +0000 UTC m=+0.149247263 container start 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 04:30:42 localhost podman[215109]: multipathd Nov 28 04:30:42 localhost systemd[1]: Started multipathd container. 
Nov 28 04:30:42 localhost multipathd[215121]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Nov 28 04:30:42 localhost multipathd[215121]: INFO:__main__:Validating config file Nov 28 04:30:42 localhost multipathd[215121]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Nov 28 04:30:42 localhost multipathd[215121]: INFO:__main__:Writing out command to execute Nov 28 04:30:42 localhost multipathd[215121]: ++ cat /run_command Nov 28 04:30:42 localhost multipathd[215121]: + CMD='/usr/sbin/multipathd -d' Nov 28 04:30:42 localhost multipathd[215121]: + ARGS= Nov 28 04:30:42 localhost multipathd[215121]: + sudo kolla_copy_cacerts Nov 28 04:30:42 localhost podman[215139]: 2025-11-28 09:30:42.587549381 +0000 UTC m=+0.061958234 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible) Nov 28 04:30:42 localhost podman[215139]: 2025-11-28 09:30:42.595308002 +0000 UTC m=+0.069716855 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Nov 28 04:30:42 localhost multipathd[215121]: + [[ ! -n '' ]] Nov 28 04:30:42 localhost multipathd[215121]: + . kolla_extend_start Nov 28 04:30:42 localhost multipathd[215121]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\''' Nov 28 04:30:42 localhost multipathd[215121]: Running command: '/usr/sbin/multipathd -d' Nov 28 04:30:42 localhost multipathd[215121]: + umask 0022 Nov 28 04:30:42 localhost multipathd[215121]: + exec /usr/sbin/multipathd -d Nov 28 04:30:42 localhost podman[215139]: unhealthy Nov 28 04:30:42 localhost multipathd[215121]: 10076.779974 | --------start up-------- Nov 28 04:30:42 localhost multipathd[215121]: 10076.779994 | read /etc/multipath.conf Nov 28 04:30:42 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:30:42 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Failed with result 'exit-code'. 
Nov 28 04:30:42 localhost multipathd[215121]: 10076.783212 | path checkers start up Nov 28 04:30:44 localhost python3.9[215302]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:30:45 localhost python3.9[215412]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Nov 28 04:30:46 localhost python3.9[215540]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled Nov 28 04:30:46 localhost python3.9[215658]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:30:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34905 DF PROTO=TCP SPT=48764 DPT=9882 SEQ=1475787364 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1BAFC30000000001030307) Nov 28 04:30:47 localhost python3.9[215746]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322246.4281995-1829-121453552736592/.source.conf follow=False 
_original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:30:48 localhost python3.9[215856]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:30:48 localhost python3.9[215966]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 28 04:30:48 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully. Nov 28 04:30:48 localhost systemd[1]: Stopped Load Kernel Modules. Nov 28 04:30:48 localhost systemd[1]: Stopping Load Kernel Modules... Nov 28 04:30:48 localhost systemd[1]: Starting Load Kernel Modules... Nov 28 04:30:48 localhost systemd-modules-load[215970]: Module 'msr' is built in Nov 28 04:30:48 localhost systemd[1]: Finished Load Kernel Modules. 
Nov 28 04:30:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1827 DF PROTO=TCP SPT=44680 DPT=9105 SEQ=2909593796 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1BB7420000000001030307) Nov 28 04:30:50 localhost python3.9[216080]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 28 04:30:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:30:50.784 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:30:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:30:50.785 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:30:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:30:50.787 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:30:51 localhost kernel: DROPPING: IN=br-ex OUT= 
MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1828 DF PROTO=TCP SPT=44680 DPT=9105 SEQ=2909593796 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1BBF420000000001030307) Nov 28 04:30:54 localhost systemd[1]: Reloading. Nov 28 04:30:54 localhost systemd-sysv-generator[216115]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:30:54 localhost systemd-rc-local-generator[216112]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:30:54 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:54 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:54 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:54 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:30:54 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 28 04:30:54 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 04:30:54 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:30:54 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:30:54 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:30:54 localhost systemd[1]: Reloading.
Nov 28 04:30:54 localhost systemd-rc-local-generator[216149]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 04:30:54 localhost systemd-sysv-generator[216152]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 04:30:54 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:30:54 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 04:30:54 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:30:54 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:30:54 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 04:30:54 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 04:30:54 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:30:54 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:30:54 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:30:54 localhost systemd-logind[764]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 28 04:30:54 localhost systemd-logind[764]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 28 04:30:54 localhost lvm[216207]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 28 04:30:54 localhost lvm[216207]: VG ceph_vg0 finished
Nov 28 04:30:54 localhost lvm[216205]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Nov 28 04:30:54 localhost lvm[216205]: VG ceph_vg1 finished
Nov 28 04:30:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 04:30:54 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 04:30:54 localhost systemd[1]: Starting man-db-cache-update.service...
Nov 28 04:30:54 localhost systemd[1]: Reloading.
Nov 28 04:30:55 localhost podman[216223]: 2025-11-28 09:30:55.036196142 +0000 UTC m=+0.087990147 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 04:30:55 localhost systemd-rc-local-generator[216268]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 04:30:55 localhost systemd-sysv-generator[216271]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 04:30:55 localhost podman[216223]: 2025-11-28 09:30:55.113296888 +0000 UTC m=+0.165090893 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Nov 28 04:30:55 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:30:55 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 04:30:55 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:30:55 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:30:55 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 04:30:55 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 04:30:55 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:30:55 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:30:55 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:30:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1829 DF PROTO=TCP SPT=44680 DPT=9105 SEQ=2909593796 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1BCF020000000001030307)
Nov 28 04:30:55 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 04:30:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 04:30:55 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 04:30:55 localhost systemd[1]: tmp-crun.0XK5ma.mount: Deactivated successfully.
Nov 28 04:30:55 localhost podman[216608]: 2025-11-28 09:30:55.435101392 +0000 UTC m=+0.093733698 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Nov 28 04:30:55 localhost podman[216608]: 2025-11-28 09:30:55.46936172 +0000 UTC m=+0.127994026 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 04:30:55 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 04:30:56 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 04:30:56 localhost systemd[1]: Finished man-db-cache-update.service.
Nov 28 04:30:56 localhost systemd[1]: man-db-cache-update.service: Consumed 1.255s CPU time.
Nov 28 04:30:56 localhost systemd[1]: run-rbba2a76fc91d4edead9817eb37db3b7c.service: Deactivated successfully.
Nov 28 04:30:57 localhost python3.9[217538]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 04:30:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32477 DF PROTO=TCP SPT=49288 DPT=9101 SEQ=2341876394 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1BDA290000000001030307)
Nov 28 04:30:58 localhost python3.9[217652]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:30:59 localhost python3.9[217762]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 04:30:59 localhost systemd[1]: Reloading.
Nov 28 04:30:59 localhost systemd-sysv-generator[217790]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 04:30:59 localhost systemd-rc-local-generator[217784]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 04:30:59 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:30:59 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 04:30:59 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:30:59 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:30:59 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 04:30:59 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 04:30:59 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:30:59 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:30:59 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:31:00 localhost python3.9[217906]: ansible-ansible.builtin.service_facts Invoked
Nov 28 04:31:00 localhost network[217923]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 04:31:00 localhost network[217924]: 'network-scripts' will be removed from distribution in near future.
Nov 28 04:31:00 localhost network[217925]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 04:31:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32479 DF PROTO=TCP SPT=49288 DPT=9101 SEQ=2341876394 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1BE6420000000001030307)
Nov 28 04:31:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 04:31:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57399 DF PROTO=TCP SPT=41920 DPT=9102 SEQ=4180756437 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1BEF030000000001030307)
Nov 28 04:31:05 localhost python3.9[218160]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 04:31:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49676 DF PROTO=TCP SPT=38158 DPT=9100 SEQ=3893746115 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1BFAC20000000001030307)
Nov 28 04:31:06 localhost python3.9[218271]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 04:31:07 localhost python3.9[218382]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 04:31:07 localhost python3.9[218493]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 04:31:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40318 DF PROTO=TCP SPT=42796 DPT=9100 SEQ=2395171770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1C05830000000001030307)
Nov 28 04:31:09 localhost python3.9[218604]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 04:31:10 localhost python3.9[218715]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 04:31:10 localhost python3.9[218826]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 04:31:11 localhost python3.9[218937]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 04:31:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49678 DF PROTO=TCP SPT=38158 DPT=9100 SEQ=3893746115 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1C12820000000001030307)
Nov 28 04:31:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 04:31:12 localhost podman[218956]: 2025-11-28 09:31:12.859173046 +0000 UTC m=+0.092218969 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 04:31:12 localhost podman[218956]: 2025-11-28 09:31:12.875426782 +0000 UTC m=+0.108472685 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd)
Nov 28 04:31:12 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 04:31:14 localhost python3.9[219068]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:31:14 localhost python3.9[219178]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:31:15 localhost python3.9[219288]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:31:16 localhost python3.9[219398]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:31:16 localhost python3.9[219508]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:31:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10347 DF PROTO=TCP SPT=43626 DPT=9882 SEQ=2467435932 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1C25020000000001030307)
Nov 28 04:31:17 localhost python3.9[219618]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:31:18 localhost python3.9[219728]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:31:18 localhost python3.9[219838]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:31:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9103 DF PROTO=TCP SPT=33126 DPT=9105 SEQ=4190552098 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1C2C820000000001030307)
Nov 28 04:31:19 localhost python3.9[219948]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:31:20 localhost python3.9[220058]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:31:20 localhost python3.9[220168]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:31:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9104 DF PROTO=TCP SPT=33126 DPT=9105 SEQ=4190552098 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1C34820000000001030307)
Nov 28 04:31:21 localhost python3.9[220278]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:31:22 localhost python3.9[220388]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:31:23 localhost python3.9[220498]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:31:24 localhost python3.9[220608]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:31:24 localhost python3.9[220718]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:31:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9105 DF PROTO=TCP SPT=33126 DPT=9105 SEQ=4190552098 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1C44420000000001030307)
Nov 28 04:31:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 04:31:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 04:31:25 localhost podman[220828]: 2025-11-28 09:31:25.83062986 +0000 UTC m=+0.072655817 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 28 04:31:25 localhost podman[220828]: 2025-11-28 09:31:25.864401939 +0000 UTC m=+0.106427896 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, 
managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:31:25 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 04:31:25 localhost podman[220829]: 2025-11-28 09:31:25.950008084 +0000 UTC m=+0.184571724 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 28 04:31:25 localhost python3.9[220835]: ansible-ansible.legacy.command Invoked 
with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:31:25 localhost podman[220829]: 2025-11-28 09:31:25.97822858 +0000 UTC m=+0.212792290 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true) Nov 28 04:31:25 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:31:26 localhost python3.9[220980]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Nov 28 04:31:27 localhost python3.9[221090]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 28 04:31:27 localhost systemd[1]: Reloading. Nov 28 04:31:27 localhost systemd-rc-local-generator[221116]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:31:27 localhost systemd-sysv-generator[221119]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 28 04:31:28 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:31:28 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:31:28 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:31:28 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:31:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:31:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51913 DF PROTO=TCP SPT=52198 DPT=9101 SEQ=2425600492 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1C4F5A0000000001030307) Nov 28 04:31:28 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:31:28 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:31:28 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:31:28 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:31:28 localhost python3.9[221236]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 
04:31:29 localhost python3.9[221347]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:31:30 localhost python3.9[221458]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:31:30 localhost python3.9[221569]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:31:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51915 DF PROTO=TCP SPT=52198 DPT=9101 SEQ=2425600492 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1C5B420000000001030307) Nov 28 04:31:31 localhost python3.9[221680]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:31:31 localhost python3.9[221791]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None 
stdin=None Nov 28 04:31:32 localhost python3.9[221902]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:31:33 localhost python3.9[222013]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:31:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60885 DF PROTO=TCP SPT=53580 DPT=9102 SEQ=1058156247 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1C64430000000001030307) Nov 28 04:31:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30628 DF PROTO=TCP SPT=43092 DPT=9102 SEQ=1955098286 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1C6F830000000001030307) Nov 28 04:31:37 localhost python3.9[222124]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:31:38 localhost python3.9[222234]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers 
setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:31:39 localhost python3.9[222344]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:31:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15135 DF PROTO=TCP SPT=57402 DPT=9100 SEQ=617379117 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1C7B820000000001030307) Nov 28 04:31:39 localhost python3.9[222454]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:31:40 localhost python3.9[222564]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None 
serole=None selevel=None attributes=None Nov 28 04:31:41 localhost python3.9[222674]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:31:41 localhost python3.9[222784]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:31:42 localhost python3.9[222894]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Nov 28 04:31:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61555 DF PROTO=TCP SPT=50872 DPT=9100 SEQ=1856325133 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1C87830000000001030307) Nov 28 04:31:42 localhost python3.9[223004]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Nov 28 04:31:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 04:31:43 localhost systemd[1]: tmp-crun.pd6aJa.mount: Deactivated successfully. Nov 28 04:31:43 localhost podman[223115]: 2025-11-28 09:31:43.468747741 +0000 UTC m=+0.097850645 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', 
'/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:31:43 localhost podman[223115]: 2025-11-28 09:31:43.484366213 +0000 UTC m=+0.113469167 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, 
org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true) Nov 28 04:31:43 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. Nov 28 04:31:43 localhost python3.9[223114]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Nov 28 04:31:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53072 DF PROTO=TCP SPT=60710 DPT=9882 SEQ=2672581527 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1C9A030000000001030307) Nov 28 04:31:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35722 DF PROTO=TCP SPT=44990 DPT=9105 SEQ=3272907869 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1CA1C20000000001030307) Nov 28 04:31:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:31:50.785 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:31:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:31:50.786 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:31:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:31:50.787 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:31:51 localhost python3.9[223386]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None Nov 28 04:31:51 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35723 DF PROTO=TCP SPT=44990 DPT=9105 SEQ=3272907869 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1CA9C30000000001030307) Nov 28 04:31:51 localhost python3.9[223497]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None Nov 28 04:31:52 localhost python3.9[223613]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005538513.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None Nov 28 04:31:54 localhost sshd[223639]: main: sshd: ssh-rsa algorithm is disabled 
Nov 28 04:31:54 localhost systemd-logind[764]: New session 54 of user zuul. Nov 28 04:31:54 localhost systemd[1]: Started Session 54 of User zuul. Nov 28 04:31:54 localhost systemd[1]: session-54.scope: Deactivated successfully. Nov 28 04:31:54 localhost systemd-logind[764]: Session 54 logged out. Waiting for processes to exit. Nov 28 04:31:54 localhost systemd-logind[764]: Removed session 54. Nov 28 04:31:54 localhost python3.9[223750]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:31:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35724 DF PROTO=TCP SPT=44990 DPT=9105 SEQ=3272907869 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1CB9820000000001030307) Nov 28 04:31:55 localhost python3.9[223836]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322314.5339255-3388-134022272078712/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:31:56 localhost python3.9[223944]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:31:56 localhost python3.9[223999]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t 
dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:31:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:31:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:31:56 localhost podman[224034]: 2025-11-28 09:31:56.854779178 +0000 UTC m=+0.081694004 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 28 04:31:56 localhost podman[224034]: 2025-11-28 09:31:56.865227639 +0000 UTC m=+0.092142485 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 28 04:31:56 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:31:56 localhost podman[224033]: 2025-11-28 09:31:56.957973589 +0000 UTC m=+0.185128616 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Nov 28 04:31:57 localhost podman[224033]: 2025-11-28 09:31:57.055423328 +0000 UTC m=+0.282578355 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 28 04:31:57 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 04:31:57 localhost python3.9[224147]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:31:57 localhost python3.9[224233]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322316.7633526-3388-32646983963191/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:31:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50495 DF PROTO=TCP SPT=56108 DPT=9101 SEQ=244356425 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1CC48A0000000001030307) Nov 28 04:31:58 localhost python3.9[224341]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:31:58 localhost python3.9[224427]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322317.9220626-3388-117429173261899/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=534005c01c7af821d962fad87e973f668cecbdc9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None 
seuser=None serole=None selevel=None attributes=None Nov 28 04:31:59 localhost python3.9[224535]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:32:01 localhost python3.9[224621]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322319.1117601-3388-256889393724881/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:32:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50497 DF PROTO=TCP SPT=56108 DPT=9101 SEQ=244356425 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1CD0820000000001030307) Nov 28 04:32:01 localhost python3.9[224729]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:32:02 localhost python3.9[224815]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322321.25727-3388-240308374367014/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None 
directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:32:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35725 DF PROTO=TCP SPT=44990 DPT=9105 SEQ=3272907869 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1CD9820000000001030307) Nov 28 04:32:03 localhost python3.9[224925]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:32:04 localhost python3.9[225035]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:32:05 localhost python3.9[225145]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:32:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32372 DF PROTO=TCP SPT=44028 DPT=9100 SEQ=3257820138 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1CE5020000000001030307) Nov 28 04:32:06 localhost python3.9[225257]: ansible-ansible.builtin.file 
Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:32:07 localhost python3.9[225365]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:32:07 localhost python3.9[225475]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:32:08 localhost python3.9[225561]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322327.497679-3763-213543367310891/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:32:09 localhost python3.9[225669]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:32:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54205 DF PROTO=TCP 
SPT=34742 DPT=9102 SEQ=829525828 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1CF1430000000001030307) Nov 28 04:32:09 localhost python3.9[225755]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322328.8099177-3808-180727474085391/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:32:10 localhost python3.9[225865]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False Nov 28 04:32:11 localhost python3.9[225975]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 28 04:32:12 localhost python3[226085]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False Nov 28 04:32:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32374 DF PROTO=TCP SPT=44028 DPT=9100 SEQ=3257820138 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1CFCC20000000001030307) Nov 28 04:32:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. 
Nov 28 04:32:13 localhost podman[226113]: 2025-11-28 09:32:13.866659643 +0000 UTC m=+0.099119765 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:32:13 localhost podman[226113]: 2025-11-28 09:32:13.878490119 +0000 UTC m=+0.110950221 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 28 04:32:13 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 04:32:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32162 DF PROTO=TCP SPT=46022 DPT=9882 SEQ=3945402709 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1D0F420000000001030307) Nov 28 04:32:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24295 DF PROTO=TCP SPT=39316 DPT=9105 SEQ=366948878 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1D16C20000000001030307) Nov 28 04:32:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24296 DF PROTO=TCP SPT=39316 DPT=9105 SEQ=366948878 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1D1EC20000000001030307) Nov 28 04:32:22 localhost podman[226100]: 2025-11-28 09:32:12.584122143 +0000 UTC m=+0.048020215 image pull quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified Nov 28 04:32:22 localhost podman[226181]: Nov 28 04:32:23 localhost podman[226181]: 2025-11-28 09:32:23.053778194 +0000 UTC m=+0.124617755 container create f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 
'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible) Nov 28 04:32:23 localhost podman[226181]: 2025-11-28 09:32:22.965682059 +0000 UTC m=+0.036521660 image pull quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified Nov 28 04:32:23 localhost python3[226085]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume 
/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init Nov 28 04:32:24 localhost python3.9[226328]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:32:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24297 DF PROTO=TCP SPT=39316 DPT=9105 SEQ=366948878 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1D2E830000000001030307) Nov 28 04:32:25 localhost python3.9[226440]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False Nov 28 04:32:26 localhost python3.9[226550]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 28 04:32:27 localhost python3[226660]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False Nov 28 04:32:27 localhost python3[226660]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a",#012 "Digest": "sha256:647f1d5dc1b70ffa3e1832199619d57bfaeceac8823ff53ece64b8e42cc9688e",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"#012 ],#012 "RepoDigests": [#012 
"quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:647f1d5dc1b70ffa3e1832199619d57bfaeceac8823ff53ece64b8e42cc9688e"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-11-26T06:36:07.10279245Z",#012 "Config": {#012 "User": "nova",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251125",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 1211782527,#012 "VirtualSize": 1211782527,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309/diff:/var/lib/containers/storage/overlay/f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a/diff:/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 
"sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",#012 "sha256:1e3477d3ea795ca64b46f28aa9428ba791c4250e0fd05e173a4b9c0fb0bdee23",#012 "sha256:c136b33417f134a3b932677bcf7a2df089c29f20eca250129eafd2132d4708bb",#012 "sha256:7913bde445307e7f24767d9149b2e7f498930793ac9f073ccec69b608c009d31",#012 "sha256:084b2323a717fe711217b0ec21da61f4804f7a0d506adae935888421b80809cf"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251125",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "nova",#012 "History": [#012 {#012 "created": "2025-11-25T04:02:36.223494528Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-25T04:02:36.223562059Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251125\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-25T04:02:39.054452717Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-11-26T06:10:57.55004106Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550061231Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 
"empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550071761Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550082711Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550094371Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550104472Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.937139683Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:11:33.845342269Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 Nov 28 04:32:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. 
Nov 28 04:32:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:32:27 localhost podman[226710]: 2025-11-28 09:32:27.523179155 +0000 UTC m=+0.111435376 container remove c02210c167d9e7a3f114a7f912e92750bb55981744ab9dede5df79d52cbfcda6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, release=1761123044, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'c9c242145d21d40ef98889981c05ca84-0f0904943dda1bf1d123bdf96d71020f'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.) 
Nov 28 04:32:27 localhost python3[226660]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force nova_compute Nov 28 04:32:27 localhost podman[226722]: 2025-11-28 09:32:27.580919737 +0000 UTC m=+0.090717129 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true) Nov 28 04:32:27 localhost podman[226723]: 2025-11-28 09:32:27.651123724 +0000 UTC m=+0.160354328 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, 
config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:32:27 localhost podman[226741]: Nov 28 04:32:27 localhost podman[226741]: 2025-11-28 09:32:27.669737565 +0000 UTC m=+0.126502245 container create 11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251125, tcib_managed=true, container_name=nova_compute, managed_by=edpm_ansible, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Nov 28 04:32:27 localhost podman[226741]: 2025-11-28 09:32:27.593429324 +0000 UTC m=+0.050194054 image pull quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified Nov 28 04:32:27 localhost podman[226722]: 2025-11-28 09:32:27.674370862 +0000 UTC m=+0.184168224 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 04:32:27 localhost python3[226660]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', 
'/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start Nov 28 04:32:27 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 04:32:27 localhost podman[226723]: 2025-11-28 09:32:27.689761321 +0000 UTC m=+0.198991905 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:32:27 localhost systemd[1]: 
ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:32:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36869 DF PROTO=TCP SPT=59624 DPT=9101 SEQ=2115230041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1D39BA0000000001030307) Nov 28 04:32:28 localhost python3.9[226912]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:32:29 localhost python3.9[227024]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:32:30 localhost python3.9[227133]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764322349.6011608-4084-267492742123445/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:32:30 localhost python3.9[227188]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 28 04:32:30 localhost systemd[1]: Reloading. Nov 28 04:32:30 localhost systemd-rc-local-generator[227214]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 28 04:32:30 localhost systemd-sysv-generator[227218]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:32:30 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:32:30 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:32:30 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:32:30 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:32:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 28 04:32:31 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:32:31 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:32:31 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:32:31 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:32:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36871 DF PROTO=TCP SPT=59624 DPT=9101 SEQ=2115230041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1D45C30000000001030307) Nov 28 04:32:31 localhost python3.9[227278]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:32:31 localhost systemd[1]: Reloading. Nov 28 04:32:31 localhost systemd-rc-local-generator[227305]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:32:31 localhost systemd-sysv-generator[227310]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 28 04:32:32 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:32:32 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:32:32 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:32:32 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:32:32 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:32:32 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:32:32 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:32:32 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:32:32 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:32:32 localhost systemd[1]: Starting nova_compute container... Nov 28 04:32:32 localhost systemd[1]: Started libcrun container. 
Nov 28 04:32:32 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Nov 28 04:32:32 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c/merged/etc/nvme supports timestamps until 2038 (0x7fffffff) Nov 28 04:32:32 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 04:32:32 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Nov 28 04:32:32 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 28 04:32:32 localhost podman[227319]: 2025-11-28 09:32:32.346936079 +0000 UTC m=+0.126438723 container init 11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': 
['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, container_name=nova_compute, managed_by=edpm_ansible) Nov 28 04:32:32 localhost podman[227319]: 2025-11-28 09:32:32.355758878 +0000 UTC m=+0.135261512 container start 11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', 
'/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Nov 28 04:32:32 localhost podman[227319]: nova_compute Nov 28 04:32:32 localhost nova_compute[227332]: + sudo -E kolla_set_configs Nov 28 04:32:32 localhost systemd[1]: Started nova_compute container. Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Validating config file Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Copying service configuration files Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Deleting /etc/nova/nova.conf Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Setting permission for /etc/nova/nova.conf Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf Nov 28 04:32:32 localhost nova_compute[227332]: 
INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Deleting /etc/ceph Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Creating directory /etc/ceph Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Setting permission for /etc/ceph Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Nov 28 04:32:32 localhost 
nova_compute[227332]: INFO:__main__:Deleting /var/lib/nova/.ssh/config Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Deleting /usr/sbin/iscsiadm Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Writing out command to execute Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Nov 28 04:32:32 localhost nova_compute[227332]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Nov 28 04:32:32 localhost nova_compute[227332]: ++ cat /run_command Nov 28 04:32:32 localhost nova_compute[227332]: + CMD=nova-compute Nov 28 04:32:32 localhost nova_compute[227332]: + ARGS= Nov 28 04:32:32 localhost nova_compute[227332]: + sudo kolla_copy_cacerts Nov 28 04:32:32 localhost nova_compute[227332]: + [[ ! -n '' ]] Nov 28 04:32:32 localhost nova_compute[227332]: + . 
kolla_extend_start Nov 28 04:32:32 localhost nova_compute[227332]: Running command: 'nova-compute' Nov 28 04:32:32 localhost nova_compute[227332]: + echo 'Running command: '\''nova-compute'\''' Nov 28 04:32:32 localhost nova_compute[227332]: + umask 0022 Nov 28 04:32:32 localhost nova_compute[227332]: + exec nova-compute Nov 28 04:32:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23238 DF PROTO=TCP SPT=43890 DPT=9102 SEQ=1949467689 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1D4EC20000000001030307) Nov 28 04:32:33 localhost python3.9[227452]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.032 227336 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.032 227336 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.032 227336 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.033 227336 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.146 227336 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m 
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.153 227336 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.154 227336 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.617 227336 INFO nova.virt.driver [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.901 227336 INFO nova.compute.provider_config [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.913 227336 WARNING nova.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.913 227336 DEBUG oslo_concurrency.lockutils [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.914 227336 DEBUG oslo_concurrency.lockutils [None 
req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.914 227336 DEBUG oslo_concurrency.lockutils [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.915 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.915 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.915 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.915 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.916 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.916 227336 
DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.916 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.916 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] arq_binding_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.917 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] backdoor_port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.917 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] backdoor_socket = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.917 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.917 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.918 227336 DEBUG 
oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cert = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.918 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.918 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.918 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] config_dir = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.919 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.919 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.919 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.919 227336 DEBUG oslo_service.service [None 
req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] console_host = np0005538513.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.920 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] control_exchange = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.920 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cpu_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.920 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.920 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.921 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.921 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.921 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] default_ephemeral_format = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.921 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.922 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] default_schedule_zone = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.922 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] disk_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.922 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] enable_new_services = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.922 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.923 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] enabled_ssl_apis = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.923 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] flat_injected = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.923 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] force_config_drive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.923 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] force_raw_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.924 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.924 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.924 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] host = np0005538513.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 
localhost nova_compute[227332]: 2025-11-28 09:32:34.924 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] initial_cpu_allocation_ratio = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.925 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] initial_disk_allocation_ratio = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.925 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] initial_ram_allocation_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.925 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] injected_network_template = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.926 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.926 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.926 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m 
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.926 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] instance_name_template = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.927 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] instance_usage_audit = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.927 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] instance_usage_audit_period = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.927 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.927 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.928 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.928 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost 
nova_compute[227332]: 2025-11-28 09:32:34.928 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] live_migration_retry_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.928 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.929 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.929 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.929 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.929 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.930 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.930 227336 DEBUG oslo_service.service [None 
req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.930 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] log_rotation_type = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.930 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.931 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.931 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.931 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.931 227336 DEBUG oslo_service.service [None 
req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.932 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.932 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.932 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.932 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.933 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.933 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] max_logfile_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.933 227336 DEBUG oslo_service.service 
[None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] max_logfile_size_mb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.934 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.934 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.934 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] metadata_listen_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.934 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] metadata_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.935 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] migrate_max_retries = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.935 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] mkisofs_cmd = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.935 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] my_block_storage_ip = 
192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.935 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] my_ip = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.936 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] network_allocate_retries = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.936 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.936 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.936 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] osapi_compute_listen_port = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.937 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.937 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] osapi_compute_workers = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.937 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] password_length = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.937 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] periodic_enable = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.938 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.938 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] pointer_model = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.938 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] preallocate_images = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.938 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.939 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] pybasedir = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 
04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.939 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ram_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.939 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.939 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.940 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.940 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.940 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.940 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] record = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.941 227336 DEBUG oslo_service.service [None 
req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] reimage_timeout_per_gb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.941 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] report_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.941 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.941 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] reserved_host_cpus = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.942 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.942 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] reserved_host_memory_mb = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.942 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.942 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] resize_confirm_window = 0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.943 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.943 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.943 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.943 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] rpc_response_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.944 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.944 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.944 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.944 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.944 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.945 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] service_down_time = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.945 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.945 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.945 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.946 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 
04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.946 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.946 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ssl_only = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.946 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] state_path = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.947 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] sync_power_state_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.947 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.947 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.947 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] tempdir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.948 227336 DEBUG oslo_service.service 
[None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.948 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.948 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.948 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] use_cow_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.949 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.949 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.949 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.949 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] use_rootwrap_daemon = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.950 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.950 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.950 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vcpu_pin_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.950 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.951 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.951 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.951 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 
2025-11-28 09:32:34.951 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.952 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.952 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.952 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_concurrency.lock_path = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.952 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.953 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.953 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost 
nova_compute[227332]: 2025-11-28 09:32:34.953 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.954 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.954 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.954 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.954 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.955 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.dhcp_domain = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.955 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.enable_instance_password = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.955 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.955 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.956 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.956 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.956 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.956 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.957 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.max_limit = 1000 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.957 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.957 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.957 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.use_forwarded_for = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.958 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.958 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.958 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.958 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.959 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.959 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.959 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.959 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.960 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.backend = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.960 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.960 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.config_prefix = cache.oslo log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.960 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.dead_timeout = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.961 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.961 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.enable_retry_client = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.961 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.enable_socket_keepalive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.961 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.962 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.962 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.962 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.hashclient_retry_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.962 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.963 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.memcache_password = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.963 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.963 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.963 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.963 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.964 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.memcache_sasl_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.964 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.memcache_servers = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.964 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.memcache_socket_timeout = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.964 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.memcache_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.964 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.964 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.965 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.retry_delay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.965 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.socket_keepalive_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.965 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.socket_keepalive_idle = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.965 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.965 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.965 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.966 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.966 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 
09:32:34.966 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.966 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.966 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cinder.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.967 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.967 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cinder.catalog_info = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.967 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.967 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.967 227336 DEBUG oslo_service.service [None 
req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.967 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cinder.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.968 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.968 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.968 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cinder.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.968 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.968 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cinder.os_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.968 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cinder.split_loggers = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.969 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.969 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.969 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] compute.cpu_dedicated_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.969 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] compute.cpu_shared_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.969 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.970 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.970 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.970 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.970 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.970 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.970 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.971 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.971 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.971 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] conductor.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.971 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.971 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.971 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] console.ssl_minimum_version = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.972 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.972 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.972 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.972 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.972 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.973 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.973 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.973 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.973 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.973 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.974 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.974 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.974 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.974 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.974 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.974 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.975 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.975 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.975 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.975 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] cyborg.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.975 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.976 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.976 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.976 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.976 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.976 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.976 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.977 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.977 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.977 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.977 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.977 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.977 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.978 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.978 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.978 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.978 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.978 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.979 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.979 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.979 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.979 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.979 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.979 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.980 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.980 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.980 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.980 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.980 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.980 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.981 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.981 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.981 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.981 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.981 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.982 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.982 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.982 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.982 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.982 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.982 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] devices.enabled_mdev_types = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.983 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.983 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.983 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.983 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.api_servers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.983 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.984 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.984 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.984 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.984 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.984 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.985 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.985 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.985 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.985 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.985 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.985 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.986 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.986 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.986 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.986 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.986 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.987 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.987 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.987 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.987 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.988 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.service_type = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.988 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.988 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.988 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.989 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.989 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.989 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.989 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] glance.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.989 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.990 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.990 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.990 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.990 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.990 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.enable_remotefx = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.990 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.instances_path_share = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.991 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.991 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.991 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.991 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.991 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.991 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.992 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.992 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.use_multipath_io = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.992 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.992 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.992 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.993 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.993 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] mks.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.993 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.993 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.993 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.994 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.994 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.994 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.994 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.994 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.api_max_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.994 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.995 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.995 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.995 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.995 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.995 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.996 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.996 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.996 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.996 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.996 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.996 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.997 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.997 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.997 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.peer_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.997 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.997 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28
04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.997 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.998 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.service_type = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.998 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.998 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.998 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.998 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.998 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 
2025-11-28 09:32:34.999 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.999 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] key_manager.backend = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.999 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] key_manager.fixed_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:34 localhost nova_compute[227332]: 2025-11-28 09:32:34.999 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.auth_endpoint = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:34.999 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.000 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.barbican_endpoint = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.000 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.000 
227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.barbican_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.000 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.000 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.000 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.001 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.001 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.001 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.001 227336 DEBUG oslo_service.service [None 
req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.retry_delay = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.001 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.001 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.002 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.002 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.verify_ssl = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.002 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.002 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.002 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] 
barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.003 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican_service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.003 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.003 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.003 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.003 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican_service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.003 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.004 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] 
barbican_service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.004 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.approle_role_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.004 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.approle_secret_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.004 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.004 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.004 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.005 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.005 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.keyfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.005 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.005 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.005 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.005 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.root_token_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.006 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.006 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.006 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost 
nova_compute[227332]: 2025-11-28 09:32:35.006 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.006 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vault.vault_url = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.007 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.007 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.007 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.007 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.007 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.007 227336 DEBUG 
oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.008 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.008 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.008 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.008 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.008 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.008 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.009 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] 
keystone.service_type = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.009 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.009 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.009 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.009 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.009 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.010 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] keystone.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.010 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.connection_uri = 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.010 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.cpu_mode = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.010 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.010 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.011 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.011 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.011 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.cpu_power_management = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.011 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.011 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.011 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.device_detach_timeout = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.012 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.disk_cachemodes = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.012 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.disk_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.012 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.012 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.file_backed_memory = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.012 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.gid_maps = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.012 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.hw_disk_discard = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.013 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.hw_machine_type = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.013 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.images_rbd_ceph_conf = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.013 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.013 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.013 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.014 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] 
libvirt.images_rbd_pool = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.014 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.images_type = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.014 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.images_volume_group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.014 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.014 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.inject_partition = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.014 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.015 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.iscsi_iface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.015 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.iser_use_multipath = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.015 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.015 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.015 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.015 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.016 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.016 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.016 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] 
libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.016 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.016 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.016 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.017 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.017 227336 WARNING oslo_config.cfg [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( Nov 28 04:32:35 localhost nova_compute[227332]: live_migration_uri is deprecated for removal in favor of two other options that Nov 28 04:32:35 localhost nova_compute[227332]: allow to change live migration scheme and target URI: ``live_migration_scheme`` Nov 28 04:32:35 localhost nova_compute[227332]: and ``live_migration_inbound_addr`` respectively. Nov 28 04:32:35 localhost nova_compute[227332]: ). 
Its value may be silently ignored in the future.#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.017 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.live_migration_uri = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.017 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.018 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.018 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.018 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.018 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.018 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] 
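The WARNING above states that `live_migration_uri` is deprecated in favor of `live_migration_scheme` and `live_migration_inbound_addr`, and the following DEBUG line shows the deprecated value actually in use (`qemu+ssh://nova@%s/system?keyfile=...`). A minimal sketch of the non-deprecated equivalent in `nova.conf`, assuming an SSH transport like the logged URI; the placeholder address is illustrative, and note that URI extras such as the `keyfile=` query parameter are not expressible through these two options alone:

```ini
[libvirt]
# Replaces the scheme portion of the deprecated live_migration_uri
# (qemu+ssh://... implies the "ssh" scheme).
live_migration_scheme = ssh

# Replaces the target-host placeholder (%s) in the old URI; set to the
# address the destination host should receive migrations on.
# <destination-host-address> is a hypothetical placeholder.
live_migration_inbound_addr = <destination-host-address>
```

With these set, nova builds the migration URI itself instead of substituting into the deprecated template, which is why the warning says the old option's value "may be silently ignored in the future".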
libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.018 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.019 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.019 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.019 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.num_pcie_ports = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.019 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.019 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.pmem_namespaces = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.020 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.quobyte_client_cfg = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.020 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.020 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.020 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.020 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.020 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.rbd_secret_uuid = 2c5417c9-00eb-57d5-a565-ddecbc7995c1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.021 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.rbd_user = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.021 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] 
libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.021 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.021 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.021 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.rescue_kernel_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.022 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.rescue_ramdisk_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.022 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.022 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.rx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.022 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.smbfs_mount_options = 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.022 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.022 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.023 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.023 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.023 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.023 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.swtpm_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.024 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.swtpm_group 
= tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.024 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.024 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.024 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.tx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.024 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.uid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.024 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.025 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.025 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.volume_clear = zero log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.025 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.volume_clear_size = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.025 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.volume_use_multipath = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.025 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.026 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.026 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.026 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.026 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.vzstorage_mount_perms = 0770 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.026 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.026 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.027 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.027 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.027 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.027 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.027 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.certfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.027 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.028 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.028 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.028 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.default_floating_pool = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.028 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.028 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.029 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.http_retries = 3 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.029 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.029 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.029 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.029 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.029 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.030 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.030 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 
localhost nova_compute[227332]: 2025-11-28 09:32:35.030 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.030 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.030 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.030 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.031 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.031 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.031 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 
09:32:35.031 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.031 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.031 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] neutron.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.032 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.032 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] notifications.default_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.032 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.032 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.032 227336 
DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.033 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] pci.alias = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.033 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] pci.device_spec = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.033 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] pci.report_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.033 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.033 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.033 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.034 227336 DEBUG 
oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.034 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.034 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.034 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.034 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.035 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.035 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.035 227336 DEBUG oslo_service.service [None 
req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.035 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.035 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.035 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.036 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.036 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.036 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.036 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.password = 
**** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.036 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.036 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.037 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.037 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.037 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.037 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.037 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.service_type = placement log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.038 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.038 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.038 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.038 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.038 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.038 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.039 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.user_domain_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.039 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.039 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.039 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.039 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.039 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] placement.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.040 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] quota.cores = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.040 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.040 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] quota.driver = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.040 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.040 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.041 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] quota.injected_files = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.041 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] quota.instances = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.041 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] quota.key_pairs = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.041 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] quota.metadata_items = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 
09:32:35.041 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] quota.ram = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.041 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] quota.recheck_quota = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.042 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] quota.server_group_members = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.042 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] quota.server_groups = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.042 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] rdp.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.042 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.042 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.043 227336 DEBUG oslo_service.service [None 
req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.043 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.043 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.043 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] scheduler.max_attempts = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.043 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.043 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.044 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 
2025-11-28 09:32:35.044 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.044 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.044 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] scheduler.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.044 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.045 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.045 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.045 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.045 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.045 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.045 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.046 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.046 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.046 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.046 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.046 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.047 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.047 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.047 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] 
filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.047 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.047 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.047 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.048 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.048 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.048 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.048 227336 DEBUG 
oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.048 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.048 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.049 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] metrics.required = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.049 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.049 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.049 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] metrics.weight_setting = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 
2025-11-28 09:32:35.049 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] serial_console.base_url = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.050 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.050 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.050 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.050 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.050 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.050 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost 
nova_compute[227332]: 2025-11-28 09:32:35.051 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] service_user.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.051 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.051 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.051 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.051 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.052 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.052 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.052 
227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.052 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.052 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.052 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] spice.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.053 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.053 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] spice.html5proxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.053 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] spice.html5proxy_port = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.053 227336 DEBUG oslo_service.service 
[None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] spice.image_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.053 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] spice.jpeg_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.054 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] spice.playback_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.054 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.054 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.054 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] spice.streaming_mode = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.054 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] spice.zlib_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.054 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 
- - - - - -] upgrade_levels.baseapi = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.055 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] upgrade_levels.cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.055 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] upgrade_levels.compute = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.055 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.055 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.055 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.055 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.056 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] 
vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.056 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.056 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.056 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.056 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.057 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.057 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.057 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 
- - - - - -] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.057 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.057 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.057 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.058 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.058 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.console_delay_seconds = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.058 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.datastore_regex = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.058 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.host_ip = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.058 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.058 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.059 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.059 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.059 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.059 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.maximum_objects = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.059 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.pbm_default_policy = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 
04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.059 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.060 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.060 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.serial_log_dir = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.060 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.060 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.060 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.060 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 
2025-11-28 09:32:35.061 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.061 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.vnc_port = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.061 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.061 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.061 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.062 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vnc.novncproxy_base_url = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.062 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 
09:32:35.062 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vnc.novncproxy_port = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.062 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vnc.server_listen = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.062 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vnc.server_proxyclient_address = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.063 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.063 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.063 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.063 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.063 227336 DEBUG 
oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.063 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.064 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.064 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.064 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.064 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.064 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 
localhost nova_compute[227332]: 2025-11-28 09:32:35.065 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.065 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.065 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.065 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.065 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.065 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.066 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.066 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.066 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.066 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.066 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.066 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.067 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.067 227336 DEBUG oslo_service.service [None 
req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] wsgi.api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.067 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.067 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.067 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.067 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.068 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.068 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.068 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] 
wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.068 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.068 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] wsgi.tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.069 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.069 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.069 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.069 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.069 227336 DEBUG oslo_service.service [None 
req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.069 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.070 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_policy.enforce_scope = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.070 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.071 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.071 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.071 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.072 227336 DEBUG 
oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.072 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.072 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.073 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.073 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.073 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.074 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost 
nova_compute[227332]: 2025-11-28 09:32:35.074 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] remote_debug.port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.074 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.074 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.075 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.075 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.075 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.076 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.076 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.076 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.077 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.077 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.077 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.078 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.078 227336 DEBUG oslo_service.service [None 
req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.078 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.078 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.079 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.079 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.079 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.080 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost 
nova_compute[227332]: 2025-11-28 09:32:35.080 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.080 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.081 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.081 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.081 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.082 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.082 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.082 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.083 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.083 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.083 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.084 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.084 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.084 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_notifications.retry = -1 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.085 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.085 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.085 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.086 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.086 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.086 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.086 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] 
oslo_limit.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.087 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.087 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.087 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.088 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.088 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.088 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.088 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.domain_name = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.089 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.endpoint_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.089 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.089 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.090 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.090 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.090 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.091 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.password = **** log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.091 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.091 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.091 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.092 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.project_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.092 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.092 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.093 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.service_type = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.093 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.093 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.094 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.094 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.system_scope = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.094 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.095 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.095 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.user_domain_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.095 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.096 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.096 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.096 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.096 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_limit.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.097 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.097 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.097 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] oslo_reports.log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.098 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.098 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.098 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.099 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.099 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.099 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - 
- - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.100 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.100 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vif_plug_ovs_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.100 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.101 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.101 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.101 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] vif_plug_ovs_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.102 227336 DEBUG oslo_service.service [None 
req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.102 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.102 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.102 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.103 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_vif_linux_bridge.iptables_top_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.103 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.103 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_vif_linux_bridge.use_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.104 
227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.104 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_vif_ovs.isolate_vif = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.104 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_vif_ovs.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.105 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_vif_ovs.ovs_vsctl_timeout = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.105 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.105 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_vif_ovs.ovsdb_interface = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.105 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_vif_ovs.per_port_bridge = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.106 227336 DEBUG 
oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_brick.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.106 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.106 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.107 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] privsep_osbrick.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.107 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] privsep_osbrick.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.107 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.108 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] privsep_osbrick.logger_name = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.108 227336 DEBUG 
oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.108 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] privsep_osbrick.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.108 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.109 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] nova_sys_admin.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.109 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] nova_sys_admin.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.109 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] nova_sys_admin.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.110 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.110 227336 DEBUG 
oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] nova_sys_admin.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.110 227336 DEBUG oslo_service.service [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.112 227336 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.156 227336 INFO nova.virt.node [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Determined node identity 35fead26-0bad-4950-b646-987079d58a17 from /var/lib/nova/compute_id#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.157 227336 DEBUG nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.158 227336 DEBUG nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.158 227336 DEBUG nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.159 227336 DEBUG nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Connecting to libvirt: 
qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.172 227336 DEBUG nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Registering for lifecycle events _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.175 227336 DEBUG nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Registering for connection events: _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.177 227336 INFO nova.virt.libvirt.driver [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Connection event '1' reason 'None'#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.198 227336 DEBUG nova.virt.libvirt.volume.mount [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.202 227336 INFO nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Libvirt host capabilities
[libvirt <capabilities> XML dump garbled in capture (element tags stripped); values still legible: host UUID eb468aed-e0e9-4528-988f-9267a3530b7a; arch x86_64; CPU model EPYC-Rome-v4, vendor AMD; migration transports tcp and rdma; memory figures 16116612 / 4029153 / 0 / 0; security models selinux (doi 0, labels system_u:system_r:svirt_t:s0 and system_u:system_r:svirt_tcg_t:s0) and dac (doi 0, +107:+107); hvm guests at wordsize 32 and 64 via emulator /usr/libexec/qemu-kvm with machine types pc-i440fx-rhel7.6.0 (canonical pc), pc-q35-rhel9.8.0 (canonical q35), pc-q35-rhel9.6.0, pc-q35-rhel8.6.0, pc-q35-rhel9.4.0, pc-q35-rhel8.5.0, pc-q35-rhel8.3.0, pc-q35-rhel7.6.0, pc-q35-rhel8.4.0, pc-q35-rhel9.2.0, pc-q35-rhel8.2.0, pc-q35-rhel9.0.0, pc-q35-rhel8.0.0, pc-q35-rhel8.1.0]
Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.215 227336 DEBUG nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.233 227336 DEBUG nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
[libvirt <domainCapabilities> XML dump garbled in capture (element tags stripped); values still legible: path /usr/libexec/qemu-kvm, domain kvm, machine pc-q35-rhel9.8.0, arch i686; loader /usr/share/OVMF/OVMF_CODE.secboot.fd with types rom and pflash, readonly yes/no, secure no, on/off feature toggles; CPU model EPYC-Rome, vendor AMD; dump truncated here]
nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: 486 Nov 28 04:32:35 localhost nova_compute[227332]: 486-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Broadwell Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Broadwell-IBRS Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Broadwell-noTSX Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Broadwell-noTSX-IBRS Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
nova_compute[227332]: Broadwell-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Broadwell-v2 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Broadwell-v3 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Broadwell-v4 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Cascadelake-Server Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Cascadelake-Server-noTSX Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Cascadelake-Server-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Cascadelake-Server-v2 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Cascadelake-Server-v3 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Cascadelake-Server-v4 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Cascadelake-Server-v5 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Conroe Nov 28 04:32:35 localhost nova_compute[227332]: Conroe-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Cooperlake Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Cooperlake-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 
localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Cooperlake-v2 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Denverton Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Denverton-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Denverton-v2 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Denverton-v3 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Dhyana Nov 28 04:32:35 localhost nova_compute[227332]: Dhyana-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Dhyana-v2 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Genoa Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: 
Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Genoa-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-IBPB Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Milan Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Milan-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Milan-v2 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Rome Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Rome-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Rome-v2 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Rome-v3 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Rome-v4 Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-v1 Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-v2 Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-v3 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-v4 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: GraniteRapids Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: GraniteRapids-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: GraniteRapids-v2 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 
Nov 28 04:32:35 localhost nova_compute[227332]: Haswell
Nov 28 04:32:35 localhost nova_compute[227332]: Haswell-IBRS
Nov 28 04:32:35 localhost nova_compute[227332]: Haswell-noTSX
Nov 28 04:32:35 localhost nova_compute[227332]: Haswell-noTSX-IBRS
Nov 28 04:32:35 localhost nova_compute[227332]: Haswell-v1
Nov 28 04:32:35 localhost nova_compute[227332]: Haswell-v2
Nov 28 04:32:35 localhost nova_compute[227332]: Haswell-v3
Nov 28 04:32:35 localhost nova_compute[227332]: Haswell-v4
Nov 28 04:32:35 localhost nova_compute[227332]: Icelake-Server
Nov 28 04:32:35 localhost nova_compute[227332]: Icelake-Server-noTSX
Nov 28 04:32:35 localhost nova_compute[227332]: Icelake-Server-v1
Nov 28 04:32:35 localhost nova_compute[227332]: Icelake-Server-v2
Nov 28 04:32:35 localhost nova_compute[227332]: Icelake-Server-v3
Nov 28 04:32:35 localhost nova_compute[227332]: Icelake-Server-v4
Nov 28 04:32:35 localhost nova_compute[227332]: Icelake-Server-v5
Nov 28 04:32:35 localhost nova_compute[227332]: Icelake-Server-v6
Nov 28 04:32:35 localhost nova_compute[227332]: Icelake-Server-v7
Nov 28 04:32:35 localhost nova_compute[227332]: IvyBridge
Nov 28 04:32:35 localhost nova_compute[227332]: IvyBridge-IBRS
Nov 28 04:32:35 localhost nova_compute[227332]: IvyBridge-v1
Nov 28 04:32:35 localhost nova_compute[227332]: IvyBridge-v2
Nov 28 04:32:35 localhost nova_compute[227332]: KnightsMill
Nov 28 04:32:35 localhost nova_compute[227332]: KnightsMill-v1
Nov 28 04:32:35 localhost nova_compute[227332]: Nehalem
Nov 28 04:32:35 localhost nova_compute[227332]: Nehalem-IBRS
Nov 28 04:32:35 localhost nova_compute[227332]: Nehalem-v1
Nov 28 04:32:35 localhost nova_compute[227332]: Nehalem-v2
Nov 28 04:32:35 localhost nova_compute[227332]: Opteron_G1
Nov 28 04:32:35 localhost nova_compute[227332]: Opteron_G1-v1
Nov 28 04:32:35 localhost nova_compute[227332]: Opteron_G2
Nov 28 04:32:35 localhost nova_compute[227332]: Opteron_G2-v1
Nov 28 04:32:35 localhost nova_compute[227332]: Opteron_G3
Nov 28 04:32:35 localhost nova_compute[227332]: Opteron_G3-v1
Nov 28 04:32:35 localhost nova_compute[227332]: Opteron_G4
Nov 28 04:32:35 localhost nova_compute[227332]: Opteron_G4-v1
Nov 28 04:32:35 localhost nova_compute[227332]: Opteron_G5
Nov 28 04:32:35 localhost nova_compute[227332]: Opteron_G5-v1
Nov 28 04:32:35 localhost nova_compute[227332]: Penryn
Nov 28 04:32:35 localhost nova_compute[227332]: Penryn-v1
Nov 28 04:32:35 localhost nova_compute[227332]: SandyBridge
Nov 28 04:32:35 localhost nova_compute[227332]: SandyBridge-IBRS
Nov 28 04:32:35 localhost nova_compute[227332]: SandyBridge-v1
Nov 28 04:32:35 localhost nova_compute[227332]: SandyBridge-v2
Nov 28 04:32:35 localhost nova_compute[227332]: SapphireRapids
Nov 28 04:32:35 localhost nova_compute[227332]: SapphireRapids-v1
Nov 28 04:32:35 localhost nova_compute[227332]: SapphireRapids-v2
Nov 28 04:32:35 localhost nova_compute[227332]: SapphireRapids-v3
Nov 28 04:32:35 localhost nova_compute[227332]: SierraForest
Nov 28 04:32:35 localhost nova_compute[227332]: SierraForest-v1
Nov 28 04:32:35 localhost nova_compute[227332]: Skylake-Client
Nov 28 04:32:35 localhost nova_compute[227332]: Skylake-Client-IBRS
Nov 28 04:32:35 localhost nova_compute[227332]: Skylake-Client-noTSX-IBRS
Nov 28 04:32:35 localhost nova_compute[227332]: Skylake-Client-v1
Nov 28 04:32:35 localhost nova_compute[227332]: Skylake-Client-v2
Nov 28 04:32:35 localhost nova_compute[227332]: Skylake-Client-v3
Nov 28 04:32:35 localhost nova_compute[227332]: Skylake-Client-v4
Nov 28 04:32:35 localhost nova_compute[227332]: Skylake-Server
Nov 28 04:32:35 localhost nova_compute[227332]: Skylake-Server-IBRS
Nov 28 04:32:35 localhost nova_compute[227332]: Skylake-Server-noTSX-IBRS
Nov 28 04:32:35 localhost nova_compute[227332]: Skylake-Server-v1
Nov 28 04:32:35 localhost nova_compute[227332]: Skylake-Server-v2
Nov 28 04:32:35 localhost nova_compute[227332]: Skylake-Server-v3
Nov 28 04:32:35 localhost nova_compute[227332]: Skylake-Server-v4
Nov 28 04:32:35 localhost nova_compute[227332]: Skylake-Server-v5
Nov 28 04:32:35 localhost nova_compute[227332]: Snowridge
Nov 28 04:32:35 localhost nova_compute[227332]: Snowridge-v1
Nov 28 04:32:35 localhost nova_compute[227332]: Snowridge-v2
Nov 28 04:32:35 localhost nova_compute[227332]: Snowridge-v3
Nov 28 04:32:35 localhost nova_compute[227332]: Snowridge-v4
Nov 28 04:32:35 localhost nova_compute[227332]: Westmere
Nov 28 04:32:35 localhost nova_compute[227332]: Westmere-IBRS
Nov 28 04:32:35 localhost nova_compute[227332]: Westmere-v1
Nov 28 04:32:35 localhost nova_compute[227332]: Westmere-v2
Nov 28 04:32:35 localhost nova_compute[227332]: athlon
Nov 28 04:32:35 localhost nova_compute[227332]: athlon-v1
Nov 28 04:32:35 localhost nova_compute[227332]: core2duo
Nov 28 04:32:35 localhost nova_compute[227332]: core2duo-v1
04:32:35 localhost nova_compute[227332]: coreduo Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: coreduo-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: kvm32 Nov 28 04:32:35 localhost nova_compute[227332]: kvm32-v1 Nov 28 04:32:35 localhost nova_compute[227332]: kvm64 Nov 28 04:32:35 localhost nova_compute[227332]: kvm64-v1 Nov 28 04:32:35 localhost nova_compute[227332]: n270 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: n270-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: pentium Nov 28 04:32:35 localhost nova_compute[227332]: pentium-v1 Nov 28 04:32:35 localhost nova_compute[227332]: pentium2 Nov 28 04:32:35 localhost nova_compute[227332]: pentium2-v1 Nov 28 04:32:35 localhost nova_compute[227332]: pentium3 Nov 28 04:32:35 localhost nova_compute[227332]: pentium3-v1 Nov 28 04:32:35 localhost nova_compute[227332]: phenom Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: phenom-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: qemu32 Nov 28 04:32:35 localhost nova_compute[227332]: qemu32-v1 Nov 28 04:32:35 localhost 
nova_compute[227332]: qemu64 Nov 28 04:32:35 localhost nova_compute[227332]: qemu64-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: file Nov 28 04:32:35 localhost nova_compute[227332]: anonymous Nov 28 04:32:35 localhost nova_compute[227332]: memfd Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: disk Nov 28 04:32:35 localhost nova_compute[227332]: cdrom Nov 28 04:32:35 localhost nova_compute[227332]: floppy Nov 28 04:32:35 localhost nova_compute[227332]: lun Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: fdc Nov 28 04:32:35 localhost nova_compute[227332]: scsi Nov 28 04:32:35 localhost nova_compute[227332]: virtio Nov 28 04:32:35 localhost nova_compute[227332]: usb Nov 28 04:32:35 localhost nova_compute[227332]: sata Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: virtio Nov 28 04:32:35 localhost nova_compute[227332]: virtio-transitional Nov 28 04:32:35 localhost nova_compute[227332]: virtio-non-transitional Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: vnc Nov 28 04:32:35 localhost nova_compute[227332]: egl-headless Nov 28 04:32:35 localhost nova_compute[227332]: dbus Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: subsystem Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: default Nov 28 04:32:35 localhost nova_compute[227332]: mandatory Nov 28 04:32:35 localhost nova_compute[227332]: requisite Nov 28 04:32:35 localhost nova_compute[227332]: optional Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: usb Nov 28 04:32:35 localhost nova_compute[227332]: pci Nov 28 04:32:35 localhost nova_compute[227332]: scsi Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: virtio Nov 28 04:32:35 localhost nova_compute[227332]: virtio-transitional Nov 28 04:32:35 localhost nova_compute[227332]: virtio-non-transitional Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: random Nov 28 04:32:35 localhost nova_compute[227332]: egd Nov 28 04:32:35 localhost nova_compute[227332]: builtin Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: path Nov 28 04:32:35 localhost nova_compute[227332]: handle Nov 28 04:32:35 localhost nova_compute[227332]: virtiofs Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: 
Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: tpm-tis Nov 28 04:32:35 localhost nova_compute[227332]: tpm-crb Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: emulator Nov 28 04:32:35 localhost nova_compute[227332]: external Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: 2.0 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: usb Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: pty Nov 28 04:32:35 localhost nova_compute[227332]: unix Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: qemu Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: builtin Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: default Nov 28 04:32:35 localhost nova_compute[227332]: passt Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: 
Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: isa Nov 28 04:32:35 localhost nova_compute[227332]: hyperv Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: null Nov 28 04:32:35 localhost nova_compute[227332]: vc Nov 28 04:32:35 localhost nova_compute[227332]: pty Nov 28 04:32:35 localhost nova_compute[227332]: dev Nov 28 04:32:35 localhost nova_compute[227332]: file Nov 28 04:32:35 localhost nova_compute[227332]: pipe Nov 28 04:32:35 localhost nova_compute[227332]: stdio Nov 28 04:32:35 localhost nova_compute[227332]: udp Nov 28 04:32:35 localhost nova_compute[227332]: tcp Nov 28 04:32:35 localhost nova_compute[227332]: unix Nov 28 04:32:35 localhost nova_compute[227332]: qemu-vdagent Nov 28 04:32:35 localhost nova_compute[227332]: dbus Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: relaxed Nov 28 04:32:35 localhost nova_compute[227332]: vapic Nov 28 04:32:35 localhost nova_compute[227332]: spinlocks Nov 28 04:32:35 localhost nova_compute[227332]: vpindex Nov 28 04:32:35 localhost nova_compute[227332]: runtime Nov 28 
04:32:35 localhost nova_compute[227332]: synic Nov 28 04:32:35 localhost nova_compute[227332]: stimer Nov 28 04:32:35 localhost nova_compute[227332]: reset Nov 28 04:32:35 localhost nova_compute[227332]: vendor_id Nov 28 04:32:35 localhost nova_compute[227332]: frequencies Nov 28 04:32:35 localhost nova_compute[227332]: reenlightenment Nov 28 04:32:35 localhost nova_compute[227332]: tlbflush Nov 28 04:32:35 localhost nova_compute[227332]: ipi Nov 28 04:32:35 localhost nova_compute[227332]: avic Nov 28 04:32:35 localhost nova_compute[227332]: emsr_bitmap Nov 28 04:32:35 localhost nova_compute[227332]: xmm_input Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: 4095 Nov 28 04:32:35 localhost nova_compute[227332]: on Nov 28 04:32:35 localhost nova_compute[227332]: off Nov 28 04:32:35 localhost nova_compute[227332]: off Nov 28 04:32:35 localhost nova_compute[227332]: Linux KVM Hv Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: tdx Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.240 227336 DEBUG nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: /usr/libexec/qemu-kvm Nov 28 04:32:35 localhost nova_compute[227332]: kvm Nov 28 04:32:35 localhost 
nova_compute[227332]: pc-i440fx-rhel7.6.0 Nov 28 04:32:35 localhost nova_compute[227332]: i686 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: /usr/share/OVMF/OVMF_CODE.secboot.fd Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: rom Nov 28 04:32:35 localhost nova_compute[227332]: pflash Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: yes Nov 28 04:32:35 localhost nova_compute[227332]: no Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: no Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: on Nov 28 04:32:35 localhost nova_compute[227332]: off Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: on Nov 28 04:32:35 localhost nova_compute[227332]: off Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Rome Nov 28 04:32:35 localhost nova_compute[227332]: AMD Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: 486 Nov 28 04:32:35 localhost nova_compute[227332]: 486-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Broadwell Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Broadwell-IBRS Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Broadwell-noTSX Nov 
28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Broadwell-noTSX-IBRS Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Broadwell-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Broadwell-v2 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Broadwell-v3 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Broadwell-v4 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: 
Cascadelake-Server Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Cascadelake-Server-noTSX Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Cascadelake-Server-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 
04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Cascadelake-Server-v2 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Cascadelake-Server-v3 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Cascadelake-Server-v4 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 
localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Cascadelake-Server-v5 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Conroe Nov 28 04:32:35 localhost nova_compute[227332]: Conroe-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Cooperlake Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Cooperlake-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Cooperlake-v2 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Denverton Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 
localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Denverton-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Denverton-v2 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Denverton-v3 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Dhyana Nov 28 04:32:35 localhost nova_compute[227332]: Dhyana-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Dhyana-v2 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Genoa Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 
Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Genoa-v1
Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-IBPB
Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Milan
Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Milan-v1
Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Milan-v2
Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Rome
Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Rome-v1
Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Rome-v2
Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Rome-v3
Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Rome-v4
Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-v1
Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-v2
Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-v3
Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-v4
Nov 28 04:32:35 localhost nova_compute[227332]: GraniteRapids
Nov 28 04:32:35 localhost nova_compute[227332]: GraniteRapids-v1
Nov 28 04:32:35 localhost nova_compute[227332]: GraniteRapids-v2
Nov 28 04:32:35 localhost nova_compute[227332]: Haswell
Nov 28 04:32:35 localhost nova_compute[227332]: Haswell-IBRS
Nov 28 04:32:35 localhost nova_compute[227332]: Haswell-noTSX
Nov 28 04:32:35 localhost nova_compute[227332]: Haswell-noTSX-IBRS
Nov 28 04:32:35 localhost nova_compute[227332]: Haswell-v1
Nov 28 04:32:35 localhost nova_compute[227332]: Haswell-v2
Nov 28 04:32:35 localhost nova_compute[227332]: Haswell-v3
Nov 28 04:32:35 localhost nova_compute[227332]: Haswell-v4
Nov 28 04:32:35 localhost nova_compute[227332]: Icelake-Server
Nov 28 04:32:35 localhost nova_compute[227332]: Icelake-Server-noTSX
Nov 28 04:32:35 localhost nova_compute[227332]: Icelake-Server-v1
Nov 28 04:32:35 localhost nova_compute[227332]: Icelake-Server-v2
Nov 28 04:32:35 localhost nova_compute[227332]: Icelake-Server-v3
Nov 28 04:32:35 localhost nova_compute[227332]: Icelake-Server-v4
Nov 28 04:32:35 localhost nova_compute[227332]: Icelake-Server-v5
Nov 28 04:32:35 localhost nova_compute[227332]: Icelake-Server-v6
Nov 28 04:32:35 localhost nova_compute[227332]: Icelake-Server-v7
Nov 28 04:32:35 localhost nova_compute[227332]: IvyBridge
Nov 28 04:32:35 localhost nova_compute[227332]: IvyBridge-IBRS
Nov 28 04:32:35 localhost nova_compute[227332]: IvyBridge-v1
Nov 28 04:32:35 localhost nova_compute[227332]: IvyBridge-v2
Nov 28 04:32:35 localhost nova_compute[227332]: KnightsMill
Nov 28 04:32:35 localhost nova_compute[227332]: KnightsMill-v1
Nov 28 04:32:35 localhost nova_compute[227332]: Nehalem
Nov 28 04:32:35 localhost nova_compute[227332]: Nehalem-IBRS
Nov 28 04:32:35 localhost nova_compute[227332]: Nehalem-v1
Nov 28 04:32:35 localhost nova_compute[227332]: Nehalem-v2
Nov 28 04:32:35 localhost nova_compute[227332]: Opteron_G1
Nov 28 04:32:35 localhost nova_compute[227332]: Opteron_G1-v1
Nov 28 04:32:35 localhost nova_compute[227332]: Opteron_G2
Nov 28 04:32:35 localhost nova_compute[227332]: Opteron_G2-v1
Nov 28 04:32:35 localhost nova_compute[227332]: Opteron_G3
Nov 28 04:32:35 localhost nova_compute[227332]: Opteron_G3-v1
Nov 28 04:32:35 localhost nova_compute[227332]: Opteron_G4
Nov 28 04:32:35 localhost nova_compute[227332]: Opteron_G4-v1
Nov 28 04:32:35 localhost nova_compute[227332]: Opteron_G5
Nov 28 04:32:35 localhost nova_compute[227332]: Opteron_G5-v1
Nov 28 04:32:35 localhost nova_compute[227332]: Penryn
Nov 28 04:32:35 localhost nova_compute[227332]: Penryn-v1
Nov 28 04:32:35 localhost nova_compute[227332]: SandyBridge
Nov 28 04:32:35 localhost nova_compute[227332]: SandyBridge-IBRS
Nov 28 04:32:35 localhost nova_compute[227332]: SandyBridge-v1
Nov 28 04:32:35 localhost nova_compute[227332]: SandyBridge-v2
Nov 28 04:32:35 localhost nova_compute[227332]: SapphireRapids
Nov 28 04:32:35 localhost nova_compute[227332]: SapphireRapids-v1
Nov 28 04:32:35 localhost nova_compute[227332]: SapphireRapids-v2
Nov 28 04:32:35 localhost nova_compute[227332]: SapphireRapids-v3
Nov 28 04:32:35 localhost nova_compute[227332]: SierraForest
Nov 28 04:32:35 localhost nova_compute[227332]: SierraForest-v1
Nov 28 04:32:35 localhost nova_compute[227332]: Skylake-Client
Nov 28 04:32:35 localhost nova_compute[227332]: Skylake-Client-IBRS
Nov 28 04:32:35 localhost nova_compute[227332]: Skylake-Client-noTSX-IBRS
nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Skylake-Client-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Skylake-Client-v2 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Skylake-Client-v3 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Skylake-Client-v4 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Skylake-Server Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 
localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Skylake-Server-IBRS Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Skylake-Server-noTSX-IBRS Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Skylake-Server-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Skylake-Server-v2 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Skylake-Server-v3 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Skylake-Server-v4 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 
28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Skylake-Server-v5 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Snowridge Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Snowridge-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Snowridge-v2 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Snowridge-v3 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Snowridge-v4 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Westmere Nov 28 04:32:35 localhost nova_compute[227332]: Westmere-IBRS Nov 28 04:32:35 localhost nova_compute[227332]: Westmere-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Westmere-v2 Nov 28 04:32:35 localhost nova_compute[227332]: athlon Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: athlon-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 
localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: core2duo Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: core2duo-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: coreduo Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: coreduo-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: kvm32 Nov 28 04:32:35 localhost nova_compute[227332]: kvm32-v1 Nov 28 04:32:35 localhost nova_compute[227332]: kvm64 Nov 28 04:32:35 localhost nova_compute[227332]: kvm64-v1 Nov 28 04:32:35 localhost nova_compute[227332]: n270 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: n270-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: pentium Nov 28 04:32:35 localhost nova_compute[227332]: pentium-v1 Nov 28 04:32:35 localhost nova_compute[227332]: pentium2 Nov 28 04:32:35 localhost nova_compute[227332]: pentium2-v1 Nov 28 04:32:35 localhost nova_compute[227332]: pentium3 Nov 28 04:32:35 localhost nova_compute[227332]: pentium3-v1 Nov 28 04:32:35 localhost nova_compute[227332]: phenom Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: phenom-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: qemu32 Nov 28 04:32:35 localhost nova_compute[227332]: qemu32-v1 Nov 28 04:32:35 localhost nova_compute[227332]: qemu64 Nov 28 04:32:35 localhost nova_compute[227332]: qemu64-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: file Nov 28 04:32:35 localhost nova_compute[227332]: anonymous Nov 28 04:32:35 localhost nova_compute[227332]: memfd Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: disk Nov 28 04:32:35 localhost nova_compute[227332]: cdrom Nov 28 04:32:35 localhost nova_compute[227332]: floppy Nov 28 04:32:35 localhost nova_compute[227332]: lun Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: ide Nov 28 04:32:35 localhost nova_compute[227332]: fdc Nov 28 04:32:35 localhost nova_compute[227332]: scsi Nov 28 04:32:35 localhost nova_compute[227332]: virtio Nov 28 04:32:35 localhost nova_compute[227332]: usb Nov 28 04:32:35 localhost nova_compute[227332]: sata Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: virtio Nov 28 04:32:35 localhost 
nova_compute[227332]: virtio-transitional Nov 28 04:32:35 localhost nova_compute[227332]: virtio-non-transitional Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: vnc Nov 28 04:32:35 localhost nova_compute[227332]: egl-headless Nov 28 04:32:35 localhost nova_compute[227332]: dbus Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: subsystem Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: default Nov 28 04:32:35 localhost nova_compute[227332]: mandatory Nov 28 04:32:35 localhost nova_compute[227332]: requisite Nov 28 04:32:35 localhost nova_compute[227332]: optional Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: usb Nov 28 04:32:35 localhost nova_compute[227332]: pci Nov 28 04:32:35 localhost nova_compute[227332]: scsi Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: virtio Nov 28 04:32:35 localhost nova_compute[227332]: virtio-transitional Nov 28 04:32:35 localhost nova_compute[227332]: virtio-non-transitional Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: random Nov 28 04:32:35 
localhost nova_compute[227332]: egd Nov 28 04:32:35 localhost nova_compute[227332]: builtin Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: path Nov 28 04:32:35 localhost nova_compute[227332]: handle Nov 28 04:32:35 localhost nova_compute[227332]: virtiofs Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: tpm-tis Nov 28 04:32:35 localhost nova_compute[227332]: tpm-crb Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: emulator Nov 28 04:32:35 localhost nova_compute[227332]: external Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: 2.0 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: usb Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: pty Nov 28 04:32:35 localhost nova_compute[227332]: unix Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: qemu Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 
04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: builtin Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: default Nov 28 04:32:35 localhost nova_compute[227332]: passt Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: isa Nov 28 04:32:35 localhost nova_compute[227332]: hyperv Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: null Nov 28 04:32:35 localhost nova_compute[227332]: vc Nov 28 04:32:35 localhost nova_compute[227332]: pty Nov 28 04:32:35 localhost nova_compute[227332]: dev Nov 28 04:32:35 localhost nova_compute[227332]: file Nov 28 04:32:35 localhost nova_compute[227332]: pipe Nov 28 04:32:35 localhost nova_compute[227332]: stdio Nov 28 04:32:35 localhost nova_compute[227332]: udp Nov 28 04:32:35 localhost nova_compute[227332]: tcp Nov 28 04:32:35 localhost nova_compute[227332]: unix Nov 28 04:32:35 localhost nova_compute[227332]: qemu-vdagent Nov 28 04:32:35 localhost nova_compute[227332]: dbus Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: relaxed Nov 28 04:32:35 localhost nova_compute[227332]: vapic Nov 28 04:32:35 localhost nova_compute[227332]: spinlocks Nov 28 04:32:35 localhost nova_compute[227332]: vpindex Nov 28 04:32:35 localhost nova_compute[227332]: runtime Nov 28 04:32:35 localhost nova_compute[227332]: synic Nov 28 04:32:35 localhost nova_compute[227332]: stimer Nov 28 04:32:35 localhost nova_compute[227332]: reset Nov 28 04:32:35 localhost nova_compute[227332]: vendor_id Nov 28 04:32:35 localhost nova_compute[227332]: frequencies Nov 28 04:32:35 localhost nova_compute[227332]: reenlightenment Nov 28 04:32:35 localhost nova_compute[227332]: tlbflush Nov 28 04:32:35 localhost nova_compute[227332]: ipi Nov 28 04:32:35 localhost nova_compute[227332]: avic Nov 28 04:32:35 localhost nova_compute[227332]: emsr_bitmap Nov 28 04:32:35 localhost nova_compute[227332]: xmm_input Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: 4095 Nov 28 04:32:35 localhost nova_compute[227332]: on Nov 28 04:32:35 localhost nova_compute[227332]: off Nov 28 04:32:35 localhost nova_compute[227332]: off Nov 28 04:32:35 localhost nova_compute[227332]: Linux KVM Hv Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: tdx Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
nova_compute[227332]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.286 227336 DEBUG nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.290 227336 DEBUG nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 28 04:32:35 localhost nova_compute[227332]: /usr/libexec/qemu-kvm
Nov 28 04:32:35 localhost nova_compute[227332]: kvm
Nov 28 04:32:35 localhost nova_compute[227332]: pc-q35-rhel9.8.0
Nov 28 04:32:35 localhost nova_compute[227332]: x86_64
Nov 28 04:32:35 localhost nova_compute[227332]: efi
Nov 28 04:32:35 localhost nova_compute[227332]: /usr/share/edk2/ovmf/OVMF_CODE.secboot.fd
Nov 28 04:32:35 localhost nova_compute[227332]: /usr/share/edk2/ovmf/OVMF_CODE.fd
Nov 28 04:32:35 localhost nova_compute[227332]: /usr/share/edk2/ovmf/OVMF.amdsev.fd
Nov 28 04:32:35 localhost nova_compute[227332]: /usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd
Nov 28 04:32:35 localhost nova_compute[227332]: rom
Nov 28 04:32:35 localhost nova_compute[227332]: pflash
Nov 28 04:32:35 localhost nova_compute[227332]: yes
Nov 28 04:32:35 localhost nova_compute[227332]: no
Nov 28 04:32:35 localhost nova_compute[227332]: yes
Nov 28 04:32:35 localhost nova_compute[227332]: no
Nov 28 04:32:35 localhost nova_compute[227332]: on
Nov 28 04:32:35 localhost nova_compute[227332]: off
Nov 28 04:32:35 localhost nova_compute[227332]: on
Nov 28 04:32:35 localhost nova_compute[227332]: off
Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Rome
Nov 28 04:32:35 localhost nova_compute[227332]: AMD
Nov 28 04:32:35 localhost nova_compute[227332]: 486
Nov 28 04:32:35 localhost nova_compute[227332]: 486-v1
Nov 28 04:32:35 localhost nova_compute[227332]: Broadwell
Nov 28 04:32:35 localhost nova_compute[227332]: Broadwell-IBRS
Nov 28 04:32:35 localhost nova_compute[227332]: Broadwell-noTSX
Nov 28 04:32:35 localhost nova_compute[227332]: Broadwell-noTSX-IBRS
Nov 28 04:32:35 localhost nova_compute[227332]: Broadwell-v1
Nov 28 04:32:35 localhost nova_compute[227332]: Broadwell-v2
Nov 28 04:32:35 localhost nova_compute[227332]: Broadwell-v3
Nov 28 04:32:35 localhost nova_compute[227332]: Broadwell-v4
Nov 28 04:32:35 localhost nova_compute[227332]: Cascadelake-Server
Nov 28 04:32:35 localhost nova_compute[227332]: Cascadelake-Server-noTSX
Nov 28 04:32:35 localhost nova_compute[227332]: Cascadelake-Server-v1
Nov 28 04:32:35 localhost nova_compute[227332]: Cascadelake-Server-v2
Nov 28 04:32:35 localhost
nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Cascadelake-Server-v3 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Cascadelake-Server-v4 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Cascadelake-Server-v5 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Conroe Nov 28 04:32:35 localhost nova_compute[227332]: Conroe-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Cooperlake Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Cooperlake-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 
localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Cooperlake-v2 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Denverton Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Denverton-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Denverton-v2 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Denverton-v3 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Dhyana Nov 28 04:32:35 localhost nova_compute[227332]: Dhyana-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Dhyana-v2 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Genoa Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Genoa-v1 
Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-IBPB Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Milan Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Milan-v1 Nov 28 04:32:35 localhost 
nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Milan-v2 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Rome Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Rome-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Rome-v2 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Rome-v3 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Rome-v4 Nov 28 04:32:35 localhost 
nova_compute[227332]: EPYC-v1 Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-v2 Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-v3 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-v4 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: GraniteRapids Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 
04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: GraniteRapids-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 
localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: GraniteRapids-v2 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Haswell Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: 
Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Haswell-IBRS Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Haswell-noTSX Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Haswell-noTSX-IBRS Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Haswell-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Haswell-v2 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Haswell-v3 
Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Haswell-v4 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Icelake-Server Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Icelake-Server-noTSX Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Icelake-Server-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Icelake-Server-v2 Nov 28 04:32:35 
localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Icelake-Server-v3 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
Nov 28 04:32:35 localhost nova_compute[227332]: [libvirt domain capabilities output; XML markup stripped during log extraction — recoverable values retained below, grouped in original order; element names lost]
Nov 28 04:32:35 localhost nova_compute[227332]: CPU models: Icelake-Server-v4, Icelake-Server-v5, Icelake-Server-v6, Icelake-Server-v7, IvyBridge, IvyBridge-IBRS, IvyBridge-v1, IvyBridge-v2, KnightsMill, KnightsMill-v1, Nehalem, Nehalem-IBRS, Nehalem-v1, Nehalem-v2, Opteron_G1, Opteron_G1-v1, Opteron_G2, Opteron_G2-v1, Opteron_G3, Opteron_G3-v1, Opteron_G4, Opteron_G4-v1, Opteron_G5, Opteron_G5-v1, Penryn, Penryn-v1, SandyBridge, SandyBridge-IBRS, SandyBridge-v1, SandyBridge-v2, SapphireRapids, SapphireRapids-v1, SapphireRapids-v2, SapphireRapids-v3, SierraForest, SierraForest-v1, Skylake-Client, Skylake-Client-IBRS, Skylake-Client-noTSX-IBRS, Skylake-Client-v1, Skylake-Client-v2, Skylake-Client-v3, Skylake-Client-v4, Skylake-Server, Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4, Skylake-Server-v5, Snowridge, Snowridge-v1, Snowridge-v2, Snowridge-v3, Snowridge-v4, Westmere, Westmere-IBRS, Westmere-v1, Westmere-v2, athlon, athlon-v1, core2duo, core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270, n270-v1, pentium, pentium-v1, pentium2, pentium2-v1, pentium3, pentium3-v1, phenom, phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1
Nov 28 04:32:35 localhost nova_compute[227332]: memory backing: file, anonymous, memfd
Nov 28 04:32:35 localhost nova_compute[227332]: disk device types: disk, cdrom, floppy, lun; buses: fdc, scsi, virtio, usb, sata; models: virtio, virtio-transitional, virtio-non-transitional
Nov 28 04:32:35 localhost nova_compute[227332]: graphics types: vnc, egl-headless, dbus
Nov 28 04:32:35 localhost nova_compute[227332]: hostdev mode: subsystem; startup policies: default, mandatory, requisite, optional; subsystem types: usb, pci, scsi
Nov 28 04:32:35 localhost nova_compute[227332]: interface models: virtio, virtio-transitional, virtio-non-transitional
Nov 28 04:32:35 localhost nova_compute[227332]: rng backends: random, egd, builtin
Nov 28 04:32:35 localhost nova_compute[227332]: filesystem drivers: path, handle, virtiofs
Nov 28 04:32:35 localhost nova_compute[227332]: tpm models: tpm-tis, tpm-crb; backends: emulator, external; version: 2.0
Nov 28 04:32:35 localhost nova_compute[227332]: redirdev bus: usb; channel types: pty, unix
Nov 28 04:32:35 localhost nova_compute[227332]: other backends: qemu, builtin; interface backends: default, passt; panic models: isa, hyperv
Nov 28 04:32:35 localhost nova_compute[227332]: console/serial types: null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus
Nov 28 04:32:35 localhost nova_compute[227332]: hyperv features: relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset
localhost nova_compute[227332]: vendor_id Nov 28 04:32:35 localhost nova_compute[227332]: frequencies Nov 28 04:32:35 localhost nova_compute[227332]: reenlightenment Nov 28 04:32:35 localhost nova_compute[227332]: tlbflush Nov 28 04:32:35 localhost nova_compute[227332]: ipi Nov 28 04:32:35 localhost nova_compute[227332]: avic Nov 28 04:32:35 localhost nova_compute[227332]: emsr_bitmap Nov 28 04:32:35 localhost nova_compute[227332]: xmm_input Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: 4095 Nov 28 04:32:35 localhost nova_compute[227332]: on Nov 28 04:32:35 localhost nova_compute[227332]: off Nov 28 04:32:35 localhost nova_compute[227332]: off Nov 28 04:32:35 localhost nova_compute[227332]: Linux KVM Hv Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: tdx Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.344 227336 DEBUG nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: /usr/libexec/qemu-kvm Nov 28 04:32:35 localhost nova_compute[227332]: kvm Nov 28 04:32:35 localhost nova_compute[227332]: pc-i440fx-rhel7.6.0 Nov 28 04:32:35 localhost nova_compute[227332]: x86_64 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 
localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: /usr/share/OVMF/OVMF_CODE.secboot.fd Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: rom Nov 28 04:32:35 localhost nova_compute[227332]: pflash Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: yes Nov 28 04:32:35 localhost nova_compute[227332]: no Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: no Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: on Nov 28 04:32:35 localhost nova_compute[227332]: off Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: on Nov 28 04:32:35 localhost nova_compute[227332]: off Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Rome Nov 28 04:32:35 localhost nova_compute[227332]: AMD Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 
04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: 486 Nov 28 04:32:35 localhost nova_compute[227332]: 486-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Broadwell Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Broadwell-IBRS Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Broadwell-noTSX Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Broadwell-noTSX-IBRS Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Broadwell-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Broadwell-v2 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Broadwell-v3 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Broadwell-v4 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Cascadelake-Server Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 
04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Cascadelake-Server-noTSX Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Cascadelake-Server-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
nova_compute[227332]: Cascadelake-Server-v2 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Cascadelake-Server-v3 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Cascadelake-Server-v4 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Cascadelake-Server-v5 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Conroe Nov 28 04:32:35 localhost nova_compute[227332]: Conroe-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Cooperlake Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Cooperlake-v1 Nov 28 04:32:35 localhost 
nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Cooperlake-v2 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Denverton Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
nova_compute[227332]: Denverton-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Denverton-v2 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Denverton-v3 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Dhyana Nov 28 04:32:35 localhost nova_compute[227332]: Dhyana-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Dhyana-v2 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Genoa Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 
localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Genoa-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
nova_compute[227332]: EPYC-IBPB Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Milan Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Milan-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Milan-v2 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Rome Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Rome-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Rome-v2 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Rome-v3 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-Rome-v4 Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-v1 Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-v2 Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-v3 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: EPYC-v4 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: GraniteRapids Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 
localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: GraniteRapids-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
Nov 28 04:32:35 localhost nova_compute[227332]: [libvirt domain-capabilities output; XML markup lost in log capture. Recoverable CPU model names, in order: GraniteRapids-v2, Haswell, Haswell-IBRS, Haswell-noTSX, Haswell-noTSX-IBRS, Haswell-v1, Haswell-v2, Haswell-v3, Haswell-v4, Icelake-Server, Icelake-Server-noTSX, Icelake-Server-v1, Icelake-Server-v2, Icelake-Server-v3, Icelake-Server-v4, Icelake-Server-v5, Icelake-Server-v6, Icelake-Server-v7, IvyBridge, IvyBridge-IBRS, IvyBridge-v1, IvyBridge-v2, KnightsMill, KnightsMill-v1, Nehalem, Nehalem-IBRS, Nehalem-v1, Nehalem-v2, Opteron_G1, Opteron_G1-v1, Opteron_G2, Opteron_G2-v1, Opteron_G3, Opteron_G3-v1, Opteron_G4, Opteron_G4-v1, Opteron_G5, Opteron_G5-v1, Penryn, Penryn-v1, SandyBridge, SandyBridge-IBRS, SandyBridge-v1, SandyBridge-v2, SapphireRapids, SapphireRapids-v1, SapphireRapids-v2, SapphireRapids-v3, SierraForest, SierraForest-v1, Skylake-Client, Skylake-Client-IBRS, Skylake-Client-noTSX-IBRS, Skylake-Client-v1, Skylake-Client-v2, Skylake-Client-v3, Skylake-Client-v4, Skylake-Server, Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4 (list continues)]
28 04:32:35 localhost nova_compute[227332]: Skylake-Server-v5 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Snowridge Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Snowridge-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Snowridge-v2 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Snowridge-v3 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Snowridge-v4 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Westmere Nov 28 04:32:35 localhost nova_compute[227332]: Westmere-IBRS Nov 28 04:32:35 localhost nova_compute[227332]: Westmere-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Westmere-v2 Nov 28 04:32:35 localhost nova_compute[227332]: athlon Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: athlon-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: core2duo Nov 28 
04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost python3.9[227564]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: core2duo-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: coreduo Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: coreduo-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: kvm32 Nov 28 04:32:35 localhost nova_compute[227332]: kvm32-v1 Nov 28 04:32:35 localhost nova_compute[227332]: kvm64 Nov 28 04:32:35 localhost nova_compute[227332]: kvm64-v1 Nov 28 04:32:35 localhost nova_compute[227332]: n270 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: n270-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: pentium Nov 28 04:32:35 localhost nova_compute[227332]: pentium-v1 Nov 28 04:32:35 localhost nova_compute[227332]: pentium2 Nov 28 04:32:35 localhost nova_compute[227332]: pentium2-v1 Nov 28 04:32:35 localhost nova_compute[227332]: pentium3 Nov 28 04:32:35 localhost nova_compute[227332]: pentium3-v1 Nov 28 04:32:35 localhost 
nova_compute[227332]: phenom Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: phenom-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: qemu32 Nov 28 04:32:35 localhost nova_compute[227332]: qemu32-v1 Nov 28 04:32:35 localhost nova_compute[227332]: qemu64 Nov 28 04:32:35 localhost nova_compute[227332]: qemu64-v1 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: file Nov 28 04:32:35 localhost nova_compute[227332]: anonymous Nov 28 04:32:35 localhost nova_compute[227332]: memfd Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: disk Nov 28 04:32:35 localhost nova_compute[227332]: cdrom Nov 28 04:32:35 localhost nova_compute[227332]: floppy Nov 28 04:32:35 localhost nova_compute[227332]: lun Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: ide Nov 28 04:32:35 localhost nova_compute[227332]: fdc Nov 28 04:32:35 localhost nova_compute[227332]: scsi Nov 28 04:32:35 localhost nova_compute[227332]: virtio Nov 28 04:32:35 localhost nova_compute[227332]: usb Nov 28 04:32:35 localhost nova_compute[227332]: sata Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: virtio Nov 28 04:32:35 localhost nova_compute[227332]: virtio-transitional Nov 28 04:32:35 localhost nova_compute[227332]: virtio-non-transitional Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: vnc Nov 28 04:32:35 localhost nova_compute[227332]: egl-headless Nov 28 04:32:35 localhost nova_compute[227332]: dbus Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: subsystem Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: default Nov 28 04:32:35 localhost nova_compute[227332]: mandatory Nov 28 04:32:35 localhost nova_compute[227332]: requisite Nov 28 04:32:35 localhost nova_compute[227332]: optional Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: usb Nov 28 04:32:35 localhost nova_compute[227332]: pci Nov 28 04:32:35 localhost nova_compute[227332]: scsi Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: virtio Nov 28 04:32:35 localhost nova_compute[227332]: virtio-transitional Nov 28 04:32:35 localhost nova_compute[227332]: virtio-non-transitional Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 
localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: random Nov 28 04:32:35 localhost nova_compute[227332]: egd Nov 28 04:32:35 localhost nova_compute[227332]: builtin Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: path Nov 28 04:32:35 localhost nova_compute[227332]: handle Nov 28 04:32:35 localhost nova_compute[227332]: virtiofs Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: tpm-tis Nov 28 04:32:35 localhost nova_compute[227332]: tpm-crb Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: emulator Nov 28 04:32:35 localhost nova_compute[227332]: external Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: 2.0 Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: usb Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: pty Nov 28 04:32:35 localhost nova_compute[227332]: unix Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 
28 04:32:35 localhost nova_compute[227332]: qemu Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: builtin Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: default Nov 28 04:32:35 localhost nova_compute[227332]: passt Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: isa Nov 28 04:32:35 localhost nova_compute[227332]: hyperv Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: null Nov 28 04:32:35 localhost nova_compute[227332]: vc Nov 28 04:32:35 localhost nova_compute[227332]: pty Nov 28 04:32:35 localhost nova_compute[227332]: dev Nov 28 04:32:35 localhost nova_compute[227332]: file Nov 28 04:32:35 localhost nova_compute[227332]: pipe Nov 28 04:32:35 localhost nova_compute[227332]: stdio Nov 28 04:32:35 localhost nova_compute[227332]: udp Nov 28 04:32:35 localhost nova_compute[227332]: tcp Nov 28 04:32:35 localhost nova_compute[227332]: unix Nov 28 04:32:35 localhost nova_compute[227332]: qemu-vdagent Nov 28 04:32:35 localhost nova_compute[227332]: dbus Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: relaxed Nov 28 04:32:35 localhost nova_compute[227332]: vapic Nov 28 04:32:35 localhost nova_compute[227332]: spinlocks Nov 28 04:32:35 localhost nova_compute[227332]: vpindex Nov 28 04:32:35 localhost nova_compute[227332]: runtime Nov 28 04:32:35 localhost nova_compute[227332]: synic Nov 28 04:32:35 localhost nova_compute[227332]: stimer Nov 28 04:32:35 localhost nova_compute[227332]: reset Nov 28 04:32:35 localhost nova_compute[227332]: vendor_id Nov 28 04:32:35 localhost nova_compute[227332]: frequencies Nov 28 04:32:35 localhost nova_compute[227332]: reenlightenment Nov 28 04:32:35 localhost nova_compute[227332]: tlbflush Nov 28 04:32:35 localhost nova_compute[227332]: ipi Nov 28 04:32:35 localhost nova_compute[227332]: avic Nov 28 04:32:35 localhost nova_compute[227332]: emsr_bitmap Nov 28 04:32:35 localhost nova_compute[227332]: xmm_input Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: 4095 Nov 28 04:32:35 localhost nova_compute[227332]: on Nov 28 04:32:35 localhost nova_compute[227332]: off Nov 28 04:32:35 localhost nova_compute[227332]: off Nov 28 04:32:35 localhost nova_compute[227332]: Linux KVM Hv Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: tdx Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost nova_compute[227332]: Nov 28 04:32:35 localhost 
Nov 28 04:32:35 localhost nova_compute[227332]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.395 227336 DEBUG nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.395 227336 INFO nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Secure Boot support detected#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.397 227336 INFO nova.virt.libvirt.driver [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.398 227336 INFO nova.virt.libvirt.driver [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.418 227336 DEBUG nova.virt.libvirt.driver [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.482 227336 INFO nova.virt.node [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Determined node identity 35fead26-0bad-4950-b646-987079d58a17 from /var/lib/nova/compute_id#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.501 227336 DEBUG
nova.compute.manager [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Verified node 35fead26-0bad-4950-b646-987079d58a17 matches my host np0005538513.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.539 227336 DEBUG nova.compute.manager [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.543 227336 DEBUG nova.virt.libvirt.vif [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T08:32:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=,hidden=False,host='np0005538513.localdomain',hostname='test',id=2,image_ref='391767f1-35f2-4b68-ae15-e0b29db66dcb',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-11-28T08:33:06Z,launched_on='np0005538513.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=,node='np0005538513.localdomain',numa_topology=None,old_flavor=,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='9dda653c53224db086060962b0702694',ramdisk_id='',reservation_id='r-a3c307c0',resources=,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata=,tags=,task_state=None,terminated_at=None,trusted_certs=,updated_at=2025-11-28T08:33:
07Z,user_data=None,user_id='4d9169247d4447d0a8dd4c33f8b23dee',uuid=c2f0c7d6-df5f-4541-8b2c-bc1eaf805812,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.543 227336 DEBUG nova.network.os_vif_util [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Converting VIF {"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": 
"192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.544 227336 DEBUG nova.network.os_vif_util [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.545 227336 DEBUG os_vif [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.612 227336 DEBUG ovsdbapp.backend.ovs_idl [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Created schema index Interface.name autocreate_indices 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.612 227336 DEBUG ovsdbapp.backend.ovs_idl [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.612 227336 DEBUG ovsdbapp.backend.ovs_idl [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.612 227336 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.613 227336 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.613 227336 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.613 227336 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.615 227336 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.618 227336 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.633 227336 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.633 227336 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.634 227336 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 28 04:32:35 localhost nova_compute[227332]: 2025-11-28 09:32:35.641 227336 INFO oslo.privsep.daemon [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp70qswwnh/privsep.sock']#033[00m Nov 28 04:32:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60890 DF PROTO=TCP SPT=53580 DPT=9102 SEQ=1058156247 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1D59820000000001030307) Nov 28 04:32:36 
localhost nova_compute[227332]: 2025-11-28 09:32:36.251 227336 INFO oslo.privsep.daemon [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Nov 28 04:32:36 localhost nova_compute[227332]: 2025-11-28 09:32:36.146 227849 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Nov 28 04:32:36 localhost nova_compute[227332]: 2025-11-28 09:32:36.149 227849 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Nov 28 04:32:36 localhost nova_compute[227332]: 2025-11-28 09:32:36.152 227849 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m Nov 28 04:32:36 localhost nova_compute[227332]: 2025-11-28 09:32:36.152 227849 INFO oslo.privsep.daemon [-] privsep daemon running as pid 227849#033[00m Nov 28 04:32:36 localhost nova_compute[227332]: 2025-11-28 09:32:36.535 227336 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:32:36 localhost nova_compute[227332]: 2025-11-28 09:32:36.536 227336 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09612b07-51, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 04:32:36 localhost nova_compute[227332]: 2025-11-28 09:32:36.536 227336 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap09612b07-51, col_values=(('external_ids', {'iface-id': '09612b07-5142-4b0f-9dab-74bf4403f69f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f4:fc:6c', 'vm-uuid': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 04:32:36 localhost nova_compute[227332]: 
2025-11-28 09:32:36.537 227336 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 28 04:32:36 localhost nova_compute[227332]: 2025-11-28 09:32:36.538 227336 INFO os_vif [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51')#033[00m Nov 28 04:32:36 localhost nova_compute[227332]: 2025-11-28 09:32:36.538 227336 DEBUG nova.compute.manager [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 28 04:32:36 localhost nova_compute[227332]: 2025-11-28 09:32:36.542 227336 DEBUG nova.compute.manager [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Current state is 1, state in DB is 1. 
_init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304#033[00m Nov 28 04:32:36 localhost nova_compute[227332]: 2025-11-28 09:32:36.542 227336 INFO nova.compute.manager [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m Nov 28 04:32:36 localhost nova_compute[227332]: 2025-11-28 09:32:36.905 227336 INFO nova.service [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Updating service version for nova-compute on np0005538513.localdomain from 57 to 66#033[00m Nov 28 04:32:36 localhost nova_compute[227332]: 2025-11-28 09:32:36.933 227336 DEBUG oslo_concurrency.lockutils [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:32:36 localhost nova_compute[227332]: 2025-11-28 09:32:36.933 227336 DEBUG oslo_concurrency.lockutils [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:32:36 localhost nova_compute[227332]: 2025-11-28 09:32:36.934 227336 DEBUG oslo_concurrency.lockutils [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:32:36 localhost nova_compute[227332]: 2025-11-28 09:32:36.934 227336 DEBUG nova.compute.resource_tracker [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: 
np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 04:32:36 localhost nova_compute[227332]: 2025-11-28 09:32:36.935 227336 DEBUG oslo_concurrency.processutils [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:32:37 localhost nova_compute[227332]: 2025-11-28 09:32:37.389 227336 DEBUG oslo_concurrency.processutils [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:32:37 localhost nova_compute[227332]: 2025-11-28 09:32:37.455 227336 DEBUG nova.virt.libvirt.driver [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:32:37 localhost nova_compute[227332]: 2025-11-28 09:32:37.455 227336 DEBUG nova.virt.libvirt.driver [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:32:37 localhost systemd[1]: Started libvirt nodedev daemon. Nov 28 04:32:37 localhost nova_compute[227332]: 2025-11-28 09:32:37.836 227336 WARNING nova.virt.libvirt.driver [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 04:32:37 localhost nova_compute[227332]: 2025-11-28 09:32:37.837 227336 DEBUG nova.compute.resource_tracker [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=12907MB free_disk=41.837093353271484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", 
"product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 04:32:37 localhost nova_compute[227332]: 2025-11-28 09:32:37.838 227336 DEBUG oslo_concurrency.lockutils [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:32:37 localhost nova_compute[227332]: 2025-11-28 09:32:37.838 227336 DEBUG oslo_concurrency.lockutils [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:32:37 localhost nova_compute[227332]: 2025-11-28 09:32:37.974 227336 DEBUG nova.compute.resource_tracker [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 04:32:37 localhost nova_compute[227332]: 2025-11-28 09:32:37.975 227336 DEBUG nova.compute.resource_tracker [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 04:32:37 localhost nova_compute[227332]: 2025-11-28 09:32:37.975 227336 DEBUG nova.compute.resource_tracker [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 04:32:37 localhost nova_compute[227332]: 2025-11-28 09:32:37.990 227336 DEBUG nova.scheduler.client.report [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Refreshing inventories for resource provider 35fead26-0bad-4950-b646-987079d58a17 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Nov 28 04:32:38 localhost nova_compute[227332]: 2025-11-28 09:32:38.046 227336 DEBUG nova.scheduler.client.report [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Updating ProviderTree inventory for provider 35fead26-0bad-4950-b646-987079d58a17 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Nov 28 
04:32:38 localhost nova_compute[227332]: 2025-11-28 09:32:38.047 227336 DEBUG nova.compute.provider_tree [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Updating inventory in ProviderTree for provider 35fead26-0bad-4950-b646-987079d58a17 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 28 04:32:38 localhost nova_compute[227332]: 2025-11-28 09:32:38.060 227336 DEBUG nova.scheduler.client.report [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Refreshing aggregate associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Nov 28 04:32:38 localhost nova_compute[227332]: 2025-11-28 09:32:38.083 227336 DEBUG nova.scheduler.client.report [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Refreshing trait associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, traits: 
COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE,HW_CPU_X86_FMA3,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_ACCELERATORS,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_CLMUL,HW_CPU_X86_SVM,HW_CPU_X86_AESNI,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_BMI,HW_CPU_X86_AVX,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_LAN9118,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_ABM,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,HW_CPU_X86_F16C,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_AVX2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SATA _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Nov 28 04:32:38 localhost nova_compute[227332]: 2025-11-28 09:32:38.112 227336 DEBUG oslo_concurrency.processutils [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:32:38 localhost nova_compute[227332]: 2025-11-28 09:32:38.581 227336 DEBUG oslo_concurrency.processutils [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 
in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:32:38 localhost nova_compute[227332]: 2025-11-28 09:32:38.589 227336 DEBUG nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N Nov 28 04:32:38 localhost nova_compute[227332]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m Nov 28 04:32:38 localhost nova_compute[227332]: 2025-11-28 09:32:38.590 227336 INFO nova.virt.libvirt.host [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] kernel doesn't support AMD SEV#033[00m Nov 28 04:32:38 localhost nova_compute[227332]: 2025-11-28 09:32:38.592 227336 DEBUG nova.compute.provider_tree [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Updating inventory in ProviderTree for provider 35fead26-0bad-4950-b646-987079d58a17 with inventory: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 28 04:32:38 localhost nova_compute[227332]: 2025-11-28 09:32:38.592 227336 DEBUG nova.virt.libvirt.driver [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Nov 28 04:32:38 localhost nova_compute[227332]: 2025-11-28 09:32:38.656 227336 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:32:38 localhost nova_compute[227332]: 2025-11-28 09:32:38.687 
227336 DEBUG nova.scheduler.client.report [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Updated inventory for provider 35fead26-0bad-4950-b646-987079d58a17 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m Nov 28 04:32:38 localhost nova_compute[227332]: 2025-11-28 09:32:38.688 227336 DEBUG nova.compute.provider_tree [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Updating resource provider 35fead26-0bad-4950-b646-987079d58a17 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m Nov 28 04:32:38 localhost nova_compute[227332]: 2025-11-28 09:32:38.688 227336 DEBUG nova.compute.provider_tree [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Updating inventory in ProviderTree for provider 35fead26-0bad-4950-b646-987079d58a17 with inventory: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 28 04:32:38 localhost nova_compute[227332]: 2025-11-28 09:32:38.783 227336 DEBUG nova.compute.provider_tree [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Updating resource provider 
35fead26-0bad-4950-b646-987079d58a17 generation from 4 to 5 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m Nov 28 04:32:38 localhost nova_compute[227332]: 2025-11-28 09:32:38.816 227336 DEBUG nova.compute.resource_tracker [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 04:32:38 localhost nova_compute[227332]: 2025-11-28 09:32:38.817 227336 DEBUG oslo_concurrency.lockutils [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.979s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:32:38 localhost nova_compute[227332]: 2025-11-28 09:32:38.817 227336 DEBUG nova.service [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m Nov 28 04:32:38 localhost nova_compute[227332]: 2025-11-28 09:32:38.875 227336 DEBUG nova.service [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m Nov 28 04:32:38 localhost nova_compute[227332]: 2025-11-28 09:32:38.876 227336 DEBUG nova.servicegroup.drivers.db [None req-59a0c5c5-dabe-4028-af4c-f51c8a58ab47 - - - - - -] DB_Driver: join new ServiceGroup member np0005538513.localdomain to the compute group, service = join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m Nov 28 04:32:39 localhost python3.9[228005]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False 
get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:32:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61558 DF PROTO=TCP SPT=50872 DPT=9100 SEQ=1856325133 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1D65820000000001030307) Nov 28 04:32:40 localhost python3.9[228137]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None 
label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Nov 28 04:32:40 localhost systemd-journald[47227]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 115.0 (383 of 333 items), suggesting rotation. Nov 28 04:32:40 localhost systemd-journald[47227]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating. Nov 28 04:32:40 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 28 04:32:40 localhost rsyslogd[759]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 28 04:32:40 localhost nova_compute[227332]: 2025-11-28 09:32:40.648 227336 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:32:41 localhost python3.9[228270]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 28 04:32:41 localhost systemd[1]: Stopping nova_compute container... Nov 28 04:32:41 localhost nova_compute[227332]: 2025-11-28 09:32:41.639 227336 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170#033[00m Nov 28 04:32:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41358 DF PROTO=TCP SPT=57320 DPT=9100 SEQ=1150725942 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1D72020000000001030307) Nov 28 04:32:42 localhost nova_compute[227332]: 2025-11-28 09:32:42.994 227336 WARNING amqp [-] Received method (60, 30) during closing channel 1. 
This method will be ignored#033[00m Nov 28 04:32:42 localhost nova_compute[227332]: 2025-11-28 09:32:42.996 227336 DEBUG oslo_concurrency.lockutils [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 04:32:42 localhost nova_compute[227332]: 2025-11-28 09:32:42.997 227336 DEBUG oslo_concurrency.lockutils [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 04:32:42 localhost nova_compute[227332]: 2025-11-28 09:32:42.997 227336 DEBUG oslo_concurrency.lockutils [None req-8a54bd26-1e78-4abd-9aab-65e719b271e0 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 04:32:43 localhost journal[201490]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, ) Nov 28 04:32:43 localhost journal[201490]: hostname: np0005538513.localdomain Nov 28 04:32:43 localhost journal[201490]: End of file while reading data: Input/output error Nov 28 04:32:43 localhost systemd[1]: libpod-11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf.scope: Deactivated successfully. Nov 28 04:32:43 localhost systemd[1]: libpod-11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf.scope: Consumed 4.822s CPU time. 
Nov 28 04:32:43 localhost podman[228274]: 2025-11-28 09:32:43.415610696 +0000 UTC m=+1.856776621 container died 11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_managed=true) Nov 28 04:32:43 localhost systemd[1]: tmp-crun.KudeZ4.mount: Deactivated successfully. Nov 28 04:32:43 localhost systemd[1]: tmp-crun.86Ixpm.mount: Deactivated successfully. 
Nov 28 04:32:43 localhost podman[228274]: 2025-11-28 09:32:43.483977036 +0000 UTC m=+1.925142941 container cleanup 11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}) Nov 28 04:32:43 localhost podman[228274]: nova_compute Nov 28 04:32:43 localhost podman[228315]: error opening file `/run/crun/11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf/status`: No such file or directory Nov 28 04:32:43 localhost podman[228304]: 2025-11-28 09:32:43.590401593 +0000 UTC m=+0.070891881 container cleanup 
11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true) Nov 28 04:32:43 localhost podman[228304]: nova_compute Nov 28 04:32:43 localhost systemd[1]: edpm_nova_compute.service: Deactivated successfully. Nov 28 04:32:43 localhost systemd[1]: Stopped nova_compute container. Nov 28 04:32:43 localhost systemd[1]: Starting nova_compute container... Nov 28 04:32:43 localhost systemd[1]: Started libcrun container. 
Nov 28 04:32:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Nov 28 04:32:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c/merged/etc/nvme supports timestamps until 2038 (0x7fffffff) Nov 28 04:32:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 04:32:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Nov 28 04:32:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 28 04:32:43 localhost podman[228317]: 2025-11-28 09:32:43.743165199 +0000 UTC m=+0.116657572 container init 11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', 
'/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 04:32:43 localhost podman[228317]: 2025-11-28 09:32:43.751888105 +0000 UTC m=+0.125380478 container start 11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 28 04:32:43 localhost podman[228317]: nova_compute Nov 28 04:32:43 localhost nova_compute[228333]: + sudo -E kolla_set_configs Nov 28 04:32:43 localhost systemd[1]: Started nova_compute container. Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Validating config file Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Copying service configuration files Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Deleting /etc/nova/nova.conf Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Setting permission for /etc/nova/nova.conf Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Copying 
/var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Deleting /etc/ceph Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Creating directory /etc/ceph Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Setting permission for /etc/ceph Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Setting permission for 
/etc/ceph/ceph.conf Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Deleting /var/lib/nova/.ssh/config Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Deleting /usr/sbin/iscsiadm Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Writing out command to execute Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Nov 28 04:32:43 localhost nova_compute[228333]: INFO:__main__:Setting permission for 
/var/lib/nova/.ssh/config Nov 28 04:32:43 localhost nova_compute[228333]: ++ cat /run_command Nov 28 04:32:43 localhost nova_compute[228333]: + CMD=nova-compute Nov 28 04:32:43 localhost nova_compute[228333]: + ARGS= Nov 28 04:32:43 localhost nova_compute[228333]: + sudo kolla_copy_cacerts Nov 28 04:32:43 localhost nova_compute[228333]: + [[ ! -n '' ]] Nov 28 04:32:43 localhost nova_compute[228333]: + . kolla_extend_start Nov 28 04:32:43 localhost nova_compute[228333]: Running command: 'nova-compute' Nov 28 04:32:43 localhost nova_compute[228333]: + echo 'Running command: '\''nova-compute'\''' Nov 28 04:32:43 localhost nova_compute[228333]: + umask 0022 Nov 28 04:32:43 localhost nova_compute[228333]: + exec nova-compute Nov 28 04:32:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 04:32:44 localhost podman[228362]: 2025-11-28 09:32:44.350911091 +0000 UTC m=+0.085848544 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 04:32:44 localhost podman[228362]: 2025-11-28 09:32:44.386878922 +0000 UTC m=+0.121816405 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible) Nov 28 04:32:44 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. Nov 28 04:32:45 localhost nova_compute[228333]: 2025-11-28 09:32:45.536 228337 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Nov 28 04:32:45 localhost nova_compute[228333]: 2025-11-28 09:32:45.536 228337 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Nov 28 04:32:45 localhost nova_compute[228333]: 2025-11-28 09:32:45.536 228337 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Nov 28 04:32:45 localhost nova_compute[228333]: 2025-11-28 09:32:45.536 228337 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m Nov 28 04:32:45 localhost nova_compute[228333]: 2025-11-28 09:32:45.650 228337 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:32:45 localhost nova_compute[228333]: 2025-11-28 09:32:45.672 228337 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.022s execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:32:45 localhost nova_compute[228333]: 2025-11-28 09:32:45.672 228337 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.102 228337 INFO nova.virt.driver [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.215 228337 INFO nova.compute.provider_config [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.223 228337 WARNING nova.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.223 228337 DEBUG oslo_concurrency.lockutils [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.223 228337 DEBUG oslo_concurrency.lockutils [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 04:32:46 localhost 
nova_compute[228333]: 2025-11-28 09:32:46.223 228337 DEBUG oslo_concurrency.lockutils [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.224 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.224 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.224 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.224 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.224 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.224 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ================================================================================ log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.225 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.225 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] arq_binding_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.225 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] backdoor_port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.225 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] backdoor_socket = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.225 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.225 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.225 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cert = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 
04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.225 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.226 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.226 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] config_dir = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.226 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.226 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.226 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.226 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] console_host = np0005538513.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost 
nova_compute[228333]: 2025-11-28 09:32:46.226 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] control_exchange = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.227 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cpu_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.227 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.227 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.227 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.227 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.227 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] default_ephemeral_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.227 228337 DEBUG oslo_service.service [None 
req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.227 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] default_schedule_zone = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.228 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] disk_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.228 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] enable_new_services = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.228 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.228 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] enabled_ssl_apis = 
[] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.228 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] flat_injected = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.228 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] force_config_drive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.228 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] force_raw_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.229 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.229 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.229 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] host = np0005538513.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.229 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] initial_cpu_allocation_ratio = 4.0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.229 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] initial_disk_allocation_ratio = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.229 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] initial_ram_allocation_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.229 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] injected_network_template = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.230 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.230 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.230 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.230 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] instance_name_template = 
instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.230 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] instance_usage_audit = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.230 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] instance_usage_audit_period = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.230 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.231 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.231 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.231 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.231 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] live_migration_retry_count = 30 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.231 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.231 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.231 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.231 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.232 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.232 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.232 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost 
nova_compute[228333]: 2025-11-28 09:32:46.232 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] log_rotation_type = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.232 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.232 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.232 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.232 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.233 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.233 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.233 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.233 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.233 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.233 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.233 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] max_logfile_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.234 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] max_logfile_size_mb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 
localhost nova_compute[228333]: 2025-11-28 09:32:46.234 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.234 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.234 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] metadata_listen_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.234 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] metadata_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.234 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] migrate_max_retries = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.234 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] mkisofs_cmd = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.234 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] my_block_storage_ip = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.235 228337 DEBUG 
oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] my_ip = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.235 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] network_allocate_retries = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.235 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.235 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.235 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] osapi_compute_listen_port = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.235 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.235 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] osapi_compute_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.236 228337 DEBUG oslo_service.service 
[None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] password_length = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.236 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] periodic_enable = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.236 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.236 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] pointer_model = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.236 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] preallocate_images = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.236 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.237 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] pybasedir = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.237 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ram_allocation_ratio = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.237 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.237 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.237 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.237 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.237 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.237 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] record = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.238 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] reimage_timeout_per_gb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost 
nova_compute[228333]: 2025-11-28 09:32:46.238 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] report_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.238 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.238 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] reserved_host_cpus = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.238 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.238 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] reserved_host_memory_mb = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.238 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.238 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] resize_confirm_window = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.239 228337 DEBUG oslo_service.service [None 
req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.239 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.239 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.239 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] rpc_response_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.239 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.239 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.239 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.239 228337 DEBUG oslo_service.service [None 
req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.240 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.240 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] service_down_time = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.240 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.240 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.240 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.240 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.240 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] source_is_ipv6 = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.241 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ssl_only = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.241 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] state_path = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.241 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] sync_power_state_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.241 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.241 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.241 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] tempdir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.241 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 
localhost nova_compute[228333]: 2025-11-28 09:32:46.241 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.242 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.242 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] use_cow_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.242 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.242 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.242 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.242 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] use_rootwrap_daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.242 228337 DEBUG oslo_service.service [None 
req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.242 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.243 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vcpu_pin_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.243 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.243 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.243 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.243 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.243 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] watch_log_file = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.243 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.243 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.244 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_concurrency.lock_path = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.244 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.244 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.244 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.244 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] 
oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.244 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.244 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.245 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.245 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.245 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.dhcp_domain = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.245 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.enable_instance_password = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.245 228337 
DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.245 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.245 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.246 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.246 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.246 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.246 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.max_limit = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.246 228337 
DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.246 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.246 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.use_forwarded_for = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.246 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.247 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.247 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.247 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.247 228337 DEBUG 
oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.247 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.247 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.247 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.248 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.backend = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.248 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.248 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.config_prefix = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.248 228337 DEBUG oslo_service.service [None 
req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.dead_timeout = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.248 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.248 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.enable_retry_client = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.248 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.enable_socket_keepalive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.249 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.249 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.249 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.249 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] 
cache.hashclient_retry_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.249 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.249 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.memcache_password = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.249 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.249 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.250 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.250 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.250 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] 
cache.memcache_sasl_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.250 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.memcache_servers = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.250 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.memcache_socket_timeout = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.250 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.memcache_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.250 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.251 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.251 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.retry_delay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.251 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.socket_keepalive_count = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.251 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.socket_keepalive_idle = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.251 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.251 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.251 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.251 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.252 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.252 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 
04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.252 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.252 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cinder.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.252 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.252 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cinder.catalog_info = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.252 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.253 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.253 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.253 
228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cinder.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.253 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.253 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.253 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cinder.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.253 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.253 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cinder.os_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.254 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cinder.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.254 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] 
cinder.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.254 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.254 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] compute.cpu_dedicated_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.254 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] compute.cpu_shared_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.254 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.254 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.254 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.255 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] 
compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.255 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.255 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.255 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.255 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.255 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.255 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] conductor.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.256 228337 DEBUG 
oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.256 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.256 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] console.ssl_minimum_version = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.256 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.256 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.256 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.256 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.257 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] 
cyborg.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.257 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.257 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.257 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.257 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.257 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.257 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.257 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.region_name = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.258 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.258 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.258 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.258 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.258 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.258 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.258 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.258 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] cyborg.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.259 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.259 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.259 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.259 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.259 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.259 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.connection_trace = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.259 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.260 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.260 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.260 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.260 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.260 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.260 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.max_retries = 10 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.260 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.261 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.261 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.261 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.261 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.261 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.261 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] database.sqlite_synchronous = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.261 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.261 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.262 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.262 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.262 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.262 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.262 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.262 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.262 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.263 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.263 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.263 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.263 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.263 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.mysql_enable_ndb = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.263 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.263 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.263 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.264 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.264 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.264 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.264 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] devices.enabled_mdev_types = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.264 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.264 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.264 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.264 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.api_servers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.265 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.265 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.265 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.collect_timing = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.265 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.265 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.265 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.265 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.266 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.266 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.266 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.endpoint_override = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.266 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.266 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.266 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.266 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.267 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.267 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.267 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost 
nova_compute[228333]: 2025-11-28 09:32:46.267 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.267 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.267 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.267 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.267 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.service_type = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.268 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.268 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.268 228337 DEBUG oslo_service.service [None 
req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.268 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.268 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.268 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.268 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] glance.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.269 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.269 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.269 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] 
hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.269 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.269 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.269 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.enable_remotefx = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.269 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.instances_path_share = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.269 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.270 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.270 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] 
hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.270 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.270 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.270 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.270 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.270 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.use_multipath_io = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.271 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.271 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - 
-] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.271 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.271 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.271 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] mks.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.271 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.271 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.272 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.272 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] 
image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.272 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.272 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.272 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.272 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.api_max_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.272 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.273 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.273 228337 DEBUG oslo_service.service [None 
req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.273 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.273 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.273 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.273 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.273 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.273 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.274 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.insecure = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.274 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.274 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.274 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.274 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.274 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.peer_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.274 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.274 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 
04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.275 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.275 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.service_type = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.275 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.275 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.275 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.275 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.275 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 
2025-11-28 09:32:46.276 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.276 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] key_manager.backend = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.276 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] key_manager.fixed_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.276 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.auth_endpoint = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.276 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.276 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.barbican_endpoint = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.276 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.277 
228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.barbican_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.277 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.277 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.277 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.277 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.277 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.277 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.277 228337 DEBUG oslo_service.service [None 
req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.retry_delay = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.278 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.278 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.278 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.278 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.verify_ssl = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.278 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.278 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.278 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] 
barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.278 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican_service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.279 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.279 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.279 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.279 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican_service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.279 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.279 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] 
barbican_service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.279 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.approle_role_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.280 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.approle_secret_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.280 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.280 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.280 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.280 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.280 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.keyfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.280 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.280 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.281 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.281 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.root_token_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.281 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.281 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.281 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost 
nova_compute[228333]: 2025-11-28 09:32:46.281 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.281 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vault.vault_url = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.281 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.282 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.282 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.282 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.282 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.282 228337 DEBUG 
oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.282 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.282 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.283 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.283 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.283 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.283 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.283 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] 
keystone.service_type = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.283 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.283 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.283 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.284 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.284 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.284 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] keystone.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.284 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.connection_uri = 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.284 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.cpu_mode = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.284 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.284 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.285 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.285 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.285 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.cpu_power_management = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.285 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.285 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.285 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.device_detach_timeout = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.285 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.disk_cachemodes = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.285 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.disk_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.286 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.286 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.file_backed_memory = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.286 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.gid_maps = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.286 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.hw_disk_discard = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.286 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.hw_machine_type = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.286 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.images_rbd_ceph_conf = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.286 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.287 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.287 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.287 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] 
libvirt.images_rbd_pool = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.287 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.images_type = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.287 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.images_volume_group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.287 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.287 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.inject_partition = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.287 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.288 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.iscsi_iface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.288 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.iser_use_multipath = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.288 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.288 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.288 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.288 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.288 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.289 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.289 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] 
libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.289 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.289 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.289 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.289 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.289 228337 WARNING oslo_config.cfg [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( Nov 28 04:32:46 localhost nova_compute[228333]: live_migration_uri is deprecated for removal in favor of two other options that Nov 28 04:32:46 localhost nova_compute[228333]: allow to change live migration scheme and target URI: ``live_migration_scheme`` Nov 28 04:32:46 localhost nova_compute[228333]: and ``live_migration_inbound_addr`` respectively. Nov 28 04:32:46 localhost nova_compute[228333]: ). 
Its value may be silently ignored in the future.#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.290 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.live_migration_uri = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.290 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.290 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.290 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.290 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.290 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.290 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] 
libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.291 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.291 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.291 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.291 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.num_pcie_ports = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.291 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.291 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.pmem_namespaces = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.291 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.quobyte_client_cfg = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.292 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.292 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.292 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.292 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.292 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.rbd_secret_uuid = 2c5417c9-00eb-57d5-a565-ddecbc7995c1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.292 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.rbd_user = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.292 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] 
libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.293 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.293 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.293 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.rescue_kernel_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.293 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.rescue_ramdisk_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.293 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.293 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.rx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.293 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.smbfs_mount_options = 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.293 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.294 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.294 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.294 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.294 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.294 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.swtpm_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.294 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.swtpm_group 
= tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.294 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.295 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.295 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.tx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.295 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.uid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.295 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.295 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.295 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.volume_clear = zero log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.295 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.volume_clear_size = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.296 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.volume_use_multipath = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.296 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.296 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.296 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.296 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.296 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.vzstorage_mount_perms = 0770 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.296 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.296 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.297 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.297 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.297 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.297 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.297 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.certfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.297 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.297 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.298 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.298 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.default_floating_pool = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.298 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.298 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.298 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.http_retries = 3 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.298 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.298 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.298 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.299 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.299 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.299 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.299 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 
localhost nova_compute[228333]: 2025-11-28 09:32:46.299 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.299 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.299 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.299 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.300 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.300 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.300 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 
09:32:46.300 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.300 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.300 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] neutron.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.300 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.301 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] notifications.default_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.301 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.301 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.301 228337 
DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.301 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] pci.alias = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.301 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] pci.device_spec = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.301 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] pci.report_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.301 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.302 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.302 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.302 228337 DEBUG 
oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.302 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.302 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.302 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.302 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.303 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.303 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.303 228337 DEBUG oslo_service.service [None 
req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.303 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.303 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.303 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.303 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.303 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.304 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.304 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.password = 
**** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.304 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.304 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.304 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.304 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.304 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.305 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.305 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.service_type = placement log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.305 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.305 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.305 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.305 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.305 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.305 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.306 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.user_domain_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.306 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.306 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.306 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.306 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.306 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] placement.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.306 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] quota.cores = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.306 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.307 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] quota.driver = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.307 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.307 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.307 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] quota.injected_files = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.307 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] quota.instances = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.307 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] quota.key_pairs = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.307 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] quota.metadata_items = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 
09:32:46.308 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] quota.ram = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.308 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] quota.recheck_quota = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.308 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] quota.server_group_members = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.308 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] quota.server_groups = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.308 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] rdp.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.308 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.308 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.309 228337 DEBUG oslo_service.service [None 
req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.309 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.309 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.309 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] scheduler.max_attempts = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.309 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.309 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.309 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 
2025-11-28 09:32:46.309 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.310 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.310 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] scheduler.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.310 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.310 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.310 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.310 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.310 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.311 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.311 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.311 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.311 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.311 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.311 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.311 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.312 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.312 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.312 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] 
filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.312 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.312 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.312 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.312 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.312 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.313 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.313 228337 DEBUG 
oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.313 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.313 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.313 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] metrics.required = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.313 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.313 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.313 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] metrics.weight_setting = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 
2025-11-28 09:32:46.314 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] serial_console.base_url = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.314 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.314 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.314 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.314 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.314 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.315 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost 
nova_compute[228333]: 2025-11-28 09:32:46.315 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] service_user.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.315 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.315 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.315 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.315 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.315 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.315 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.316 
228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.316 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.316 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.316 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] spice.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.316 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.316 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] spice.html5proxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.316 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] spice.html5proxy_port = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.317 228337 DEBUG oslo_service.service 
[None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] spice.image_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.317 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] spice.jpeg_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.317 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] spice.playback_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.317 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.317 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.317 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] spice.streaming_mode = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.317 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] spice.zlib_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.317 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 
- - - - - -] upgrade_levels.baseapi = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.318 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] upgrade_levels.cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.318 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] upgrade_levels.compute = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.318 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.318 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.318 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.318 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.318 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] 
vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.319 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.319 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.319 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.319 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.319 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.319 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.319 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 
- - - - - -] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.319 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.320 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.320 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.320 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.320 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.console_delay_seconds = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.320 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.datastore_regex = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.320 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.host_ip = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.320 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.320 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.321 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.321 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.321 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.321 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.maximum_objects = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.321 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.pbm_default_policy = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 
04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.321 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.321 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.321 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.serial_log_dir = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.322 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.322 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.322 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.322 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 
2025-11-28 09:32:46.322 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.322 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.vnc_port = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.322 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.323 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.323 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.323 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vnc.novncproxy_base_url = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.323 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 
09:32:46.323 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vnc.novncproxy_port = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.323 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vnc.server_listen = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.323 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vnc.server_proxyclient_address = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.324 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.324 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.324 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.324 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.324 228337 DEBUG 
oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.324 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.324 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.325 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.325 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.325 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.325 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 
localhost nova_compute[228333]: 2025-11-28 09:32:46.325 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.325 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.325 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.325 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.326 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.326 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.326 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.326 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.326 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.326 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.326 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.326 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.327 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.327 228337 DEBUG oslo_service.service [None 
req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] wsgi.api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.327 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.327 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.327 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.327 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.327 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.328 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.328 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] 
wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.328 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.328 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] wsgi.tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.328 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.328 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.328 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.329 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.329 228337 DEBUG oslo_service.service [None 
req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.329 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.329 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_policy.enforce_scope = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.329 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.329 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.329 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.329 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.330 228337 DEBUG 
oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.330 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.330 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.330 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.330 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.330 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.330 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost 
nova_compute[228333]: 2025-11-28 09:32:46.331 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] remote_debug.port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.331 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.331 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.331 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.331 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.331 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.331 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.331 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.332 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.332 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.332 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.332 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.332 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.332 228337 DEBUG oslo_service.service [None 
req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.332 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.332 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.333 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.333 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.333 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.333 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost 
nova_compute[228333]: 2025-11-28 09:32:46.333 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.333 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.333 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.334 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.334 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.334 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.334 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.334 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.334 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.334 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.334 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.335 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.335 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.335 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_notifications.retry = -1 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.335 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.335 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.335 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.335 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.336 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.336 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.336 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] 
oslo_limit.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.336 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.336 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.336 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.336 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.336 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.337 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.337 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.domain_name = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.337 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.endpoint_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.337 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.337 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.337 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.337 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.337 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.338 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.password = **** log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.338 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.338 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.338 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.338 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.project_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.338 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.338 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.339 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.service_type = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.339 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.339 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.339 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.339 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.system_scope = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.339 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.339 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.339 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.user_domain_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.340 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.340 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.340 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.340 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.340 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_limit.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.340 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.340 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.340 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] oslo_reports.log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.341 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.341 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.341 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.341 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.341 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.341 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - 
- - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.341 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.341 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vif_plug_ovs_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.342 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.342 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.342 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.342 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] vif_plug_ovs_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.342 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.342 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.342 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.343 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.343 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_vif_linux_bridge.iptables_top_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.343 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.343 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_vif_linux_bridge.use_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.343 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.343 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_vif_ovs.isolate_vif = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.343 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_vif_ovs.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.343 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_vif_ovs.ovs_vsctl_timeout = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.344 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.344 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_vif_ovs.ovsdb_interface = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.344 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_vif_ovs.per_port_bridge = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.344 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_brick.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.344 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.344 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.344 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] privsep_osbrick.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.345 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] privsep_osbrick.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.345 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.345 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] privsep_osbrick.logger_name = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.345 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.345 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] privsep_osbrick.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.345 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.345 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] nova_sys_admin.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.345 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] nova_sys_admin.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.346 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] nova_sys_admin.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.346 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.346 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] nova_sys_admin.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.346 228337 DEBUG oslo_service.service [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.348 228337 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.360 228337 INFO nova.virt.node [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Determined node identity 35fead26-0bad-4950-b646-987079d58a17 from /var/lib/nova/compute_id
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.361 228337 DEBUG nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.362 228337 DEBUG nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.362 228337 DEBUG nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.362 228337 DEBUG nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Connecting to libvirt:
qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.372 228337 DEBUG nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Registering for lifecycle events _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.375 228337 DEBUG nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Registering for connection events: _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.375 228337 INFO nova.virt.libvirt.driver [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Connection event '1' reason 'None'
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.380 228337 INFO nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Libvirt host capabilities
[libvirt host capabilities XML follows in the original log; its element tags were stripped during capture, leaving only text values. Recoverable content: host UUID eb468aed-e0e9-4528-988f-9267a3530b7a; arch x86_64; CPU model EPYC-Rome-v4, vendor AMD; migration transports tcp and rdma; memory 16116612 KiB (page counts 4029153 / 0 / 0); security models selinux (doi 0, baselabels system_u:system_r:svirt_t:s0 and system_u:system_r:svirt_tcg_t:s0) and dac (doi 0, +107:+107); hvm guests for i686 (wordsize 32) and x86_64 (wordsize 64), both with emulator /usr/libexec/qemu-kvm and machine types pc-i440fx-rhel7.6.0 (alias pc), pc-q35-rhel9.8.0 (alias q35), pc-q35-rhel9.6.0, pc-q35-rhel9.4.0, pc-q35-rhel9.2.0, pc-q35-rhel9.0.0, pc-q35-rhel8.6.0, pc-q35-rhel8.5.0, pc-q35-rhel8.4.0, pc-q35-rhel8.3.0, pc-q35-rhel8.2.0, pc-q35-rhel8.1.0, pc-q35-rhel8.0.0, pc-q35-rhel7.6.0.]
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.386 228337 DEBUG nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.389 228337 DEBUG nova.virt.libvirt.volume.mount [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.390 228337 DEBUG nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
[libvirt domain capabilities XML follows in the original log; its element tags were likewise stripped during capture. Recoverable content: path /usr/libexec/qemu-kvm; domain kvm; machine pc-q35-rhel9.8.0; arch i686; firmware loader value /usr/share/OVMF/OVMF_CODE.secboot.fd, loader types rom and pflash, readonly yes/no, secure no; host-model CPU EPYC-Rome, vendor AMD; supported custom CPU models include 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1 through Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1 through Cascadelake-Server-v5, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, Cooperlake-v2, Denverton, Denverton-v1 through Denverton-v3, Dhyana, Dhyana-v1, Dhyana-v2, EPYC, EPYC-Genoa, EPYC-Genoa-v1, EPYC-IBPB, EPYC-Milan; the model list continues below.]
Nov 28 04:32:46 localhost
nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-Milan-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-Milan-v2 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-Rome Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-Rome-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-Rome-v2 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-Rome-v3 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost 
nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-Rome-v4 Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-v1 Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-v2 Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-v3 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-v4 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: GraniteRapids Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost 
nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: GraniteRapids-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost 
nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: GraniteRapids-v2 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost 
nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 
04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Haswell Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Haswell-IBRS Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Haswell-noTSX Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Haswell-noTSX-IBRS Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Haswell-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Haswell-v2 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 
04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Haswell-v3 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Haswell-v4 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Icelake-Server Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: 
Icelake-Server-noTSX Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Icelake-Server-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost 
nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Icelake-Server-v2 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Icelake-Server-v3 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 
localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Icelake-Server-v4 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Icelake-Server-v5 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 
04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Icelake-Server-v6 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 
localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Icelake-Server-v7 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: IvyBridge Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: IvyBridge-IBRS Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: IvyBridge-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost 
nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: IvyBridge-v2 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: KnightsMill Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: KnightsMill-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nehalem Nov 28 04:32:46 localhost nova_compute[228333]: Nehalem-IBRS Nov 28 04:32:46 localhost nova_compute[228333]: Nehalem-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Nehalem-v2 Nov 28 04:32:46 localhost nova_compute[228333]: Opteron_G1 Nov 28 04:32:46 localhost nova_compute[228333]: Opteron_G1-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Opteron_G2 Nov 28 04:32:46 localhost nova_compute[228333]: Opteron_G2-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Opteron_G3 Nov 28 04:32:46 localhost nova_compute[228333]: 
Nov 28 04:32:46 localhost nova_compute[228333]: [libvirt domain capabilities XML, continued; tag markup lost in capture, element values preserved in order]
Nov 28 04:32:46 localhost nova_compute[228333]: CPU models (cont.): Opteron_G3-v1 Opteron_G4 Opteron_G4-v1 Opteron_G5 Opteron_G5-v1 Penryn Penryn-v1 SandyBridge SandyBridge-IBRS SandyBridge-v1 SandyBridge-v2 SapphireRapids SapphireRapids-v1 SapphireRapids-v2 SapphireRapids-v3 SierraForest SierraForest-v1 Skylake-Client Skylake-Client-IBRS Skylake-Client-noTSX-IBRS Skylake-Client-v1 Skylake-Client-v2 Skylake-Client-v3 Skylake-Client-v4 Skylake-Server Skylake-Server-IBRS Skylake-Server-noTSX-IBRS Skylake-Server-v1 Skylake-Server-v2 Skylake-Server-v3 Skylake-Server-v4 Skylake-Server-v5 Snowridge Snowridge-v1 Snowridge-v2 Snowridge-v3 Snowridge-v4 Westmere Westmere-IBRS Westmere-v1 Westmere-v2 athlon athlon-v1 core2duo core2duo-v1 coreduo coreduo-v1 kvm32 kvm32-v1 kvm64 kvm64-v1 n270 n270-v1 pentium pentium-v1 pentium2 pentium2-v1 pentium3 pentium3-v1 phenom phenom-v1 qemu32 qemu32-v1 qemu64 qemu64-v1
Nov 28 04:32:46 localhost nova_compute[228333]: device/feature enum values (in order): file anonymous memfd | disk cdrom floppy lun | fdc scsi virtio usb sata | virtio virtio-transitional virtio-non-transitional | vnc egl-headless dbus | subsystem | default mandatory requisite optional | usb pci scsi | virtio virtio-transitional virtio-non-transitional | random egd builtin | path handle virtiofs | tpm-tis tpm-crb | emulator external | 2.0 | usb | pty unix | qemu | builtin | default passt | isa hyperv | null vc pty dev file pipe stdio udp tcp unix qemu-vdagent dbus
Nov 28 04:32:46 localhost nova_compute[228333]: hyperv features: relaxed vapic spinlocks vpindex runtime synic stimer reset vendor_id frequencies reenlightenment tlbflush ipi avic emsr_bitmap xmm_input | 4095 on off off Linux KVM Hv | tdx
Nov 28 04:32:46 localhost nova_compute[228333]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.396 228337 DEBUG nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 28 04:32:46 localhost nova_compute[228333]: [domain capabilities XML; tag markup lost in capture, element values preserved in order]
Nov 28 04:32:46 localhost nova_compute[228333]: /usr/libexec/qemu-kvm | kvm | pc-i440fx-rhel7.6.0 | i686 | /usr/share/OVMF/OVMF_CODE.secboot.fd | rom pflash | yes no | no | on off | on off | EPYC-Rome AMD
Nov 28 04:32:46 localhost nova_compute[228333]: CPU models: 486 486-v1 Broadwell Broadwell-IBRS Broadwell-noTSX Broadwell-noTSX-IBRS Broadwell-v1 Broadwell-v2 Broadwell-v3 Broadwell-v4
Cascadelake-Server Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Cascadelake-Server-noTSX Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Cascadelake-Server-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 
04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Cascadelake-Server-v2 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Cascadelake-Server-v3 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Cascadelake-Server-v4 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 
localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Cascadelake-Server-v5 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Conroe Nov 28 04:32:46 localhost nova_compute[228333]: Conroe-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Cooperlake Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost 
nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Cooperlake-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Cooperlake-v2 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Denverton Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 
localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Denverton-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Denverton-v2 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Denverton-v3 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Dhyana Nov 28 04:32:46 localhost nova_compute[228333]: Dhyana-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Dhyana-v2 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: EPYC Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-Genoa Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 
04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-Genoa-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 
localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-IBPB Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-Milan Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-Milan-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-Milan-v2 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-Rome Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost 
nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-Rome-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-Rome-v2 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-Rome-v3 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-Rome-v4 Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-v1 Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-v2 Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-v3 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-v4 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: GraniteRapids Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 
04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: GraniteRapids-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 
localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: GraniteRapids-v2 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost 
nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 
04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Haswell Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Haswell-IBRS Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Haswell-noTSX Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Haswell-noTSX-IBRS Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost 
nova_compute[228333]: Haswell-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Haswell-v2 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Haswell-v3 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Haswell-v4 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Icelake-Server Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost 
Nov 28 04:32:46 localhost nova_compute[228333]: [libvirt domain capabilities output; XML markup was flattened during log capture — recoverable enum values summarized below, group labels inferred from the standard libvirt domcapabilities schema]
Nov 28 04:32:46 localhost nova_compute[228333]: cpu models: Icelake-Server-noTSX Icelake-Server-v1 Icelake-Server-v2 Icelake-Server-v3 Icelake-Server-v4 Icelake-Server-v5 Icelake-Server-v6 Icelake-Server-v7 IvyBridge IvyBridge-IBRS IvyBridge-v1 IvyBridge-v2 KnightsMill KnightsMill-v1 Nehalem Nehalem-IBRS Nehalem-v1 Nehalem-v2 Opteron_G1 Opteron_G1-v1 Opteron_G2 Opteron_G2-v1 Opteron_G3 Opteron_G3-v1 Opteron_G4 Opteron_G4-v1 Opteron_G5 Opteron_G5-v1 Penryn Penryn-v1 SandyBridge SandyBridge-IBRS SandyBridge-v1 SandyBridge-v2 SapphireRapids SapphireRapids-v1 SapphireRapids-v2 SapphireRapids-v3 SierraForest SierraForest-v1 Skylake-Client Skylake-Client-IBRS Skylake-Client-noTSX-IBRS Skylake-Client-v1 Skylake-Client-v2 Skylake-Client-v3 Skylake-Client-v4 Skylake-Server Skylake-Server-IBRS Skylake-Server-noTSX-IBRS Skylake-Server-v1 Skylake-Server-v2 Skylake-Server-v3 Skylake-Server-v4 Skylake-Server-v5 Snowridge Snowridge-v1 Snowridge-v2 Snowridge-v3 Snowridge-v4 Westmere Westmere-IBRS Westmere-v1 Westmere-v2 athlon athlon-v1 core2duo core2duo-v1 coreduo coreduo-v1 kvm32 kvm32-v1 kvm64 kvm64-v1 n270 n270-v1 pentium pentium-v1 pentium2 pentium2-v1 pentium3 pentium3-v1 phenom phenom-v1 qemu32 qemu32-v1 qemu64 qemu64-v1
Nov 28 04:32:46 localhost nova_compute[228333]: memory backing source types: file anonymous memfd
Nov 28 04:32:46 localhost nova_compute[228333]: disk device types: disk cdrom floppy lun
Nov 28 04:32:46 localhost nova_compute[228333]: disk bus types: ide fdc scsi virtio usb sata
Nov 28 04:32:46 localhost nova_compute[228333]: disk models: virtio virtio-transitional virtio-non-transitional
Nov 28 04:32:46 localhost nova_compute[228333]: graphics types: vnc egl-headless dbus
Nov 28 04:32:46 localhost nova_compute[228333]: hostdev modes: subsystem
Nov 28 04:32:46 localhost nova_compute[228333]: hostdev startupPolicy: default mandatory requisite optional
Nov 28 04:32:46 localhost nova_compute[228333]: hostdev subsystem types: usb pci scsi
Nov 28 04:32:46 localhost nova_compute[228333]: rng models: virtio virtio-transitional virtio-non-transitional
Nov 28 04:32:46 localhost nova_compute[228333]: rng backend models: random
localhost nova_compute[228333]: egd Nov 28 04:32:46 localhost nova_compute[228333]: builtin Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: path Nov 28 04:32:46 localhost nova_compute[228333]: handle Nov 28 04:32:46 localhost nova_compute[228333]: virtiofs Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: tpm-tis Nov 28 04:32:46 localhost nova_compute[228333]: tpm-crb Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: emulator Nov 28 04:32:46 localhost nova_compute[228333]: external Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: 2.0 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: usb Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: pty Nov 28 04:32:46 localhost nova_compute[228333]: unix Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: qemu Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 
04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: builtin Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: default Nov 28 04:32:46 localhost nova_compute[228333]: passt Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: isa Nov 28 04:32:46 localhost nova_compute[228333]: hyperv Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: null Nov 28 04:32:46 localhost nova_compute[228333]: vc Nov 28 04:32:46 localhost nova_compute[228333]: pty Nov 28 04:32:46 localhost nova_compute[228333]: dev Nov 28 04:32:46 localhost nova_compute[228333]: file Nov 28 04:32:46 localhost nova_compute[228333]: pipe Nov 28 04:32:46 localhost nova_compute[228333]: stdio Nov 28 04:32:46 localhost nova_compute[228333]: udp Nov 28 04:32:46 localhost nova_compute[228333]: tcp Nov 28 04:32:46 localhost nova_compute[228333]: unix Nov 28 04:32:46 localhost nova_compute[228333]: qemu-vdagent Nov 28 04:32:46 localhost nova_compute[228333]: dbus Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost 
nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: relaxed Nov 28 04:32:46 localhost nova_compute[228333]: vapic Nov 28 04:32:46 localhost nova_compute[228333]: spinlocks Nov 28 04:32:46 localhost nova_compute[228333]: vpindex Nov 28 04:32:46 localhost nova_compute[228333]: runtime Nov 28 04:32:46 localhost nova_compute[228333]: synic Nov 28 04:32:46 localhost nova_compute[228333]: stimer Nov 28 04:32:46 localhost nova_compute[228333]: reset Nov 28 04:32:46 localhost nova_compute[228333]: vendor_id Nov 28 04:32:46 localhost nova_compute[228333]: frequencies Nov 28 04:32:46 localhost nova_compute[228333]: reenlightenment Nov 28 04:32:46 localhost nova_compute[228333]: tlbflush Nov 28 04:32:46 localhost nova_compute[228333]: ipi Nov 28 04:32:46 localhost nova_compute[228333]: avic Nov 28 04:32:46 localhost nova_compute[228333]: emsr_bitmap Nov 28 04:32:46 localhost nova_compute[228333]: xmm_input Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: 4095 Nov 28 04:32:46 localhost nova_compute[228333]: on Nov 28 04:32:46 localhost nova_compute[228333]: off Nov 28 04:32:46 localhost nova_compute[228333]: off Nov 28 04:32:46 localhost nova_compute[228333]: Linux KVM Hv Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: tdx Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost 
nova_compute[228333]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.425 228337 DEBUG nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.430 228337 DEBUG nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 28 04:32:46 localhost nova_compute[228333]: [domain capabilities XML (machine_type=q35); XML tags stripped by the logger, only element text recoverable: emulator: /usr/libexec/qemu-kvm; domain type: kvm; machine: pc-q35-rhel9.8.0; arch: x86_64; firmware: efi; loader paths: /usr/share/edk2/ovmf/OVMF_CODE.secboot.fd, /usr/share/edk2/ovmf/OVMF_CODE.fd, /usr/share/edk2/ovmf/OVMF.amdsev.fd, /usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd; loader types: rom, pflash; readonly: yes, no]
Nov 28 04:32:46
localhost nova_compute[228333]: [q35 domain capabilities XML, continued; tags stripped, only element text recoverable: secure: yes, no; enum values: on, off (repeated); host-model CPU: EPYC-Rome, vendor: AMD (per-feature flags lost)]
Nov 28 04:32:46 localhost nova_compute[228333]: [CPU models: 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1, Broadwell-v2, Broadwell-v3, Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1, Cascadelake-Server-v2, Cascadelake-Server-v3, Cascadelake-Server-v4, Cascadelake-Server-v5, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, Cooperlake-v2, Denverton, Denverton-v1, Denverton-v2, Denverton-v3, Dhyana, Dhyana-v1, Dhyana-v2, EPYC, EPYC-Genoa, EPYC-Genoa-v1, EPYC-IBPB, EPYC-Milan, EPYC-Milan-v1, EPYC-Milan-v2, EPYC-Rome, EPYC-Rome-v1, EPYC-Rome-v2, EPYC-Rome-v3, EPYC-Rome-v4, EPYC-v1, EPYC-v2, EPYC-v3, EPYC-v4, GraniteRapids, GraniteRapids-v1, GraniteRapids-v2, ...]
Nov 28 04:32:46 localhost
nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Haswell Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: 
Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Haswell-IBRS Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Haswell-noTSX Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Haswell-noTSX-IBRS Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Haswell-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Haswell-v2 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Haswell-v3 
Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Haswell-v4 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Icelake-Server Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Icelake-Server-noTSX Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost 
nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Icelake-Server-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Icelake-Server-v2 Nov 28 04:32:46 
localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Icelake-Server-v3 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost 
nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Icelake-Server-v4 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Icelake-Server-v5 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 
localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Icelake-Server-v6 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Icelake-Server-v7 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 
04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: IvyBridge Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: IvyBridge-IBRS Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: IvyBridge-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: IvyBridge-v2 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost 
nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: KnightsMill Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: KnightsMill-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nehalem Nov 28 04:32:46 localhost nova_compute[228333]: Nehalem-IBRS Nov 28 04:32:46 localhost nova_compute[228333]: Nehalem-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Nehalem-v2 Nov 28 04:32:46 localhost nova_compute[228333]: Opteron_G1 Nov 28 04:32:46 localhost nova_compute[228333]: Opteron_G1-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Opteron_G2 Nov 28 04:32:46 localhost nova_compute[228333]: Opteron_G2-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Opteron_G3 Nov 28 04:32:46 localhost nova_compute[228333]: Opteron_G3-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Opteron_G4 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost 
nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Opteron_G4-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Opteron_G5 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Opteron_G5-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Penryn Nov 28 04:32:46 localhost nova_compute[228333]: Penryn-v1 Nov 28 04:32:46 localhost nova_compute[228333]: SandyBridge Nov 28 04:32:46 localhost nova_compute[228333]: SandyBridge-IBRS Nov 28 04:32:46 localhost nova_compute[228333]: SandyBridge-v1 Nov 28 04:32:46 localhost nova_compute[228333]: SandyBridge-v2 Nov 28 04:32:46 localhost nova_compute[228333]: SapphireRapids Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 
localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: SapphireRapids-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost 
nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: SapphireRapids-v2 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost 
nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: SapphireRapids-v3 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost 
Nov 28 04:32:46 localhost nova_compute[228333]: [libvirt domain capabilities XML; markup stripped by the log collector, leaving only element values. Recoverable values, grouped by apparent capability:]
Nov 28 04:32:46 localhost nova_compute[228333]:   CPU models (list continued): SierraForest, SierraForest-v1, Skylake-Client, Skylake-Client-IBRS, Skylake-Client-noTSX-IBRS, Skylake-Client-v1, Skylake-Client-v2, Skylake-Client-v3, Skylake-Client-v4, Skylake-Server, Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4, Skylake-Server-v5, Snowridge, Snowridge-v1, Snowridge-v2, Snowridge-v3, Snowridge-v4, Westmere, Westmere-IBRS, Westmere-v1, Westmere-v2, athlon, athlon-v1, core2duo, core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270, n270-v1, pentium, pentium-v1, pentium2, pentium2-v1, pentium3, pentium3-v1, phenom, phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1
Nov 28 04:32:46 localhost nova_compute[228333]:   memory backing source types: file, anonymous, memfd
Nov 28 04:32:46 localhost nova_compute[228333]:   disk devices: disk, cdrom, floppy, lun; buses: fdc, scsi, virtio, usb, sata; models: virtio, virtio-transitional, virtio-non-transitional
Nov 28 04:32:46 localhost nova_compute[228333]:   graphics types: vnc, egl-headless, dbus
Nov 28 04:32:46 localhost nova_compute[228333]:   hostdev mode: subsystem; startupPolicy: default, mandatory, requisite, optional; subsystem types: usb, pci, scsi; models: virtio, virtio-transitional, virtio-non-transitional
Nov 28 04:32:46 localhost nova_compute[228333]:   rng backend models: random, egd, builtin
Nov 28 04:32:46 localhost nova_compute[228333]:   filesystem driver types: path, handle, virtiofs
Nov 28 04:32:46 localhost nova_compute[228333]:   tpm models: tpm-tis, tpm-crb; backends: emulator, external; backend version: 2.0
Nov 28 04:32:46 localhost nova_compute[228333]:   redirdev bus: usb; channel types: pty, unix
Nov 28 04:32:46 localhost nova_compute[228333]:   crypto backend models: qemu, builtin
Nov 28 04:32:46 localhost nova_compute[228333]:   interface backend types: default, passt
Nov 28 04:32:46 localhost nova_compute[228333]:   panic models: isa, hyperv
Nov 28 04:32:46 localhost nova_compute[228333]:   console/serial character device types: null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus
Nov 28 04:32:46 localhost nova_compute[228333]:   hyperv features: relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input; further values (labels stripped): 4095, on, off, off, Linux KVM Hv
Nov 28 04:32:46 localhost nova_compute[228333]:   launch security: tdx
Nov 28 04:32:46 localhost nova_compute[228333]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.477 228337 DEBUG nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 28 04:32:46 localhost nova_compute[228333]: [libvirt domain capabilities XML; markup again stripped by the log collector. Recoverable values:]
Nov 28 04:32:46 localhost nova_compute[228333]:   emulator: /usr/libexec/qemu-kvm; domain type: kvm; machine: pc-i440fx-rhel7.6.0; arch: x86_64
Nov 28 04:32:46 localhost nova_compute[228333]:   firmware loader: /usr/share/OVMF/OVMF_CODE.secboot.fd; types: rom, pflash; readonly: yes, no; secure: no; enum values: on, off; on, off
Nov 28 04:32:46 localhost nova_compute[228333]:   host CPU model: EPYC-Rome; vendor: AMD
Nov 28 04:32:46 localhost nova_compute[228333]:   CPU models: 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1, Broadwell-v2, Broadwell-v3, Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1, Cascadelake-Server-v2, Cascadelake-Server-v3, Cascadelake-Server-v4, Cascadelake-Server-v5, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, Cooperlake-v2, Denverton, [list continues]
nova_compute[228333]: Denverton-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Denverton-v2 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Denverton-v3 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Dhyana Nov 28 04:32:46 localhost nova_compute[228333]: Dhyana-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Dhyana-v2 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: EPYC Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-Genoa Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 
localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-Genoa-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost 
nova_compute[228333]: EPYC-IBPB Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-Milan Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-Milan-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-Milan-v2 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-Rome Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-Rome-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost 
nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-Rome-v2 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-Rome-v3 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-Rome-v4 Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-v1 Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-v2 Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-v3 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: EPYC-v4 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: GraniteRapids Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 
localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: GraniteRapids-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost 
nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: GraniteRapids-v2 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost 
nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 
04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Haswell Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Haswell-IBRS Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Haswell-noTSX Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Haswell-noTSX-IBRS Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Haswell-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 
localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Haswell-v2 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Haswell-v3 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Haswell-v4 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Icelake-Server Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: 
Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Icelake-Server-noTSX Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Icelake-Server-v1 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost 
nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Icelake-Server-v2 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Icelake-Server-v3 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 
localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Icelake-Server-v4 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Icelake-Server-v5 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 
04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Icelake-Server-v6 Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 localhost nova_compute[228333]: Nov 28 04:32:46 
Nov 28 04:32:46 localhost nova_compute[228333]: [libvirt domain capabilities reply; the XML markup was lost in this capture, so only the text values survive. Reconstructed summary, grouped in order of appearance; group labels inferred from the libvirt domcapabilities schema:]
Nov 28 04:32:46 localhost nova_compute[228333]:   CPU models: Icelake-Server-v7, IvyBridge, IvyBridge-IBRS, IvyBridge-v1, IvyBridge-v2, KnightsMill, KnightsMill-v1, Nehalem, Nehalem-IBRS, Nehalem-v1, Nehalem-v2, Opteron_G1, Opteron_G1-v1, Opteron_G2, Opteron_G2-v1, Opteron_G3, Opteron_G3-v1, Opteron_G4, Opteron_G4-v1, Opteron_G5, Opteron_G5-v1, Penryn, Penryn-v1, SandyBridge, SandyBridge-IBRS, SandyBridge-v1, SandyBridge-v2, SapphireRapids, SapphireRapids-v1, SapphireRapids-v2, SapphireRapids-v3, SierraForest, SierraForest-v1, Skylake-Client, Skylake-Client-IBRS, Skylake-Client-noTSX-IBRS, Skylake-Client-v1, Skylake-Client-v2, Skylake-Client-v3, Skylake-Client-v4, Skylake-Server, Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4, Skylake-Server-v5, Snowridge, Snowridge-v1, Snowridge-v2, Snowridge-v3, Snowridge-v4, Westmere, Westmere-IBRS, Westmere-v1, Westmere-v2, athlon, athlon-v1, core2duo, core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270, n270-v1, pentium, pentium-v1, pentium2, pentium2-v1, pentium3, pentium3-v1, phenom, phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1
Nov 28 04:32:46 localhost nova_compute[228333]:   memory backing source types: file, anonymous, memfd
Nov 28 04:32:46 localhost nova_compute[228333]:   disk devices: disk, cdrom, floppy, lun; buses: ide, fdc, scsi, virtio, usb, sata; models: virtio, virtio-transitional, virtio-non-transitional
Nov 28 04:32:46 localhost nova_compute[228333]:   graphics types: vnc, egl-headless, dbus
Nov 28 04:32:46 localhost nova_compute[228333]:   hostdev: mode subsystem; startupPolicy: default, mandatory, requisite, optional; subsystem types: usb, pci, scsi
Nov 28 04:32:46 localhost nova_compute[228333]:   rng: models virtio, virtio-transitional, virtio-non-transitional; backends: random, egd, builtin
Nov 28 04:32:46 localhost nova_compute[228333]:   filesystem driver types: path, handle, virtiofs
Nov 28 04:32:46 localhost nova_compute[228333]:   tpm: models tpm-tis, tpm-crb; backends: emulator, external; backend version 2.0
Nov 28 04:32:46 localhost nova_compute[228333]:   redirdev bus: usb; channel types: pty, unix
Nov 28 04:32:46 localhost nova_compute[228333]:   crypto: model qemu; backend builtin
Nov 28 04:32:46 localhost nova_compute[228333]:   interface backends: default, passt
Nov 28 04:32:46 localhost nova_compute[228333]:   panic models: isa, hyperv
Nov 28 04:32:46 localhost nova_compute[228333]:   character device types: null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus
Nov 28 04:32:46 localhost nova_compute[228333]:   hyperv features: relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input; further values in this group: 4095, on, off, off, Linux KVM Hv
Nov 28 04:32:46 localhost nova_compute[228333]:   launchSecurity: tdx
Nov 28 04:32:46 localhost nova_compute[228333]:   _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.544 228337 DEBUG nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.544 228337 INFO nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Secure Boot support detected
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.546 228337 INFO nova.virt.libvirt.driver [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.547 228337 INFO nova.virt.libvirt.driver [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.559 228337 DEBUG nova.virt.libvirt.driver [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.604 228337 INFO nova.virt.node [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Determined node identity 35fead26-0bad-4950-b646-987079d58a17 from /var/lib/nova/compute_id
Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.621 228337 DEBUG nova.compute.manager [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Verified node 35fead26-0bad-4950-b646-987079d58a17 matches my host np0005538513.localdomain _check_for_host_rename
/usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.651 228337 DEBUG nova.compute.manager [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.656 228337 DEBUG nova.virt.libvirt.vif [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T08:32:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=,hidden=False,host='np0005538513.localdomain',hostname='test',id=2,image_ref='391767f1-35f2-4b68-ae15-e0b29db66dcb',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-11-28T08:33:06Z,launched_on='np0005538513.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=,node='np0005538513.localdomain',numa_topology=None,old_flavor=,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='9dda653c53224db086060962b0702694',ramdisk_id='',reservation_id='r-a3c307c0',resources=,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata=,tags=,task_state=None,terminated_at=None,trusted_certs=,updated_at=2025-11-28T08:33:07Z,user_data=None,user_id='4d9169247d4447d0a8dd4c33f8b23dee',uuid=c2f0c7d6-df5f-4541-8b2c-bc1eaf805812,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": 
"09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.656 228337 DEBUG nova.network.os_vif_util [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Converting VIF {"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": 
"ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.657 228337 DEBUG nova.network.os_vif_util [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.658 228337 DEBUG os_vif [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.701 228337 DEBUG ovsdbapp.backend.ovs_idl [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.701 228337 DEBUG ovsdbapp.backend.ovs_idl [None 
req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.701 228337 DEBUG ovsdbapp.backend.ovs_idl [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.702 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.702 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.702 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.703 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.704 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.706 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [None 
req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.722 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.722 228337 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.723 228337 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 28 04:32:46 localhost nova_compute[228333]: 2025-11-28 09:32:46.724 228337 INFO oslo.privsep.daemon [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpyle7c7tk/privsep.sock']#033[00m Nov 28 04:32:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29574 DF PROTO=TCP SPT=59928 DPT=9882 SEQ=3280948079 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1D84820000000001030307) Nov 28 04:32:47 localhost nova_compute[228333]: 2025-11-28 09:32:47.261 228337 INFO oslo.privsep.daemon [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Nov 
28 04:32:47 localhost nova_compute[228333]: 2025-11-28 09:32:47.171 228411 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Nov 28 04:32:47 localhost nova_compute[228333]: 2025-11-28 09:32:47.174 228411 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Nov 28 04:32:47 localhost nova_compute[228333]: 2025-11-28 09:32:47.176 228411 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m Nov 28 04:32:47 localhost nova_compute[228333]: 2025-11-28 09:32:47.176 228411 INFO oslo.privsep.daemon [-] privsep daemon running as pid 228411#033[00m Nov 28 04:32:47 localhost nova_compute[228333]: 2025-11-28 09:32:47.533 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:32:47 localhost nova_compute[228333]: 2025-11-28 09:32:47.534 228337 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09612b07-51, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 04:32:47 localhost nova_compute[228333]: 2025-11-28 09:32:47.534 228337 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap09612b07-51, col_values=(('external_ids', {'iface-id': '09612b07-5142-4b0f-9dab-74bf4403f69f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f4:fc:6c', 'vm-uuid': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 04:32:47 localhost nova_compute[228333]: 2025-11-28 09:32:47.536 228337 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m 
Nov 28 04:32:47 localhost nova_compute[228333]: 2025-11-28 09:32:47.536 228337 INFO os_vif [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51')#033[00m Nov 28 04:32:47 localhost nova_compute[228333]: 2025-11-28 09:32:47.537 228337 DEBUG nova.compute.manager [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 28 04:32:47 localhost nova_compute[228333]: 2025-11-28 09:32:47.540 228337 DEBUG nova.compute.manager [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Current state is 1, state in DB is 1. 
_init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304#033[00m Nov 28 04:32:47 localhost nova_compute[228333]: 2025-11-28 09:32:47.541 228337 INFO nova.compute.manager [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m Nov 28 04:32:47 localhost nova_compute[228333]: 2025-11-28 09:32:47.603 228337 DEBUG oslo_concurrency.lockutils [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:32:47 localhost nova_compute[228333]: 2025-11-28 09:32:47.604 228337 DEBUG oslo_concurrency.lockutils [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:32:47 localhost nova_compute[228333]: 2025-11-28 09:32:47.604 228337 DEBUG oslo_concurrency.lockutils [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:32:47 localhost nova_compute[228333]: 2025-11-28 09:32:47.604 228337 DEBUG nova.compute.resource_tracker [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 04:32:47 localhost nova_compute[228333]: 2025-11-28 09:32:47.605 228337 DEBUG oslo_concurrency.processutils [None 
req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:32:48 localhost nova_compute[228333]: 2025-11-28 09:32:48.073 228337 DEBUG oslo_concurrency.processutils [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:32:48 localhost nova_compute[228333]: 2025-11-28 09:32:48.130 228337 DEBUG nova.virt.libvirt.driver [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:32:48 localhost nova_compute[228333]: 2025-11-28 09:32:48.131 228337 DEBUG nova.virt.libvirt.driver [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:32:48 localhost nova_compute[228333]: 2025-11-28 09:32:48.286 228337 WARNING nova.virt.libvirt.driver [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 04:32:48 localhost nova_compute[228333]: 2025-11-28 09:32:48.287 228337 DEBUG nova.compute.resource_tracker [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=12887MB free_disk=41.837093353271484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", 
"product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 04:32:48 localhost nova_compute[228333]: 2025-11-28 09:32:48.287 228337 DEBUG oslo_concurrency.lockutils [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:32:48 localhost nova_compute[228333]: 2025-11-28 09:32:48.287 228337 DEBUG oslo_concurrency.lockutils [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:32:48 localhost nova_compute[228333]: 2025-11-28 09:32:48.440 228337 DEBUG nova.compute.resource_tracker [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 04:32:48 localhost nova_compute[228333]: 2025-11-28 09:32:48.441 228337 DEBUG nova.compute.resource_tracker [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 04:32:48 localhost nova_compute[228333]: 2025-11-28 09:32:48.441 228337 DEBUG nova.compute.resource_tracker [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 04:32:48 localhost nova_compute[228333]: 2025-11-28 09:32:48.493 228337 DEBUG nova.scheduler.client.report [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Refreshing inventories for resource provider 35fead26-0bad-4950-b646-987079d58a17 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Nov 28 04:32:48 localhost nova_compute[228333]: 2025-11-28 09:32:48.513 228337 DEBUG nova.scheduler.client.report [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Updating ProviderTree inventory for provider 35fead26-0bad-4950-b646-987079d58a17 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Nov 28 
04:32:48 localhost nova_compute[228333]: 2025-11-28 09:32:48.514 228337 DEBUG nova.compute.provider_tree [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Updating inventory in ProviderTree for provider 35fead26-0bad-4950-b646-987079d58a17 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 28 04:32:48 localhost nova_compute[228333]: 2025-11-28 09:32:48.536 228337 DEBUG nova.scheduler.client.report [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Refreshing aggregate associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Nov 28 04:32:48 localhost nova_compute[228333]: 2025-11-28 09:32:48.560 228337 DEBUG nova.scheduler.client.report [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Refreshing trait associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, traits: 
COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_BMI2,HW_CPU_X86_FMA3,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NODE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE4A,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AVX2,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_CLMUL,HW_CPU_X86_ABM,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE41,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_F16C,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_USB _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Nov 28 04:32:48 localhost nova_compute[228333]: 2025-11-28 09:32:48.606 228337 DEBUG oslo_concurrency.processutils [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:32:48 localhost nova_compute[228333]: 2025-11-28 09:32:48.705 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] 
on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:32:49 localhost nova_compute[228333]: 2025-11-28 09:32:49.056 228337 DEBUG oslo_concurrency.processutils [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:32:49 localhost nova_compute[228333]: 2025-11-28 09:32:49.060 228337 DEBUG nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N Nov 28 04:32:49 localhost nova_compute[228333]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m Nov 28 04:32:49 localhost nova_compute[228333]: 2025-11-28 09:32:49.061 228337 INFO nova.virt.libvirt.host [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] kernel doesn't support AMD SEV#033[00m Nov 28 04:32:49 localhost nova_compute[228333]: 2025-11-28 09:32:49.061 228337 DEBUG nova.compute.provider_tree [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 04:32:49 localhost nova_compute[228333]: 2025-11-28 09:32:49.062 228337 DEBUG nova.virt.libvirt.driver [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Nov 28 04:32:49 localhost nova_compute[228333]: 2025-11-28 09:32:49.076 228337 DEBUG nova.scheduler.client.report [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 
'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 04:32:49 localhost nova_compute[228333]: 2025-11-28 09:32:49.093 228337 DEBUG nova.compute.resource_tracker [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 04:32:49 localhost nova_compute[228333]: 2025-11-28 09:32:49.093 228337 DEBUG oslo_concurrency.lockutils [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.806s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:32:49 localhost nova_compute[228333]: 2025-11-28 09:32:49.093 228337 DEBUG nova.service [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m Nov 28 04:32:49 localhost python3.9[228616]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None 
cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None 
timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Nov 28 04:32:49 localhost nova_compute[228333]: 2025-11-28 09:32:49.112 228337 DEBUG nova.service [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m Nov 28 04:32:49 localhost nova_compute[228333]: 2025-11-28 09:32:49.112 228337 DEBUG nova.servicegroup.drivers.db [None req-df05a155-3b95-4a89-8845-1b45e57fcae8 - - - - - -] DB_Driver: join new ServiceGroup member np0005538513.localdomain to the compute group, service = join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m Nov 28 04:32:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58028 DF PROTO=TCP SPT=45838 DPT=9105 SEQ=3383472125 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1D8C020000000001030307) Nov 28 04:32:49 localhost systemd[1]: Started libpod-conmon-f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967.scope. Nov 28 04:32:49 localhost systemd[1]: Started libcrun container. 
Nov 28 04:32:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/91ffc49efb3c7a4f0573f98c172baf0327c9cb7db605d9d237beebdefd054f1a/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff) Nov 28 04:32:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/91ffc49efb3c7a4f0573f98c172baf0327c9cb7db605d9d237beebdefd054f1a/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff) Nov 28 04:32:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/91ffc49efb3c7a4f0573f98c172baf0327c9cb7db605d9d237beebdefd054f1a/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 28 04:32:49 localhost podman[228661]: 2025-11-28 09:32:49.334167235 +0000 UTC m=+0.129887731 container init f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, 
container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:32:49 localhost podman[228661]: 2025-11-28 09:32:49.343632995 +0000 UTC m=+0.139353501 container start f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=nova_compute_init, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Nov 28 04:32:49 localhost python3.9[228616]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Applying nova statedir ownership Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436 Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/ Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Changing 
ownership of /var/lib/nova from 1000:1000 to 42436:42436 Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0 Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/ Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436 Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0 Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/ Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 already 42436:42436 Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 to system_u:object_r:container_file_t:s0 Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/instances/c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/console.log Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/_base already 42436:42436 Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/_base to system_u:object_r:container_file_t:s0 Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/2b1009b2ab824d8ddaaa3afb1ca6ce3f88abf415 Nov 28 
04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ephemeral_1_0706d66 Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/ Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/locks already 42436:42436 Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/locks to system_u:object_r:container_file_t:s0 Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-2b1009b2ab824d8ddaaa3afb1ca6ce3f88abf415 Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ephemeral_1_0706d66 Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436 Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0 Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/ Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Ownership of 
/var/lib/nova/.cache already 42436:42436 Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0 Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/ Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436 Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0 Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/b234715fc878456b41e32c4fbc669b417044dbe6c6684bbc9059e5c93396ffea Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/469bc4441baff9216df986857f9ff45dbf25965a8d2f755a6449ac2645cb7191 Nov 28 04:32:49 localhost nova_compute_init[228682]: INFO:nova_statedir:Nova statedir ownership complete Nov 28 04:32:49 localhost systemd[1]: libpod-f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967.scope: Deactivated successfully. 
Nov 28 04:32:49 localhost podman[228697]: 2025-11-28 09:32:49.490559097 +0000 UTC m=+0.068631909 container died f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Nov 28 04:32:49 localhost podman[228697]: 2025-11-28 09:32:49.51965475 +0000 UTC m=+0.097727522 container cleanup f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 
'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:32:49 localhost systemd[1]: libpod-conmon-f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967.scope: Deactivated successfully. Nov 28 04:32:50 localhost systemd[1]: var-lib-containers-storage-overlay-91ffc49efb3c7a4f0573f98c172baf0327c9cb7db605d9d237beebdefd054f1a-merged.mount: Deactivated successfully. Nov 28 04:32:50 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967-userdata-shm.mount: Deactivated successfully. Nov 28 04:32:50 localhost systemd[1]: session-53.scope: Deactivated successfully. Nov 28 04:32:50 localhost systemd[1]: session-53.scope: Consumed 2min 11.205s CPU time. Nov 28 04:32:50 localhost systemd-logind[764]: Session 53 logged out. Waiting for processes to exit. Nov 28 04:32:50 localhost systemd-logind[764]: Removed session 53. 
Nov 28 04:32:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:32:50.811 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:32:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:32:50.812 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:32:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:32:50.813 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:32:51 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58029 DF PROTO=TCP SPT=45838 DPT=9105 SEQ=3383472125 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1D94020000000001030307) Nov 28 04:32:51 localhost nova_compute[228333]: 2025-11-28 09:32:51.706 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:32:52 localhost nova_compute[228333]: 2025-11-28 09:32:52.114 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:32:52 localhost nova_compute[228333]: 2025-11-28 09:32:52.135 228337 DEBUG nova.compute.manager [None 
req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Triggering sync for uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Nov 28 04:32:52 localhost nova_compute[228333]: 2025-11-28 09:32:52.136 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:32:52 localhost nova_compute[228333]: 2025-11-28 09:32:52.137 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:32:52 localhost nova_compute[228333]: 2025-11-28 09:32:52.137 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:32:52 localhost nova_compute[228333]: 2025-11-28 09:32:52.201 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.064s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:32:53 localhost nova_compute[228333]: 2025-11-28 09:32:53.706 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:32:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58030 DF PROTO=TCP SPT=45838 DPT=9105 SEQ=3383472125 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1DA3C20000000001030307) Nov 28 04:32:56 localhost sshd[228737]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:32:56 localhost systemd-logind[764]: New session 55 of user zuul. Nov 28 04:32:56 localhost systemd[1]: Started Session 55 of User zuul. Nov 28 04:32:56 localhost nova_compute[228333]: 2025-11-28 09:32:56.708 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:32:57 localhost python3.9[228848]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 28 04:32:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:32:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. 
Nov 28 04:32:57 localhost podman[228854]: 2025-11-28 09:32:57.842304984 +0000 UTC m=+0.077131028 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Nov 28 04:32:57 localhost podman[228854]: 2025-11-28 09:32:57.847905361 +0000 UTC 
m=+0.082731415 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent) Nov 28 04:32:57 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 04:32:57 localhost systemd[1]: tmp-crun.LunVHZ.mount: Deactivated successfully. Nov 28 04:32:57 localhost podman[228853]: 2025-11-28 09:32:57.899763067 +0000 UTC m=+0.134160628 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 28 04:32:57 localhost podman[228853]: 2025-11-28 09:32:57.962472426 +0000 UTC m=+0.196869977 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_id=ovn_controller, 
maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 28 04:32:57 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. Nov 28 04:32:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28993 DF PROTO=TCP SPT=56164 DPT=9101 SEQ=2028527426 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1DAEEA0000000001030307) Nov 28 04:32:58 localhost nova_compute[228333]: 2025-11-28 09:32:58.741 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:32:59 localhost python3.9[229006]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 28 04:32:59 localhost systemd[1]: Reloading. 
Nov 28 04:32:59 localhost systemd-sysv-generator[229033]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:32:59 localhost systemd-rc-local-generator[229028]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:32:59 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:32:59 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:32:59 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:32:59 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:32:59 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:32:59 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:32:59 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:32:59 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:32:59 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:33:00 localhost python3.9[229150]: ansible-ansible.builtin.service_facts Invoked Nov 28 04:33:00 localhost network[229167]: You are using 'network' service provided by 'network-scripts', which are now deprecated. 
Nov 28 04:33:00 localhost network[229168]: 'network-scripts' will be removed from distribution in near future.
Nov 28 04:33:00 localhost network[229169]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 04:33:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28995 DF PROTO=TCP SPT=56164 DPT=9101 SEQ=2028527426 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1DBB020000000001030307)
Nov 28 04:33:01 localhost nova_compute[228333]: 2025-11-28 09:33:01.710 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:33:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 04:33:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58031 DF PROTO=TCP SPT=45838 DPT=9105 SEQ=3383472125 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1DC3820000000001030307)
Nov 28 04:33:03 localhost nova_compute[228333]: 2025-11-28 09:33:03.744 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:33:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65005 DF PROTO=TCP SPT=60840 DPT=9100 SEQ=2329009262 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1DCF820000000001030307)
Nov 28 04:33:06 localhost nova_compute[228333]: 2025-11-28 09:33:06.713 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:33:06 localhost python3.9[229404]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 04:33:08 localhost python3.9[229515]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:33:08 localhost systemd-journald[47227]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 76.3 (254 of 333 items), suggesting rotation.
Nov 28 04:33:08 localhost systemd-journald[47227]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 28 04:33:08 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 04:33:08 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 04:33:08 localhost nova_compute[228333]: 2025-11-28 09:33:08.774 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:33:08 localhost python3.9[229626]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:33:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45682 DF PROTO=TCP SPT=34304 DPT=9102 SEQ=594967161 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1DDB820000000001030307)
Nov 28 04:33:10 localhost python3.9[229736]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 04:33:10 localhost python3.9[229846]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 04:33:11 localhost nova_compute[228333]: 2025-11-28 09:33:11.715 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:33:12 localhost python3.9[229956]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 04:33:12 localhost systemd[1]: Reloading.
Nov 28 04:33:12 localhost systemd-rc-local-generator[229978]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 04:33:12 localhost systemd-sysv-generator[229986]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 04:33:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65007 DF PROTO=TCP SPT=60840 DPT=9100 SEQ=2329009262 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1DE7420000000001030307)
Nov 28 04:33:12 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:33:12 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Nov 28 04:33:12 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:33:12 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:33:12 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Nov 28 04:33:12 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Nov 28 04:33:12 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:33:12 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:33:12 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Nov 28 04:33:13 localhost python3.9[230102]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 04:33:13 localhost nova_compute[228333]: 2025-11-28 09:33:13.776 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:33:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 04:33:14 localhost podman[230159]: 2025-11-28 09:33:14.856363098 +0000 UTC m=+0.086087460 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 28 04:33:14 localhost podman[230159]: 2025-11-28 09:33:14.873368823 +0000 UTC m=+0.103093175 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 04:33:14 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 04:33:15 localhost python3.9[230233]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 04:33:15 localhost python3.9[230341]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 04:33:16 localhost nova_compute[228333]: 2025-11-28 09:33:16.717 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:33:16 localhost python3.9[230451]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:33:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46252 DF PROTO=TCP SPT=52650 DPT=9882 SEQ=4185442292 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1DF9C20000000001030307)
Nov 28 04:33:17 localhost python3.9[230537]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322396.2342167-359-263547322514989/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=8310b0590be84763ce46965bb976fd9ca6a7668a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 04:33:18 localhost python3.9[230647]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Nov 28 04:33:18 localhost nova_compute[228333]: 2025-11-28 09:33:18.802 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:33:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17290 DF PROTO=TCP SPT=54360 DPT=9105 SEQ=937547911 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1E01420000000001030307)
Nov 28 04:33:19 localhost python3.9[230757]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Nov 28 04:33:20 localhost python3.9[230868]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 28 04:33:21 localhost python3.9[230984]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005538513.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 28 04:33:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17291 DF PROTO=TCP SPT=54360 DPT=9105 SEQ=937547911 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1E09420000000001030307)
Nov 28 04:33:21 localhost nova_compute[228333]: 2025-11-28 09:33:21.720 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:33:22 localhost python3.9[231100]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:33:23 localhost python3.9[231186]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764322402.117302-563-252611136264212/.source.conf _original_basename=ceilometer.conf follow=False checksum=e4f5a0d8a335534158f72dc0bd2ff76fd1e29e2d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:33:23 localhost python3.9[231294]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:33:23 localhost nova_compute[228333]: 2025-11-28 09:33:23.802 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:33:24 localhost python3.9[231380]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764322403.1893637-563-89601618094617/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:33:25 localhost python3.9[231488]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:33:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17292 DF PROTO=TCP SPT=54360 DPT=9105 SEQ=937547911 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1E19020000000001030307)
Nov 28 04:33:25 localhost python3.9[231574]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764322404.7822988-563-39065417668188/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:33:26 localhost nova_compute[228333]: 2025-11-28 09:33:26.721 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:33:27 localhost python3.9[231682]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 04:33:27 localhost python3.9[231790]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 04:33:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29020 DF PROTO=TCP SPT=55544 DPT=9101 SEQ=474957933 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1E24190000000001030307)
Nov 28 04:33:28 localhost python3.9[231898]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:33:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 04:33:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 04:33:28 localhost nova_compute[228333]: 2025-11-28 09:33:28.805 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:33:28 localhost systemd[1]: tmp-crun.yo6INa.mount: Deactivated successfully.
Nov 28 04:33:28 localhost podman[231986]: 2025-11-28 09:33:28.856820617 +0000 UTC m=+0.087540986 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 04:33:28 localhost python3.9[231984]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322407.9481335-740-232584814813974/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:33:28 localhost systemd[1]: tmp-crun.znjtNQ.mount: Deactivated successfully.
Nov 28 04:33:28 localhost podman[231985]: 2025-11-28 09:33:28.89870101 +0000 UTC m=+0.132985613 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 04:33:28 localhost podman[231986]: 2025-11-28 09:33:28.923807244 +0000 UTC m=+0.154527633 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 28 04:33:28 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 04:33:28 localhost podman[231985]: 2025-11-28 09:33:28.959682524 +0000 UTC m=+0.193967117 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_controller)
Nov 28 04:33:28 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 04:33:29 localhost python3.9[232134]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:33:29 localhost python3.9[232189]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:33:30 localhost python3.9[232297]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:33:30 localhost python3.9[232383]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322410.011794-740-228076457598327/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=d15068604cf730dd6e7b88a19d62f57d3a39f94f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:33:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29022 DF PROTO=TCP SPT=55544 DPT=9101 SEQ=474957933 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1E30020000000001030307)
Nov 28 04:33:31 localhost python3.9[232491]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:33:31 localhost nova_compute[228333]: 2025-11-28 09:33:31.723 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:33:32 localhost python3.9[232577]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322411.097895-740-139062728913990/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:33:32 localhost python3.9[232685]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:33:33 localhost python3.9[232771]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322412.237855-740-54978732213164/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:33:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38450 DF PROTO=TCP SPT=55732 DPT=9102 SEQ=3705141826 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1E39020000000001030307)
Nov 28 04:33:33 localhost nova_compute[228333]: 2025-11-28 09:33:33.810 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:33:33 localhost python3.9[232879]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:33:34 localhost python3.9[232965]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322413.362747-740-149523037075264/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=7e5ab36b7368c1d4a00810e02af11a7f7d7c84e8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:33:35 localhost python3.9[233073]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:33:35 localhost python3.9[233159]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322414.5653374-740-189288447202918/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:33:36 localhost python3.9[233267]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:33:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44169 DF PROTO=TCP SPT=58168 DPT=9100 SEQ=32968573 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1E44820000000001030307)
Nov 28 04:33:36 localhost python3.9[233353]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322415.6777573-740-233639674667536/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=0e4ea521b0035bea70b7a804346a5c89364dcbc3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:33:36 localhost nova_compute[228333]: 2025-11-28 09:33:36.725 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:33:37 localhost python3.9[233461]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:33:38 localhost python3.9[233547]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322416.7693648-740-48363449612220/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=b056dcaaba7624b93826bb95ee9e82f81bde6c72 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:33:38 localhost nova_compute[228333]: 2025-11-28 09:33:38.842 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:33:38 localhost python3.9[233655]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 04:33:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41361 DF PROTO=TCP SPT=57320 DPT=9100 SEQ=1150725942 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1E4F820000000001030307)
Nov 28 04:33:39 localhost python3.9[233741]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322418.5314248-740-19092222534056/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=885ccc6f5edd8803cb385bdda5648d0b3017b4e4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None
group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:33:40 localhost python3.9[233849]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:33:41 localhost python3.9[233935]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322420.1067162-740-32014311779095/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:33:41 localhost nova_compute[228333]: 2025-11-28 09:33:41.727 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:33:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44171 DF PROTO=TCP SPT=58168 DPT=9100 SEQ=32968573 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1E5C420000000001030307) Nov 28 04:33:42 localhost python3.9[234045]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:33:43 localhost python3.9[234155]: 
ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:33:43 localhost systemd[1]: Reloading. Nov 28 04:33:43 localhost systemd-rc-local-generator[234182]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:33:43 localhost systemd-sysv-generator[234188]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:33:43 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:33:43 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:33:43 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:33:43 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:33:43 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 28 04:33:43 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:33:43 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:33:43 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:33:43 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:33:43 localhost nova_compute[228333]: 2025-11-28 09:33:43.842 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:33:43 localhost systemd[1]: Listening on Podman API Socket. Nov 28 04:33:44 localhost python3.9[234305]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:33:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 04:33:45 localhost systemd[1]: tmp-crun.CFaR2m.mount: Deactivated successfully. 
Nov 28 04:33:45 localhost podman[234394]: 2025-11-28 09:33:45.270148116 +0000 UTC m=+0.098383454 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:33:45 localhost podman[234394]: 2025-11-28 09:33:45.281482959 +0000 UTC m=+0.109718347 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:33:45 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 04:33:45 localhost python3.9[234393]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322424.393135-1256-181850790095253/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 28 04:33:45 localhost nova_compute[228333]: 2025-11-28 09:33:45.777 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:33:45 localhost nova_compute[228333]: 2025-11-28 09:33:45.778 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:33:45 localhost nova_compute[228333]: 2025-11-28 09:33:45.778 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 28 04:33:45 localhost nova_compute[228333]: 2025-11-28 09:33:45.778 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 28 04:33:45 localhost python3.9[234469]: ansible-ansible.legacy.stat Invoked with 
path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:33:46 localhost python3.9[234557]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322424.393135-1256-181850790095253/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 28 04:33:46 localhost nova_compute[228333]: 2025-11-28 09:33:46.730 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:33:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65128 DF PROTO=TCP SPT=58924 DPT=9882 SEQ=2916956063 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1E6EC20000000001030307) Nov 28 04:33:47 localhost nova_compute[228333]: 2025-11-28 09:33:47.232 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 04:33:47 localhost nova_compute[228333]: 2025-11-28 09:33:47.232 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 
04:33:47 localhost nova_compute[228333]: 2025-11-28 09:33:47.232 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 28 04:33:47 localhost nova_compute[228333]: 2025-11-28 09:33:47.233 228337 DEBUG nova.objects.instance [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 04:33:47 localhost python3.9[234667]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False Nov 28 04:33:47 localhost nova_compute[228333]: 2025-11-28 09:33:47.722 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": 
"tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 04:33:47 localhost nova_compute[228333]: 2025-11-28 09:33:47.744 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 04:33:47 localhost nova_compute[228333]: 2025-11-28 09:33:47.744 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 28 04:33:47 localhost nova_compute[228333]: 2025-11-28 09:33:47.745 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:33:47 localhost nova_compute[228333]: 2025-11-28 09:33:47.745 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:33:47 localhost nova_compute[228333]: 2025-11-28 09:33:47.746 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:33:47 localhost nova_compute[228333]: 2025-11-28 09:33:47.746 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:33:47 localhost nova_compute[228333]: 2025-11-28 09:33:47.747 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:33:47 localhost nova_compute[228333]: 2025-11-28 09:33:47.747 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:33:47 localhost nova_compute[228333]: 2025-11-28 09:33:47.748 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 28 04:33:47 localhost nova_compute[228333]: 2025-11-28 09:33:47.748 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:33:47 localhost nova_compute[228333]: 2025-11-28 09:33:47.765 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:33:47 localhost nova_compute[228333]: 2025-11-28 09:33:47.765 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:33:47 localhost nova_compute[228333]: 2025-11-28 09:33:47.766 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:33:47 localhost nova_compute[228333]: 2025-11-28 09:33:47.766 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 04:33:47 localhost nova_compute[228333]: 2025-11-28 
09:33:47.767 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:33:48 localhost nova_compute[228333]: 2025-11-28 09:33:48.210 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:33:48 localhost nova_compute[228333]: 2025-11-28 09:33:48.272 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:33:48 localhost nova_compute[228333]: 2025-11-28 09:33:48.273 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:33:48 localhost nova_compute[228333]: 2025-11-28 09:33:48.496 228337 WARNING nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 04:33:48 localhost nova_compute[228333]: 2025-11-28 09:33:48.498 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=12917MB free_disk=41.837093353271484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", 
"product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 04:33:48 localhost nova_compute[228333]: 2025-11-28 09:33:48.498 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:33:48 localhost nova_compute[228333]: 2025-11-28 09:33:48.499 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:33:48 localhost nova_compute[228333]: 2025-11-28 09:33:48.588 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 04:33:48 localhost nova_compute[228333]: 2025-11-28 09:33:48.589 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 04:33:48 localhost nova_compute[228333]: 2025-11-28 09:33:48.589 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 04:33:48 localhost python3.9[234799]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 28 04:33:48 localhost nova_compute[228333]: 2025-11-28 09:33:48.657 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:33:48 localhost nova_compute[228333]: 2025-11-28 09:33:48.871 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:33:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22867 DF PROTO=TCP SPT=36458 DPT=9105 SEQ=3545525059 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1E76830000000001030307) Nov 28 04:33:49 localhost nova_compute[228333]: 2025-11-28 09:33:49.158 
228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:33:49 localhost nova_compute[228333]: 2025-11-28 09:33:49.165 228337 DEBUG nova.compute.provider_tree [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 04:33:49 localhost nova_compute[228333]: 2025-11-28 09:33:49.211 228337 DEBUG nova.scheduler.client.report [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 04:33:49 localhost nova_compute[228333]: 2025-11-28 09:33:49.213 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 04:33:49 localhost nova_compute[228333]: 2025-11-28 09:33:49.214 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.715s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:33:50 localhost podman[235006]: 2025-11-28 09:33:50.324350354 +0000 UTC m=+0.094223730 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, io.openshift.expose-services=, release=553, vcs-type=git, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_CLEAN=True, version=7, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, ceph=True, maintainer=Guillaume Abrioux , name=rhceph, vendor=Red Hat, Inc., RELEASE=main, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 28 04:33:50 localhost podman[235006]: 2025-11-28 09:33:50.449907988 +0000 UTC m=+0.219781414 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, ceph=True, distribution-scope=public, 
release=553, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., version=7, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_CLEAN=True, vcs-type=git, RELEASE=main, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, name=rhceph) Nov 28 04:33:50 localhost python3[235057]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False Nov 28 04:33:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:33:50.812 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:33:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:33:50.813 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:33:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:33:50.814 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: 
held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:33:50 localhost python3[235057]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "e6f07353639e492d8c9627d6d615ceeb47cb00ac4d14993b12e8023ee2aeee6f",#012 "Digest": "sha256:ba8d4a4e89620dec751cb5de5631f858557101d862972a8e817b82e4e10180a1",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:ba8d4a4e89620dec751cb5de5631f858557101d862972a8e817b82e4e10180a1"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-11-26T06:26:47.510377458Z",#012 "Config": {#012 "User": "root",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251125",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 505178369,#012 "VirtualSize": 505178369,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": 
"/var/lib/containers/storage/overlay/dc5b8b4def912dce4d14a76402b323c6b5c48ee8271230eacbdaaa7e58e676b2/diff:/var/lib/containers/storage/overlay/f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a/diff:/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/5ea32d7a444086a7f1ea2479bd7b214a5adab9651f7d4df1f24a039ae5563f9d/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/5ea32d7a444086a7f1ea2479bd7b214a5adab9651f7d4df1f24a039ae5563f9d/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",#012 "sha256:1e3477d3ea795ca64b46f28aa9428ba791c4250e0fd05e173a4b9c0fb0bdee23",#012 "sha256:c136b33417f134a3b932677bcf7a2df089c29f20eca250129eafd2132d4708bb",#012 "sha256:df29e1f065b3ca62a976bd39a05f70336eee2ae6be8f0f1548e8c749ab2e29f2",#012 "sha256:23884b48504b714fa8c89fa23b204d39c39cc69fece546e604d8bd0566e4fb11"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251125",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "root",#012 "History": [#012 {#012 "created": "2025-11-25T04:02:36.223494528Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-25T04:02:36.223562059Z",#012 "created_by": 
"/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251125\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-25T04:02:39.054452717Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-11-26T06:10:57.55004106Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550061231Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550071761Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550082711Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550094371Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550104472Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.937139683Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:11:33.845342269Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 
'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 Nov 28 04:33:50 localhost podman[235190]: 2025-11-28 09:33:50.909763006 +0000 UTC m=+0.078621421 container remove 4627823c4795955f60778a8cc5afaf370056a76938525e6607651162ae2c58a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684be86bd5476b8c779d4769a9adf982'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4) Nov 28 04:33:50 localhost python3[235057]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ceilometer_agent_compute Nov 28 04:33:51 localhost podman[235205]: Nov 28 04:33:51 localhost podman[235205]: 2025-11-28 09:33:51.016709674 +0000 UTC m=+0.089129438 container create 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 
'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:33:51 localhost podman[235205]: 2025-11-28 09:33:50.973090466 +0000 UTC m=+0.045510270 image pull quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified Nov 28 04:33:51 localhost python3[235057]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 
'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start Nov 28 04:33:51 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 
SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22868 DF PROTO=TCP SPT=36458 DPT=9105 SEQ=3545525059 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1E7E820000000001030307) Nov 28 04:33:51 localhost nova_compute[228333]: 2025-11-28 09:33:51.731 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:33:51 localhost python3.9[235384]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:33:52 localhost python3.9[235514]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:33:53 localhost python3.9[235623]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764322433.0439098-1448-231923254777113/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:33:53 localhost nova_compute[228333]: 2025-11-28 09:33:53.874 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:33:54 localhost python3.9[235678]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None 
state=None enabled=None force=None masked=None Nov 28 04:33:54 localhost systemd[1]: Reloading. Nov 28 04:33:54 localhost systemd-sysv-generator[235705]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:33:54 localhost systemd-rc-local-generator[235700]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:33:54 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:33:54 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:33:54 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:33:54 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:33:54 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 28 04:33:54 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:33:54 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:33:54 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:33:54 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:33:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22869 DF PROTO=TCP SPT=36458 DPT=9105 SEQ=3545525059 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1E8E420000000001030307) Nov 28 04:33:55 localhost python3.9[235769]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:33:55 localhost systemd[1]: Reloading. Nov 28 04:33:55 localhost systemd-sysv-generator[235799]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:33:55 localhost systemd-rc-local-generator[235795]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 28 04:33:55 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:33:55 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:33:55 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:33:55 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:33:55 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:33:55 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:33:55 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:33:55 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:33:55 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:33:55 localhost systemd[1]: Starting ceilometer_agent_compute container... Nov 28 04:33:55 localhost systemd[1]: Started libcrun container. 
Nov 28 04:33:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9629a2e38a20fc33020415afb3d6da4739b5060b02482d1a6cbfc404ea2cb188/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff) Nov 28 04:33:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9629a2e38a20fc33020415afb3d6da4739b5060b02482d1a6cbfc404ea2cb188/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff) Nov 28 04:33:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 04:33:56 localhost podman[235810]: 2025-11-28 09:33:56.011109005 +0000 UTC m=+0.138443577 container init 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, tcib_managed=true) Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: + sudo -E kolla_set_configs Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: sudo: unable to send audit message: Operation not permitted Nov 28 04:33:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 04:33:56 localhost podman[235810]: 2025-11-28 09:33:56.048242876 +0000 UTC m=+0.175577398 container start 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, managed_by=edpm_ansible, config_id=edpm, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', 
'/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 28 04:33:56 localhost podman[235810]: ceilometer_agent_compute Nov 28 04:33:56 localhost systemd[1]: Started ceilometer_agent_compute container. Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: INFO:__main__:Validating config file Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: INFO:__main__:Copying service configuration files Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to 
/etc/ceilometer/polling.yaml
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: INFO:__main__:Writing out command to execute
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: ++ cat /run_command
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: + ARGS=
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: + sudo kolla_copy_cacerts
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: sudo: unable to send audit message: Operation not permitted
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: + [[ ! -n '' ]]
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: + . kolla_extend_start
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: + umask 0022
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Nov 28 04:33:56 localhost podman[235833]: 2025-11-28 09:33:56.137156175 +0000 UTC m=+0.083806357 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 04:33:56 localhost podman[235833]: 2025-11-28 09:33:56.166240598 +0000 UTC m=+0.112890760 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 04:33:56 localhost podman[235833]: unhealthy
Nov 28 04:33:56 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 04:33:56 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Failed with result 'exit-code'.
Nov 28 04:33:56 localhost nova_compute[228333]: 2025-11-28 09:33:56.732 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.854 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.854 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.854 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.854 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.854 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.854 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.854 2 DEBUG cotyledon.oslo_config_glue [-] batch_size = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.854 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.855 2 DEBUG cotyledon.oslo_config_glue [-] config_dir = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.855 2 DEBUG cotyledon.oslo_config_glue [-] config_file = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.855 2 DEBUG cotyledon.oslo_config_glue [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.855 2 DEBUG cotyledon.oslo_config_glue [-] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.855 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.855 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.855 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.855 2 DEBUG cotyledon.oslo_config_glue [-] host = np0005538513.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.855 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.855 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.855 2 DEBUG cotyledon.oslo_config_glue [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.855 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.856 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.856 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.856 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.856 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.856 2 DEBUG cotyledon.oslo_config_glue [-] log_dir = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.856 2 DEBUG cotyledon.oslo_config_glue [-] log_file = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.856 2 DEBUG cotyledon.oslo_config_glue [-] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.856 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.856 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.856 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.856 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.856 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.856 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.857 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.857 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.857 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.857 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.857 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.857 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.857 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.857 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.857 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.857 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.857 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.857 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.857 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.858 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.858 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.858 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.858 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.858 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.858 2 DEBUG cotyledon.oslo_config_glue [-] sample_source = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.858 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.858 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.858 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.858 2 DEBUG cotyledon.oslo_config_glue [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.858 2 DEBUG cotyledon.oslo_config_glue [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.858 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.858 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.859 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.859 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.859 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.859 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.859 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.859 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.859 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.859 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.859 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.859 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.859 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.859 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.859 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.862 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.862 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.862 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.862 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.862 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.862 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.862 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.862 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.862 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.862 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.862 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.862 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.863 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.863 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.863 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.863 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.863 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.863 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.863 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.863 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.863 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.863 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.863 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.863 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.863 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.864 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.864 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.864 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.864 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.864 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.864 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.864 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.864 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.864 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.864 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.864 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.864 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.865 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.865 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.865 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.865 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.865 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.865 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.865 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.865 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.865 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.865 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.865 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.865 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.865 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.866 2 DEBUG
cotyledon.oslo_config_glue [-] gnocchi.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.866 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.866 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.866 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.866 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.866 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.866 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.866 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.866 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.866 2 DEBUG 
cotyledon.oslo_config_glue [-] zaqar.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.866 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.866 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.866 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.867 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.867 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.867 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.867 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.867 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 
2025-11-28 09:33:56.885 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']]. Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.886 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d]. Nov 28 04:33:56 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:56.887 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']]. Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.001 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.100 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.100 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.100 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.100 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.100 12 DEBUG cotyledon.oslo_config_glue [-] config files: 
['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.101 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.101 12 DEBUG cotyledon.oslo_config_glue [-] batch_size = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.101 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.101 12 DEBUG cotyledon.oslo_config_glue [-] config_dir = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.101 12 DEBUG cotyledon.oslo_config_glue [-] config_file = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.101 12 DEBUG cotyledon.oslo_config_glue [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.101 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.101 12 DEBUG cotyledon.oslo_config_glue [-] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost 
ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.101 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.101 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.102 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.102 12 DEBUG cotyledon.oslo_config_glue [-] host = np0005538513.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.102 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.102 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.102 12 DEBUG 
cotyledon.oslo_config_glue [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.102 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.102 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.102 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.102 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.102 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.102 12 DEBUG cotyledon.oslo_config_glue [-] log_dir = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.103 12 DEBUG cotyledon.oslo_config_glue [-] log_file = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.103 12 DEBUG cotyledon.oslo_config_glue [-] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 
09:33:57.103 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.103 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.103 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.103 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.103 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.103 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.103 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.103 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format = %(user)s 
%(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.103 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.103 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.103 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.104 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.104 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.104 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.104 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.104 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 
localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.104 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.104 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.104 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.104 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.104 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.104 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.104 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.105 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.105 12 DEBUG cotyledon.oslo_config_glue [-] sample_source = openstack log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.105 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.105 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.105 12 DEBUG cotyledon.oslo_config_glue [-] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.105 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.105 12 DEBUG cotyledon.oslo_config_glue [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.105 12 DEBUG cotyledon.oslo_config_glue [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.105 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.105 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.105 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.105 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.106 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.106 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.106 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.106 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.106 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.106 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.106 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 
2025-11-28 09:33:57.106 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.106 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.106 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.106 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.106 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode 
= True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.107 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.108 12 DEBUG cotyledon.oslo_config_glue 
[-] monasca.cluster = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost 
ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.108 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.109 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.109 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.109 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.109 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.109 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost 
ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.109 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.109 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.109 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.109 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.109 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.109 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.110 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.110 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.110 12 DEBUG cotyledon.oslo_config_glue [-] 
publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.110 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.110 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.110 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.110 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.110 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.110 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.110 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.110 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 
localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.111 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.111 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.111 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.111 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.111 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.111 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.111 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.111 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.111 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 
04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.111 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.111 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.111 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.112 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.112 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.112 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.112 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.112 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.112 12 DEBUG cotyledon.oslo_config_glue [-] 
service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.112 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.112 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.112 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.112 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.112 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.112 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password = **** log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost 
ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.113 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.114 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.114 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.114 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.114 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.114 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.114 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.114 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.114 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.114 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.114 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.114 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.114 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.115 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.115 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.115 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type = 
None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.115 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.115 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.115 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.115 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.115 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.115 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.115 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.115 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.115 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.115 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.118 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.118 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.118 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.118 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.118 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.118 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.118 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 
localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.118 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.118 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.118 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.118 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.118 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.122 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.130 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.522 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None 
-H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}51229a5bf42cf6af691520c2ac7386a36cdfa0a076b91349c2da91076b3ef699" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.629 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 327 Content-Type: application/json Date: Fri, 28 Nov 2025 09:33:57 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-d6465910-457e-42dd-8a7d-8ec985d08cb3 x-openstack-request-id: req-d6465910-457e-42dd-8a7d-8ec985d08cb3 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.630 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "f3c44237-060e-4213-a926-aa7fdb4bf902", "name": "m1.small", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/f3c44237-060e-4213-a926-aa7fdb4bf902"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/f3c44237-060e-4213-a926-aa7fdb4bf902"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.630 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-d6465910-457e-42dd-8a7d-8ec985d08cb3 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.631 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET 
http://nova-internal.openstack.svc:8774/v2.1/flavors/f3c44237-060e-4213-a926-aa7fdb4bf902 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}51229a5bf42cf6af691520c2ac7386a36cdfa0a076b91349c2da91076b3ef699" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.671 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 494 Content-Type: application/json Date: Fri, 28 Nov 2025 09:33:57 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-3120e210-08ea-40ea-962d-66699a6a63f6 x-openstack-request-id: req-3120e210-08ea-40ea-962d-66699a6a63f6 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.672 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "f3c44237-060e-4213-a926-aa7fdb4bf902", "name": "m1.small", "ram": 512, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 1, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/f3c44237-060e-4213-a926-aa7fdb4bf902"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/f3c44237-060e-4213-a926-aa7fdb4bf902"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.672 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors/f3c44237-060e-4213-a926-aa7fdb4bf902 used request id req-3120e210-08ea-40ea-962d-66699a6a63f6 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.673 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.674 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.678 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 / tap09612b07-51 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.678 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 144 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa27bbbc-3702-4283-be90-43f12cc30d06', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 144, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:33:57.674522', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5d2d1f32-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.84653548, 'message_signature': '7ab1d7c56ec27be02c50ca8ddca319124a9c75cc3e6bd0ddc88d8c3e44a4cf66'}]}, 'timestamp': '2025-11-28 09:33:57.679916', '_unique_id': '9b213e18a80c4f6ba31c632a2183756b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.687 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.690 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.691 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 12742 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '64ac717b-6c3e-4b31-87bb-e04939ce0b0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 12742, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:33:57.691092', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5d2eeef2-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.84653548, 'message_signature': '039f431660e40919d057bbd5ce54aea90f8f7ebddbabe2778454cbe6c4cfafd7'}]}, 'timestamp': '2025-11-28 09:33:57.691639', '_unique_id': '67be56d627dc447a8d4e59649febf04c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.692 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.694 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.694 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 86 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82e137d5-cea1-4ca4-abf6-901803e30015', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 86, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:33:57.694206', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5d2f67a6-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.84653548, 'message_signature': '23e3ce63e48bba963d778646f39450fab789bca0dba3a1eb8019e25bd84461f8'}]}, 'timestamp': '2025-11-28 09:33:57.694717', '_unique_id': '31f3880f174d457fba64e8a6c8eb92cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.695 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:33:57 localhost
ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.696 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.697 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.697 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [] Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.697 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.698 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.698 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [] Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.698 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.698 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:33:57 localhost 
ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4b76efa-8ec2-47fd-93d2-43e152ec5cd4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:33:57.698626', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5d301444-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.84653548, 'message_signature': 'dfdd8936e406ed3b1d47ab8a3bf9fe02eaf0e00768e8e812d2f8453bc356de6b'}]}, 'timestamp': '2025-11-28 09:33:57.699186', '_unique_id': '3c48782524d940238a3b6934074dc8ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 
2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:33:57 
localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:33:57 localhost 
ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 
04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.700 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.701 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.701 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.701 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [] Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.701 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.726 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 73981952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.726 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4f43a430-2f07-41b6-8f15-789882feebea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73981952, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:33:57.702000', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5d344e06-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.874055993, 'message_signature': '348e897946e5927a0cd01498c219838b4797e143ae34db28edb169f3afacc76c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:33:57.702000', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5d345f22-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.874055993, 'message_signature': 'bddc70edf9a5c7db83d9df1d6bcff7b64c8c2603a08d5ca5e48e813499742486'}]}, 'timestamp': '2025-11-28 09:33:57.727255', '_unique_id': '598918ff44074ef984d841ec96ba6b6a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:33:57 localhost 
ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:33:57 localhost 
ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.728 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.729 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.729 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b1480a42-da39-4fe4-83e1-56cec27dca1a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:33:57.729615', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5d34cea8-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.84653548, 'message_signature': 'f71c15a82d308abe09888198d098f3030abd5fa4ed829398ad867aa4e320fbec'}]}, 'timestamp': '2025-11-28 09:33:57.730131', '_unique_id': '8d6166cddc2244928c84494f11f4f973'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:33:57 localhost 
ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost 
ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:33:57 
localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.731 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost 
ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.732 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.732 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 566 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.732 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e13ee981-b175-4e47-b54b-954f09d15e04', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 566, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:33:57.732216', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5d35332a-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.874055993, 'message_signature': '648f7c62161d64ff3cbbb4ace8e768edd853b6c807bad9ae53e8ce13738e92bd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:33:57.732216', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5d354356-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.874055993, 'message_signature': 'efd7b0d671fd6aa905d09b2251b8b346b00cbf5626cd90894f9dfe66e3efe94f'}]}, 'timestamp': '2025-11-28 09:33:57.733082', '_unique_id': 'e84a16f8206c413895a58ea0e93a64b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:33:57 localhost 
ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 
2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:33:57 localhost 
ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.734 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.735 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.735 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f567f27-0a64-435d-a548-f1a2afe91747', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:33:57.735340', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5d35adf0-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.84653548, 'message_signature': 'eca2aff805cd3dbbd6bce6ef12a9d885d00105146bb3f2c0ad890fcf6e9c865a'}]}, 'timestamp': '2025-11-28 09:33:57.735819', '_unique_id': '41e9f8fb7ec14bbd92f3604e3bff4c17'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
877, in _connection_factory Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 
ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:33:57 
localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.736 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.737 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.753 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 45990000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b6cd58f4-69af-4a31-853d-48298e4d0695', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 45990000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:33:57.738044', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '5d3861ee-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.924564451, 'message_signature': '323eaf640fd10b3030a0f5e847384e0308e23ee0c377185108eaeae8751b836a'}]}, 'timestamp': '2025-11-28 09:33:57.753560', '_unique_id': '70349386fc9a4dfcb6d7cce628d9d6fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 
2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 
04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:33:57 localhost 
ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.754 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.755 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 28 04:33:57 localhost 
ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.768 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.768 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '011ef4f7-5dd7-4cdc-b548-ab5202f406d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:33:57.756079', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 
'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5d3abbce-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.928088524, 'message_signature': 'fd3e9b591c91df6d398a05058a52ace110f8541056734e92cf49afdd72a4973f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:33:57.756079', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5d3ace52-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.928088524, 'message_signature': '04e4c6bda24a386ef59cb4bda387a5ed9e14ad019f6fa885d1aa51281fe5ccf3'}]}, 'timestamp': '2025-11-28 09:33:57.769386', '_unique_id': '2748a374c91742a09c7351203a735374'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:33:57 
localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.770 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.771 12 INFO ceilometer.polling.manager [-] Polling pollster 
disk.device.read.latency in the context of pollsters
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.771 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1313024378 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.772 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 168520223 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5722941-546d-4851-b7cd-9dd8bf8552e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1313024378, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:33:57.771575', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5d3b3496-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.874055993, 'message_signature': '002316aa748c4b76fec7a81721e87b8cbb023a6ad23a7c7f789a2f21a71b08ff'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 168520223, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:33:57.771575', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5d3b4774-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.874055993, 'message_signature': '8294cd4f89992dad67116654a499808c5096884678ce0afca8044d0c636a5486'}]}, 'timestamp': '2025-11-28 09:33:57.772475', '_unique_id': 'dbe6dba5ad284533afb63100f3c0b902'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.773 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.774 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.774 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '259e6840-de26-47b5-ae7d-dec912fd902a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:33:57.774726', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5d3bb042-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.84653548, 'message_signature': 'afcb2a163c4190046096c59e3fb52da485afcdff6a6a9b0f27bcf48761d83f10'}]}, 'timestamp': '2025-11-28 09:33:57.775220', '_unique_id': '644d125839bc41cca9c80248b0feebaf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.776 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.777 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.777 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 9175 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '990c1ac2-6276-4972-aa34-f321ec57a52f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9175, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:33:57.777270', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5d3c1334-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.84653548, 'message_signature': '1fda7708a4000b17ab0cbcaa632b2097dea9a10a3b5796e064230ae6855cbacc'}]}, 'timestamp': '2025-11-28 09:33:57.777754', '_unique_id': '2ac2f465d5f54b0188ee81584a7dfa29'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.778 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:33:57 localhost
ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.779 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.779 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.780 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca2ec3c7-2c10-4203-bb2f-10c0d32fb21a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:33:57.779874', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5d3c7a0e-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.928088524, 'message_signature': '821ba8cb104b068bea0482688c4a16dda1f993649c5153ad386bce845baa08cc'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:33:57.779874', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5d3c8a12-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.928088524, 'message_signature': '2a868eabb5b81636e94faf8e6223ba93a6820eafe7c64dd9fac1910dedb32f2c'}]}, 'timestamp': '2025-11-28 09:33:57.780734', '_unique_id': '994250b4e3174e1a84cd9ae224947e20'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:33:57 localhost 
ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 
2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:33:57 localhost 
ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.781 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.782 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.782 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 305908425 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.783 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 30452399 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3040a6ab-0de7-4373-80a9-294bcd9fc883', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 305908425, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:33:57.782861', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5d3ceec6-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.874055993, 'message_signature': 'ff4c057d1fc000f59c85798c4fbcaebfa514330877fc2c88f2dd8c68a185a7f4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 30452399, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:33:57.782861', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5d3cfede-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.874055993, 'message_signature': 'e86e351eeb3370e35875b8f8a19773d10f02ccb8997bc8bc64311f447dd25ec7'}]}, 'timestamp': '2025-11-28 09:33:57.783722', '_unique_id': 'fbf2d635a0e3458788df2a37cdc35e69'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:33:57 localhost 
ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:33:57 localhost 
ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.784 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.785 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.786 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.786 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:33:57 localhost 
ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ca3d5cd-a85a-4728-bc7c-d09595b00104', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:33:57.785972', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5d3d68ba-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.874055993, 'message_signature': '1f8ed315533b19d2a0c31996d68a9b9886ae9a4a78c0f4140ca297567433e744'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': 
'2025-11-28T09:33:57.785972', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5d3d78a0-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.874055993, 'message_signature': '087fa5b5e41fb6b3161650b56f1528152e9e3aef1b2f489b2065b3fbb5a6e768'}]}, 'timestamp': '2025-11-28 09:33:57.786843', '_unique_id': '8631371e7c3048b7a21ed80189fa98c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.787 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.788 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.789 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e7b1a3cb-6388-4ef6-9f6e-e7553f73777a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:33:57.788968', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5d3dddae-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.84653548, 'message_signature': '9c29d11809e40cafff42df960341425cc8f364266e4daf324c54135a987d35da'}]}, 'timestamp': '2025-11-28 09:33:57.789459', '_unique_id': '0ad8f734eff44e549a019d2f15d65e1e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:33:57 localhost 
ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost 
ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:33:57 
localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.790 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost 
ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.791 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.791 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 52.40234375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f50b491-056f-4adf-aa90-e87e5217659a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.40234375, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:33:57.791601', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '5d3e42f8-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.924564451, 
'message_signature': '11a18da6ee90fd4913ada2df168b17a9d6eafa9681b97c33d6cfc96be8d99c7f'}]}, 'timestamp': '2025-11-28 09:33:57.792064', '_unique_id': '876f6e43611a44b586b357b0366a953a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:33:57 localhost 
ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.792 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.794 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.794 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.794 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: []
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.794 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.795 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.795 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5205426-083f-44f0-808a-6bb0a4710cfc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:33:57.794977', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5d3ec8d6-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.928088524, 'message_signature': '4da5aa7a3ac30478a0598e050da35fc3b24d417a795f0469c332db793d8167a9'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:33:57.794977', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5d3ed8e4-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.928088524, 'message_signature': '0a352a081040427bee9c54f3a2fbecc51105cdfb8c8a1db00d7a27b9387f24cd'}]}, 'timestamp': '2025-11-28 09:33:57.795861', '_unique_id': 'f1015bde5ddc4393a5238b8660372ce1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.796 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.797 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.798 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.798 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6af788c-0402-4281-a7c0-005de4f17bf0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:33:57.798154', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5d3f4310-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.874055993, 'message_signature': 'dc1e3a40f73573aad0dddba42014aa84412c0e05be1d820173004c62603864d6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:33:57.798154', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5d3f52ec-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.874055993, 'message_signature': 'd1d20ae784cd0c917636cb14b0ee9d48380cfa3b759e77a50b4b0364d7ec753b'}]}, 'timestamp': '2025-11-28 09:33:57.798980', '_unique_id': '3fde224552b14faaa8ebe60f935427fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.799 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.800 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.801 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fb1d93fe-b332-4fc2-8076-bdf31e7a014f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:33:57.801145', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5d3fb840-cc3d-11f0-afa8-fa163eb02593', 'monotonic_time': 10271.84653548, 'message_signature': 'f6f95317da1d8bb8fdcc3418e23b947937bc5188f613894c38d538a0c4543617'}]}, 'timestamp': '2025-11-28 09:33:57.801637', '_unique_id': 'da7daececfb64abe8684178de282d352'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging Nov 
28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 
605, in _get_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in 
ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:33:57 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:57.802 12 ERROR oslo_messaging.notify.messaging Nov 28 
04:33:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34679 DF PROTO=TCP SPT=33768 DPT=9101 SEQ=2529987109 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1E99490000000001030307) Nov 28 04:33:58 localhost python3.9[235970]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 28 04:33:58 localhost systemd[1]: Stopping ceilometer_agent_compute container... Nov 28 04:33:58 localhost systemd[1]: tmp-crun.EGdWK5.mount: Deactivated successfully. Nov 28 04:33:58 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:58.908 2 INFO cotyledon._service_manager [-] Caught SIGTERM signal, graceful exiting of master process Nov 28 04:33:58 localhost nova_compute[228333]: 2025-11-28 09:33:58.907 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:33:59 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:59.009 2 DEBUG cotyledon._service_manager [-] Killing services with signal SIGTERM _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:304 Nov 28 04:33:59 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:59.009 2 DEBUG cotyledon._service_manager [-] Waiting services to terminate _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:308 Nov 28 04:33:59 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:59.009 12 INFO cotyledon._service [-] Caught SIGTERM signal, graceful exiting of service AgentManager(0) [12] Nov 28 04:33:59 localhost journal[201490]: End of file while reading data: Input/output error Nov 28 04:33:59 localhost journal[201490]: End of file while reading data: Input/output error Nov 28 
04:33:59 localhost ceilometer_agent_compute[235824]: 2025-11-28 09:33:59.017 2 DEBUG cotyledon._service_manager [-] Shutdown finish _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:320 Nov 28 04:33:59 localhost systemd[1]: libpod-1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.scope: Deactivated successfully. Nov 28 04:33:59 localhost systemd[1]: libpod-1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.scope: Consumed 1.348s CPU time. Nov 28 04:33:59 localhost podman[235974]: 2025-11-28 09:33:59.156416178 +0000 UTC m=+0.339524103 container died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:33:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:33:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:33:59 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.timer: Deactivated successfully. Nov 28 04:33:59 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 04:33:59 localhost podman[235994]: 2025-11-28 09:33:59.254497921 +0000 UTC m=+0.081931707 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:33:59 localhost podman[235995]: 2025-11-28 09:33:59.312723617 +0000 UTC m=+0.137552319 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent) Nov 28 04:33:59 localhost podman[235994]: 2025-11-28 09:33:59.321732396 +0000 UTC m=+0.149166182 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 28 04:33:59 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated 
successfully. Nov 28 04:33:59 localhost podman[235974]: 2025-11-28 09:33:59.344062751 +0000 UTC m=+0.527170606 container cleanup 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Nov 28 04:33:59 localhost podman[235974]: ceilometer_agent_compute Nov 28 04:33:59 localhost podman[235995]: 2025-11-28 09:33:59.394069034 +0000 UTC 
m=+0.218897766 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Nov 28 04:33:59 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 04:33:59 localhost podman[236045]: 2025-11-28 09:33:59.442813306 +0000 UTC m=+0.074277301 container cleanup 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0) Nov 28 04:33:59 localhost podman[236045]: ceilometer_agent_compute Nov 28 04:33:59 localhost systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated 
successfully. Nov 28 04:33:59 localhost systemd[1]: Stopped ceilometer_agent_compute container. Nov 28 04:33:59 localhost systemd[1]: Starting ceilometer_agent_compute container... Nov 28 04:33:59 localhost systemd[1]: Started libcrun container. Nov 28 04:33:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9629a2e38a20fc33020415afb3d6da4739b5060b02482d1a6cbfc404ea2cb188/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff) Nov 28 04:33:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9629a2e38a20fc33020415afb3d6da4739b5060b02482d1a6cbfc404ea2cb188/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff) Nov 28 04:33:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 04:33:59 localhost podman[236058]: 2025-11-28 09:33:59.607829954 +0000 UTC m=+0.133168678 container init 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm) Nov 28 04:33:59 localhost ceilometer_agent_compute[236072]: + sudo -E kolla_set_configs Nov 28 04:33:59 localhost ceilometer_agent_compute[236072]: sudo: unable to send audit message: Operation not permitted Nov 28 04:33:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. 
Nov 28 04:33:59 localhost podman[236058]: 2025-11-28 09:33:59.649583583 +0000 UTC m=+0.174922267 container start 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible) Nov 28 04:33:59 localhost podman[236058]: ceilometer_agent_compute Nov 28 04:33:59 localhost systemd[1]: Started ceilometer_agent_compute container. 
Nov 28 04:33:59 localhost ceilometer_agent_compute[236072]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Nov 28 04:33:59 localhost ceilometer_agent_compute[236072]: INFO:__main__:Validating config file Nov 28 04:33:59 localhost ceilometer_agent_compute[236072]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Nov 28 04:33:59 localhost ceilometer_agent_compute[236072]: INFO:__main__:Copying service configuration files Nov 28 04:33:59 localhost ceilometer_agent_compute[236072]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf Nov 28 04:33:59 localhost ceilometer_agent_compute[236072]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf Nov 28 04:33:59 localhost ceilometer_agent_compute[236072]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf Nov 28 04:33:59 localhost ceilometer_agent_compute[236072]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml Nov 28 04:33:59 localhost ceilometer_agent_compute[236072]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml Nov 28 04:33:59 localhost ceilometer_agent_compute[236072]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml Nov 28 04:33:59 localhost ceilometer_agent_compute[236072]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf Nov 28 04:33:59 localhost ceilometer_agent_compute[236072]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf Nov 28 04:33:59 localhost ceilometer_agent_compute[236072]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf Nov 28 04:33:59 localhost ceilometer_agent_compute[236072]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf Nov 28 04:33:59 localhost ceilometer_agent_compute[236072]: INFO:__main__:Copying 
/var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf Nov 28 04:33:59 localhost ceilometer_agent_compute[236072]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf Nov 28 04:33:59 localhost ceilometer_agent_compute[236072]: INFO:__main__:Writing out command to execute Nov 28 04:33:59 localhost ceilometer_agent_compute[236072]: ++ cat /run_command Nov 28 04:33:59 localhost ceilometer_agent_compute[236072]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout' Nov 28 04:33:59 localhost ceilometer_agent_compute[236072]: + ARGS= Nov 28 04:33:59 localhost ceilometer_agent_compute[236072]: + sudo kolla_copy_cacerts Nov 28 04:33:59 localhost ceilometer_agent_compute[236072]: sudo: unable to send audit message: Operation not permitted Nov 28 04:33:59 localhost podman[236081]: 2025-11-28 09:33:59.739420532 +0000 UTC m=+0.084298973 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': 
['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:33:59 localhost ceilometer_agent_compute[236072]: + [[ ! -n '' ]] Nov 28 04:33:59 localhost ceilometer_agent_compute[236072]: + . kolla_extend_start Nov 28 04:33:59 localhost ceilometer_agent_compute[236072]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout' Nov 28 04:33:59 localhost ceilometer_agent_compute[236072]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\''' Nov 28 04:33:59 localhost ceilometer_agent_compute[236072]: + umask 0022 Nov 28 04:33:59 localhost ceilometer_agent_compute[236072]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout Nov 28 04:33:59 localhost podman[236081]: 2025-11-28 09:33:59.774453915 +0000 UTC m=+0.119332376 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute) Nov 28 04:33:59 localhost podman[236081]: unhealthy Nov 28 04:33:59 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Main process exited, code=exited, status=1/FAILURE Nov 28 04:33:59 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Failed with result 'exit-code'. 
Nov 28 04:34:00 localhost python3.9[236212]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.458 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.459 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.459 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.459 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.459 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.459 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.459 2 DEBUG cotyledon.oslo_config_glue [-] batch_size = 50 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.459 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.459 2 DEBUG cotyledon.oslo_config_glue [-] config_dir = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.459 2 DEBUG cotyledon.oslo_config_glue [-] config_file = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.459 2 DEBUG cotyledon.oslo_config_glue [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.459 2 DEBUG cotyledon.oslo_config_glue [-] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.460 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.460 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.460 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.460 2 DEBUG cotyledon.oslo_config_glue [-] host = np0005538513.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.460 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.460 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.460 2 DEBUG cotyledon.oslo_config_glue [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.460 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.460 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.461 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.461 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.461 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.461 2 DEBUG cotyledon.oslo_config_glue [-] log_dir = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.461 2 DEBUG cotyledon.oslo_config_glue [-] log_file = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.461 2 DEBUG cotyledon.oslo_config_glue [-] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.461 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.461 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.461 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.461 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string = 
%(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.461 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.461 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.462 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.462 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.462 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.462 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.462 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests = 64 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.462 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.462 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.462 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.462 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.462 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.462 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.462 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.462 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.463 2 DEBUG cotyledon.oslo_config_glue 
[-] reseller_prefix = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.463 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.463 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.463 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.463 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.463 2 DEBUG cotyledon.oslo_config_glue [-] sample_source = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.463 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.463 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.463 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:34:00.463 2 DEBUG cotyledon.oslo_config_glue [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.463 2 DEBUG cotyledon.oslo_config_glue [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.463 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.463 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.463 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.464 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.464 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.464 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.464 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.464 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.464 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.464 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.464 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.464 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.464 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.464 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.465 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.465 2 DEBUG 
cotyledon.oslo_config_glue [-] monasca.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.465 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.465 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.465 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.465 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.465 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.465 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.465 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.465 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:34:00.465 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.465 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.465 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.466 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.466 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.466 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.466 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.466 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.466 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.466 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.466 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.466 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.466 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.466 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.466 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.466 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.467 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.467 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.467 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.467 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.467 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.467 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.467 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.467 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.467 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.467 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.467 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.467 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.468 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.468 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.468 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.468 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.468 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.468 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.468 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.468 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.468 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.468 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.468 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.469 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.469 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.469 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.469 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.469 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.469 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.469 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.469 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.469 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.469 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.469 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.469 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.469 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type = 
password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.470 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.470 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.470 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.470 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.470 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.470 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.470 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.470 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.470 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.470 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.470 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.471 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.471 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.471 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.471 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.471 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.471 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 
04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.471 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.471 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.471 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.471 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.471 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.471 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.471 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.472 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.472 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 
04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.472 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.472 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.472 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.472 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.472 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.472 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.490 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']]. Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.492 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d]. Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.493 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']]. 
Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.510 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.634 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.634 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.634 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.634 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.634 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.634 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.634 12 DEBUG cotyledon.oslo_config_glue [-] batch_size = 50 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.635 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.635 12 DEBUG cotyledon.oslo_config_glue [-] config_dir = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.635 12 DEBUG cotyledon.oslo_config_glue [-] config_file = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.635 12 DEBUG cotyledon.oslo_config_glue [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.635 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.635 12 DEBUG cotyledon.oslo_config_glue [-] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.635 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 
'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.635 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.635 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.635 12 DEBUG cotyledon.oslo_config_glue [-] host = np0005538513.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.636 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.636 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.636 12 DEBUG cotyledon.oslo_config_glue [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.636 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.636 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type = kvm log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.636 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.636 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.636 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.636 12 DEBUG cotyledon.oslo_config_glue [-] log_dir = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.636 12 DEBUG cotyledon.oslo_config_glue [-] log_file = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.636 12 DEBUG cotyledon.oslo_config_glue [-] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.636 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.637 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.637 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type = none log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.637 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.637 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.637 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.637 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.637 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.637 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.637 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb = 200 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.637 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.637 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.637 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.637 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.638 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.638 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.638 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.638 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.638 12 DEBUG 
cotyledon.oslo_config_glue [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.638 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.638 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.638 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.638 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.638 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.638 12 DEBUG cotyledon.oslo_config_glue [-] sample_source = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.638 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.639 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.639 12 DEBUG cotyledon.oslo_config_glue [-] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.639 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.639 12 DEBUG cotyledon.oslo_config_glue [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.639 12 DEBUG cotyledon.oslo_config_glue [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.639 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.639 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.639 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.639 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.639 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.639 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.639 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.640 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.640 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.640 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.640 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.640 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.640 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.640 12 DEBUG 
cotyledon.oslo_config_glue [-] monasca.archive_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.640 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.640 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.640 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.640 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.641 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.641 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.641 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.641 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.641 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.641 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.641 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.641 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.641 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.641 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.641 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.641 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.642 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 
04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.642 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.642 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.642 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.642 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.642 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.642 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.642 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.642 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.642 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.642 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.643 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.643 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.643 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.643 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.643 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.643 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.643 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size = 50 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.643 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.643 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.643 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.643 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.644 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.644 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.644 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.644 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 
04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.644 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.644 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.644 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.644 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.644 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.644 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.644 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.645 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.645 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift = object-store 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.645 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.645 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.645 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.645 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.645 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.645 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.645 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.645 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.645 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.646 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.646 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.646 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.646 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.646 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.646 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.646 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.646 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.646 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.646 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.646 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.646 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.647 12 DEBUG cotyledon.oslo_config_glue [-] 
service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.648 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.648 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.648 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.648 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.648 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.648 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.648 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.648 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:34:00.648 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.648 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.648 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.648 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.649 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.649 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.649 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.649 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.649 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.649 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.649 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.649 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.649 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.649 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.649 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.649 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.650 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.650 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 
Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.650 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.650 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.650 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.650 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.650 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.650 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.650 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.650 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.650 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.650 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.650 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.651 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.651 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.651 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.651 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.651 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.651 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.651 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.651 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.651 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.651 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.651 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.651 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.651 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.652 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.652 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.652 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.652 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.652 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.652 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.652 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.652 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.652 12 DEBUG 
cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.652 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.652 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.656 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64 Nov 28 04:34:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:00.664 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93 Nov 28 04:34:00 localhost python3.9[236306]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322439.9206502-1544-122255528229007/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.044 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET 
http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}0c98bde1f38df54628b5d8e5f8133062b9971605ca5dd209589037cdd5565180" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519 Nov 28 04:34:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34681 DF PROTO=TCP SPT=33768 DPT=9101 SEQ=2529987109 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1EA5620000000001030307) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.161 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 327 Content-Type: application/json Date: Fri, 28 Nov 2025 09:34:01 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-b154ef59-1b58-4072-b77e-39cd99b341b7 x-openstack-request-id: req-b154ef59-1b58-4072-b77e-39cd99b341b7 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.161 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "f3c44237-060e-4213-a926-aa7fdb4bf902", "name": "m1.small", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/f3c44237-060e-4213-a926-aa7fdb4bf902"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/f3c44237-060e-4213-a926-aa7fdb4bf902"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.161 12 DEBUG novaclient.v2.client [-] GET call to compute for 
http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-b154ef59-1b58-4072-b77e-39cd99b341b7 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.164 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors/f3c44237-060e-4213-a926-aa7fdb4bf902 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}0c98bde1f38df54628b5d8e5f8133062b9971605ca5dd209589037cdd5565180" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.182 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 494 Content-Type: application/json Date: Fri, 28 Nov 2025 09:34:01 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-e628c67f-ce5d-4010-9447-dfea4d757f7c x-openstack-request-id: req-e628c67f-ce5d-4010-9447-dfea4d757f7c _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.182 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "f3c44237-060e-4213-a926-aa7fdb4bf902", "name": "m1.small", "ram": 512, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 1, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/f3c44237-060e-4213-a926-aa7fdb4bf902"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/f3c44237-060e-4213-a926-aa7fdb4bf902"}]}} _http_log_response 
/usr/lib/python3.9/site-packages/keystoneauth1/session.py:582 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.183 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors/f3c44237-060e-4213-a926-aa7fdb4bf902 used request id req-e628c67f-ce5d-4010-9447-dfea4d757f7c request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.184 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.184 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.190 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 / tap09612b07-51 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.190 12 DEBUG ceilometer.compute.pollsters [-] 
c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97f97acd-3b21-49d8-8b0a-800e2eaeb6ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:34:01.185254', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5f44ff74-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.357300134, 'message_signature': 'bb4d6d381bf9ed9b230eeda83f10ac5ad4841d80a91c777c331a307ee2ef9519'}]}, 'timestamp': 
'2025-11-28 09:34:01.191837', '_unique_id': '5f0d11ce69e54019a5c5232fd2eb0a4b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in 
_establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR 
oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:34:01.200 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:34:01 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.200 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.204 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.204 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.204 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [] Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.205 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.205 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.205 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [] Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.205 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 28 04:34:01 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.205 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 12742 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '12f5d177-2e0a-4d4e-9c0f-8219b23a40cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 12742, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:34:01.205870', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5f473f00-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.357300134, 
'message_signature': '2c9dff60989d225464ee24f47b7b88bd7f11380563fb5b07a12bfc764c2642c6'}]}, 'timestamp': '2025-11-28 09:34:01.206444', '_unique_id': '4ab641480f234902aca9fed28d512a5f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:34:01 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:34:01 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.207 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.208 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.208 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '62147e78-804b-48ee-9aef-65c0db56f082', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:34:01.208813', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5f47b11a-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.357300134, 'message_signature': '34489fd06d48c9ad3536959440c34da771a63857e8909de17505c04e5bb4dced'}]}, 'timestamp': '2025-11-28 09:34:01.209320', '_unique_id': 'a421786b399746e2a7c2ec3695e20f89'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:34:01 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.210 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.211 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.247 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 305908425 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.248 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 30452399 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a06b837f-2db5-45fb-ae2e-4a5cf1e273b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 305908425, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:34:01.211452', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5f4daea8-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.383486794, 'message_signature': 'c5d9f870dce25b0732e5f9db278626a08ad8614a0d7ab794485d718f438d21de'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 30452399, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:34:01.211452', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5f4dc032-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.383486794, 'message_signature': 'ea8d7d368945e9329c0b4ef9f06468dffefdcb27348eb5a87bdafa7b204aeb10'}]}, 'timestamp': '2025-11-28 09:34:01.248990', '_unique_id': '5c8b6164a6324f39a3d798ff91702c01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:34:01.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:34:01 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.250 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.251 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.251 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '439092f3-5af4-4aec-a3d1-7d60a587dd8a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:34:01.251759', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5f4e3ddc-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.357300134, 'message_signature': '0019c1c37c7b9a76f9a123c4dcca0531c4ee386c9fd60b1df1c4ebae69383446'}]}, 'timestamp': '2025-11-28 09:34:01.252280', '_unique_id': '3f12acab79b3433bab1087999db5075c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:34:01 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:34:01 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:34:01 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.253 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.254 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.254 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.254 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [] Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.255 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.268 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.268 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '02ffb186-0811-4c05-b84d-b3e2d56692af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:34:01.255157', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5f50bddc-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.427199334, 'message_signature': 
'b04401fd082a2b97bdea872e526dd5fda4f02ab375e18b4e96b433d069f256af'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:34:01.255157', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5f50ce94-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.427199334, 'message_signature': '0876939eaca65ac62044a7c0aa2a45cbb55ccd70059cfd4764622e3aa715a3d9'}]}, 'timestamp': '2025-11-28 09:34:01.269042', '_unique_id': '7f2cf4af764f4790b41389876361a324'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:34:01.270 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:34:01 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:34:01.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.270 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.271 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.271 
12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8561dabb-fc3a-4a45-8921-e62c913dfc37', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:34:01.271452', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5f513eec-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.357300134, 'message_signature': 
'746647ea4408f2312003a4a03e306a45270a6c6e4467dd6e3ec9489c9870dc32'}]}, 'timestamp': '2025-11-28 09:34:01.271921', '_unique_id': '7d9a1263e48b4cb49dcc6dba1e88cf94'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:34:01 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:34:01 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.272 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.273 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.294 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 52.40234375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f3fb917e-62a1-41ef-92a8-e74236e750a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.40234375, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:34:01.274097', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '5f54ba0e-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.46573624, 'message_signature': '515acd3ca0f6eff7eed76081c03a074ef189a6b84f101bb119f38429ffc7d03c'}]}, 'timestamp': '2025-11-28 09:34:01.294724', '_unique_id': 'b77ed978a2cf486c80d1f2016ac99a29'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 
04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:34:01 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.295 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.296 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 28 04:34:01 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.296 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '398f5d2f-c6fe-4648-b5a6-bf208ea0023e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:34:01.296939', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5f552476-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.357300134, 
'message_signature': '52c24c47d26b5b9074e600942219ff3694752758defe775a666d4fd5660b3d99'}]}, 'timestamp': '2025-11-28 09:34:01.297456', '_unique_id': '707a31ca3a184ce88f75df824e030cd4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:34:01 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:34:01 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.298 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.299 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.299 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.299 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [] Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.300 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.300 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1313024378 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.300 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 168520223 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:34:01 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6cb8d55-4708-4036-a3db-74a0bfad7faa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1313024378, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:34:01.300164', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5f55a054-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.383486794, 'message_signature': 'd7cf23081c947dbb01f7256641e69196359a1bae2aa45b8c64070f5cf3899ee4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 168520223, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:34:01.300164', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5f55b08a-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.383486794, 'message_signature': '472849c8e4634f863a79fb47159d03c98fec5d834d8533d486676853225a481c'}]}, 'timestamp': '2025-11-28 09:34:01.301010', '_unique_id': '6bf17e2d7b31423c868bc85f77419b1d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 
ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:34:01 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:34:01 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.301 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.303 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.303 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.303 12 DEBUG ceilometer.compute.pollsters [-] 
c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3e2f7b2-527d-473a-9ef4-a632aae5183c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:34:01.303174', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5f5615c0-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.427199334, 'message_signature': 'ef512859a60e284dbfa439d3e855be2397fd7be50e2526983ae629c64d8f30cf'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 
'4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:34:01.303174', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5f5625b0-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.427199334, 'message_signature': '86db59c0a2864ba55f5e238bcb85075d2bf8a24a2d69a41f5466906add2aa295'}]}, 'timestamp': '2025-11-28 09:34:01.304046', '_unique_id': '8a8ffefb342d459194fa71dca44dadb3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:34:01 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging 
self.transport._send_notification(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:34:01.305 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR 
oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.305 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.306 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.306 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 566 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 
04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.306 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1fcc0c24-6d39-4358-830d-d2f1014c1158', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 566, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:34:01.306343', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5f5691a8-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.383486794, 'message_signature': 'eec2fa02999ddd8c5d7578914011f2fc92d5c6cbea84d2e35afc2f4b5e2dc6dc'}, {'source': 'openstack', 'counter_name': 
'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:34:01.306343', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5f56a1ac-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.383486794, 'message_signature': '2b051335e225bacff7033a5c561371158044e350d4f2b8ec2f9ffe6992392cde'}]}, 'timestamp': '2025-11-28 09:34:01.307221', '_unique_id': 'b8ee691e7ebf40ba92fb04a6e0ea80cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:34:01 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:34:01 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.308 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.309 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.309 12 DEBUG 
ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6738a910-1c2c-4a7b-a7d4-4660ced1c8f5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:34:01.309400', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5f570912-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.357300134, 'message_signature': 
'db97d6bf93cc1a089b143ecd224fd1c995f395c54ffc73baf370541ccb848d72'}]}, 'timestamp': '2025-11-28 09:34:01.309852', '_unique_id': 'e540467858b3407cbb2067cb6dd2b9c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:34:01 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:34:01 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.310 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.311 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.312 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.312 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7b7499f9-0343-463a-b3bf-50bc88468427', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:34:01.312111', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5f5772e4-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.427199334, 'message_signature': '3f70c2767c6642f8f7f0950bdd795817558de8bba4e172cb5bee5c7288c1cf4f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:34:01.312111', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 
'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5f5782c0-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.427199334, 'message_signature': '75b6dcad3e42c418c224cd67c0e08b28da725b82ac8c26627ce14714de4a6246'}]}, 'timestamp': '2025-11-28 09:34:01.312939', '_unique_id': '8742d984ca8a47fca417fda4c4ad239a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:34:01 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:34:01.313 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:34:01 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:34:01.313 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.313 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.314 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.315 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 144 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7ca72991-25de-4ba7-8d41-9004c4816da0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 144, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:34:01.315295', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5f57f17e-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.357300134, 'message_signature': '5ab0d1e334eb5cd694b043b0e9fbab8f4e030cf2fd093e6779346b002b15e0ce'}]}, 'timestamp': '2025-11-28 09:34:01.315811', '_unique_id': 'cad02125b26942359e11776a2d2faae0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:34:01 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.316 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.317 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.318 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 9175 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eef74aea-dae6-464b-a9ff-fe35371b36c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9175, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:34:01.317966', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '5f585920-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.357300134, 'message_signature': '95c451172e677a117790269a2989e54117c0d4e498df5104f2ec768b0dd1d5de'}]}, 'timestamp': '2025-11-28 09:34:01.318570', '_unique_id': '78ae35551eee49c0b85cf80e46a8f36e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:34:01 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 
04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.319 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.320 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.321 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.321 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '618cc4f6-28a8-42a1-aa87-370dee06ae3a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:34:01.321056', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5f58d166-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.383486794, 'message_signature': 'd814680ec686665d787ffa77e873771e9c54a2774f8ea26b28e51d78c805ac82'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:34:01.321056', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5f58e16a-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.383486794, 'message_signature': '160bdf82186450640e44aea5ce8cb3976cb9bec36068647ecb33f2e8273b8ab5'}]}, 'timestamp': '2025-11-28 09:34:01.321916', '_unique_id': 'f4a847118c1843f190b5e8eb5f4ea71c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:34:01 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.322 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.324 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.324 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 73981952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.324 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b54a2dbf-fac6-4b44-816f-f4b0b6d5c225', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73981952, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:34:01.324210', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5f594c0e-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.383486794, 'message_signature': '682900c99478c8db0cdd2f1a2865aff300411c7ddaf2333c5c1d37dbb2aca768'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:34:01.324210', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5f595be0-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.383486794, 'message_signature': '0029281dcb7d2da8581a33e494823239375e922ca736837f3ecf48a3d9a81e16'}]}, 'timestamp': '2025-11-28 09:34:01.325087', '_unique_id': 'e3f74a1f3efb4d9d938ef5bc75c9ef6a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.325 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.327 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.327 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 46020000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f4545d91-e988-4604-9a7b-5ac5498ad0f4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 46020000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:34:01.327861', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '5f59dd40-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.46573624, 'message_signature': 'c4000779c40a4da1463d2fe1e8ea3c50331b1ac2118d52adcf1be5b5dc08ca44'}]}, 'timestamp': '2025-11-28 09:34:01.328381', '_unique_id': '7068a07f92ef44c69a79fd59f8aa7364'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.329 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.330 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.330 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.330 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '769fba96-d840-46a3-9bb9-93f9df85b152', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:34:01.330190', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5f5a31d2-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.383486794, 'message_signature': '023b547ccf5048325fdad8de66016b21350296673518982d0a9826102de5b89a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:34:01.330190', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5f5a3b6e-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.383486794, 'message_signature': 'c924dd66cc0cdaea188fb97621112e90a364620108fa1dba226558197cd4a462'}]}, 'timestamp': '2025-11-28 09:34:01.330696', '_unique_id': '67b5a7e5d96f4a5f8b4c2db623f34a5d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28
09:34:01.331 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.331 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.332 12 INFO 
ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.332 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 86 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3639a01b-6e6f-45e7-af72-0cce6c3bbe15', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 86, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:34:01.332156', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 
'tap09612b07-51'}, 'message_id': '5f5a7f0c-cc3d-11f0-a370-fa163eb02593', 'monotonic_time': 10275.357300134, 'message_signature': '2630e23c4f5fe9521d0c888b5fafb46f9e66d53afb8bd382e8424944aec5cb08'}]}, 'timestamp': '2025-11-28 09:34:01.332450', '_unique_id': 'f2befa3af4c34ff4aea2f0d78606246a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:34:01.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:34:01 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:34:01.333 12 ERROR oslo_messaging.notify.messaging Nov 28 04:34:01 localhost nova_compute[228333]: 2025-11-28 09:34:01.736 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:34:01 localhost python3.9[236416]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False Nov 28 04:34:02 localhost python3.9[236526]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 28 04:34:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24986 DF PROTO=TCP SPT=33148 DPT=9102 SEQ=2463907536 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1EAE420000000001030307) Nov 28 04:34:03 localhost nova_compute[228333]: 2025-11-28 09:34:03.909 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:34:04 localhost python3[236636]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False Nov 28 04:34:04 localhost podman[236674]: Nov 28 04:34:04 localhost podman[236674]: 2025-11-28 
09:34:04.3792568 +0000 UTC m=+0.077564396 container create 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors ) Nov 28 04:34:04 localhost podman[236674]: 2025-11-28 09:34:04.339271319 +0000 UTC m=+0.037578945 image pull quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c Nov 28 04:34:04 localhost python3[236636]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label 
managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl Nov 28 04:34:06 localhost python3.9[236820]: ansible-ansible.builtin.stat Invoked 
with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:34:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45685 DF PROTO=TCP SPT=34304 DPT=9102 SEQ=594967161 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1EB9820000000001030307) Nov 28 04:34:06 localhost nova_compute[228333]: 2025-11-28 09:34:06.737 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:34:06 localhost python3.9[236932]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:34:07 localhost python3.9[237041]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764322446.952259-1703-110887842884263/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:34:08 localhost python3.9[237096]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 28 04:34:08 localhost systemd[1]: Reloading. 
Nov 28 04:34:08 localhost systemd-rc-local-generator[237119]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:34:08 localhost systemd-sysv-generator[237124]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:34:08 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:34:08 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:34:08 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:34:08 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:34:08 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 28 04:34:08 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:34:08 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:34:08 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:34:08 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:34:08 localhost nova_compute[228333]: 2025-11-28 09:34:08.941 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:34:09 localhost python3.9[237187]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:34:09 localhost systemd[1]: Reloading. Nov 28 04:34:09 localhost systemd-rc-local-generator[237212]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:34:09 localhost systemd-sysv-generator[237218]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 28 04:34:09 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:34:09 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:34:09 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:34:09 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:34:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:34:09 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:34:09 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:34:09 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:34:09 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:34:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65010 DF PROTO=TCP SPT=60840 DPT=9100 SEQ=2329009262 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1EC5820000000001030307) Nov 28 04:34:09 localhost systemd[1]: Starting node_exporter container... Nov 28 04:34:09 localhost systemd[1]: tmp-crun.yMxVMN.mount: Deactivated successfully. Nov 28 04:34:09 localhost systemd[1]: Started libcrun container. 
Nov 28 04:34:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 04:34:09 localhost podman[237228]: 2025-11-28 09:34:09.648768718 +0000 UTC m=+0.151432456 container init 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.669Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)" Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.669Z caller=node_exporter.go:181 level=info 
msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)" Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.669Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required." Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.670Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/) Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.670Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$ Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.670Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$ Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.670Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice) Nov 28 04:34:09 localhost 
node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:110 level=info msg="Enabled collectors" Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=arp Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=bcache Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=bonding Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=btrfs Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=conntrack Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=cpu Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=cpufreq Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=diskstats Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=edac Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=fibrechannel Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=filefd Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=filesystem Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=infiniband Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=ipvs Nov 28 
04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=loadavg Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=mdadm Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=meminfo Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=netclass Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=netdev Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=netstat Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=nfs Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=nfsd Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=nvme Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=schedstat Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=sockstat Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=softnet Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=systemd Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=tapestats Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=udp_queues 
Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=vmstat Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=xfs Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.671Z caller=node_exporter.go:117 level=info collector=zfs Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.672Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100 Nov 28 04:34:09 localhost node_exporter[237242]: ts=2025-11-28T09:34:09.672Z caller=tls_config.go:235 level=info msg="TLS is disabled." http2=false address=[::]:9100 Nov 28 04:34:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 04:34:09 localhost podman[237228]: 2025-11-28 09:34:09.689725182 +0000 UTC m=+0.192388890 container start 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': 
'/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 04:34:09 localhost podman[237228]: node_exporter Nov 28 04:34:09 localhost systemd[1]: Started node_exporter container. Nov 28 04:34:09 localhost podman[237251]: 2025-11-28 09:34:09.794141329 +0000 UTC m=+0.097664561 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=starting, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , 
managed_by=edpm_ansible) Nov 28 04:34:09 localhost podman[237251]: 2025-11-28 09:34:09.802996834 +0000 UTC m=+0.106520036 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 04:34:09 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. Nov 28 04:34:10 localhost systemd[1]: tmp-crun.VcsOSJ.mount: Deactivated successfully. 
Nov 28 04:34:10 localhost python3.9[237384]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 28 04:34:10 localhost systemd[1]: Stopping node_exporter container... Nov 28 04:34:10 localhost systemd[1]: libpod-49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.scope: Deactivated successfully. Nov 28 04:34:10 localhost podman[237388]: 2025-11-28 09:34:10.846279482 +0000 UTC m=+0.082680083 container died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 04:34:10 localhost systemd[1]: 
49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.timer: Deactivated successfully. Nov 28 04:34:10 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 04:34:10 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553-userdata-shm.mount: Deactivated successfully. Nov 28 04:34:10 localhost systemd[1]: var-lib-containers-storage-overlay-d489f0eb67e7eb4e89c33762ecb0c35a6142c1059d2b20cd6070670e7ee5ef23-merged.mount: Deactivated successfully. Nov 28 04:34:10 localhost podman[237388]: 2025-11-28 09:34:10.90112285 +0000 UTC m=+0.137523371 container cleanup 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', 
'/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 04:34:10 localhost podman[237388]: node_exporter Nov 28 04:34:10 localhost systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT Nov 28 04:34:10 localhost podman[237412]: 2025-11-28 09:34:10.992703736 +0000 UTC m=+0.064260271 container cleanup 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 04:34:10 localhost podman[237412]: node_exporter Nov 28 04:34:10 localhost systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'. 
Nov 28 04:34:10 localhost systemd[1]: Stopped node_exporter container. Nov 28 04:34:11 localhost systemd[1]: Starting node_exporter container... Nov 28 04:34:11 localhost systemd[1]: Started libcrun container. Nov 28 04:34:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 04:34:11 localhost podman[237425]: 2025-11-28 09:34:11.17186037 +0000 UTC m=+0.147027305 container init 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.187Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" 
version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)" Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.187Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)" Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.187Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required." Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.188Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$ Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.188Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.188Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice) Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/) Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" 
flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$ Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:110 level=info msg="Enabled collectors" Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=arp Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=bcache Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=bonding Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=btrfs Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=conntrack Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=cpu Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=cpufreq Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=diskstats Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=edac Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=fibrechannel Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=filefd Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=filesystem Nov 28 04:34:11 localhost node_exporter[237440]: 
ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=infiniband Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=ipvs Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=loadavg Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=mdadm Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=meminfo Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=netclass Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=netdev Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=netstat Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=nfs Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=nfsd Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=nvme Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=schedstat Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=sockstat Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=softnet Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=systemd Nov 28 04:34:11 localhost 
node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=tapestats Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=udp_queues Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=vmstat Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=xfs Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.189Z caller=node_exporter.go:117 level=info collector=zfs Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.190Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100 Nov 28 04:34:11 localhost node_exporter[237440]: ts=2025-11-28T09:34:11.190Z caller=tls_config.go:235 level=info msg="TLS is disabled." http2=false address=[::]:9100 Nov 28 04:34:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. 
Nov 28 04:34:11 localhost podman[237425]: 2025-11-28 09:34:11.20212697 +0000 UTC m=+0.177293865 container start 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 04:34:11 localhost podman[237425]: node_exporter Nov 28 04:34:11 localhost systemd[1]: Started node_exporter container. 
Nov 28 04:34:11 localhost podman[237449]: 2025-11-28 09:34:11.292035463 +0000 UTC m=+0.082085733 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=starting, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 04:34:11 localhost podman[237449]: 2025-11-28 09:34:11.324670489 +0000 UTC m=+0.114720709 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 04:34:11 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 04:34:11 localhost nova_compute[228333]: 2025-11-28 09:34:11.739 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:34:11 localhost python3.9[237582]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:34:12 localhost python3.9[237670]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322451.4197206-1799-273962096826046/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Nov 28 04:34:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46748 DF PROTO=TCP SPT=41968 DPT=9100 SEQ=2251484439 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1ED1820000000001030307) Nov 28 04:34:13 localhost python3.9[237780]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False Nov 28 04:34:13 localhost nova_compute[228333]: 2025-11-28 09:34:13.945 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:34:14 localhost python3.9[237890]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 28 04:34:15 localhost 
python3[238000]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False Nov 28 04:34:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 04:34:15 localhost systemd[1]: tmp-crun.WmSqBS.mount: Deactivated successfully. Nov 28 04:34:15 localhost podman[238028]: 2025-11-28 09:34:15.853464815 +0000 UTC m=+0.084747068 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true) Nov 28 04:34:15 localhost podman[238028]: 2025-11-28 09:34:15.860944775 +0000 UTC m=+0.092227078 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', 
'/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Nov 28 04:34:15 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. Nov 28 04:34:16 localhost nova_compute[228333]: 2025-11-28 09:34:16.741 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:34:17 localhost podman[238015]: 2025-11-28 09:34:15.278996247 +0000 UTC m=+0.031158329 image pull quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd Nov 28 04:34:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60796 DF PROTO=TCP SPT=60144 DPT=9882 SEQ=1095107216 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1EE4020000000001030307) Nov 28 04:34:17 localhost podman[238109]: Nov 28 04:34:17 localhost podman[238109]: 2025-11-28 09:34:17.219671847 +0000 UTC m=+0.074453568 container create 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 
'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi , config_id=edpm) Nov 28 04:34:17 localhost podman[238109]: 2025-11-28 09:34:17.177729752 +0000 UTC m=+0.032511493 image pull quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd Nov 28 04:34:17 localhost python3[238000]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd Nov 28 04:34:18 localhost python3.9[238255]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True 
get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:34:18 localhost nova_compute[228333]: 2025-11-28 09:34:18.978 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:34:19 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7022 DF PROTO=TCP SPT=48914 DPT=9105 SEQ=2279364401 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1EEB820000000001030307) Nov 28 04:34:19 localhost python3.9[238367]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:34:20 localhost python3.9[238476]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764322459.825107-1958-33736602754692/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:34:21 localhost python3.9[238531]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 28 04:34:21 localhost systemd[1]: Reloading. 
Nov 28 04:34:21 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7023 DF PROTO=TCP SPT=48914 DPT=9105 SEQ=2279364401 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB1EF3830000000001030307) Nov 28 04:34:21 localhost systemd-sysv-generator[238556]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:34:21 localhost systemd-rc-local-generator[238551]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:34:21 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:34:21 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:14 localhost python3.9[256606]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:41:14 localhost rsyslogd[759]: imjournal: 6598 messages lost due to rate-limiting (20000 allowed within 600 seconds) Nov 28 04:41:14 localhost python3.9[256692]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322873.582973-542-34120828442781/.source follow=False _original_basename=haproxy.j2 checksum=e4288860049c1baef23f6e1bb6c6f91acb5432e7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:41:14 
localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 04:41:14 localhost podman[256748]: 2025-11-28 09:41:14.850005722 +0000 UTC m=+0.083166410 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.openshift.expose-services=, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, distribution-scope=public, com.redhat.component=ubi9-minimal-container) Nov 28 04:41:14 localhost podman[256748]: 2025-11-28 09:41:14.888185131 +0000 UTC m=+0.121345809 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, version=9.6, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=edpm) Nov 28 04:41:14 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. Nov 28 04:41:15 localhost python3.9[256856]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:41:15 localhost nova_compute[228333]: 2025-11-28 09:41:15.507 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:41:15 localhost python3.9[256961]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322874.6889603-542-233357954995091/.source follow=False _original_basename=dnsmasq.j2 checksum=efc19f376a79c40570368e9c2b979cde746f1ea8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:41:16 localhost python3.9[257081]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:41:17 localhost nova_compute[228333]: 2025-11-28 09:41:17.540 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:41:17 localhost python3.9[257136]: ansible-ansible.legacy.file Invoked with mode=0755 
setype=container_file_t dest=/var/lib/neutron/kill_scripts/haproxy-kill _original_basename=kill-script.j2 recurse=False state=file path=/var/lib/neutron/kill_scripts/haproxy-kill force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:41:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40560 DF PROTO=TCP SPT=33708 DPT=9102 SEQ=2153988774 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB254F820000000001030307) Nov 28 04:41:18 localhost openstack_network_exporter[240658]: ERROR 09:41:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:41:18 localhost openstack_network_exporter[240658]: ERROR 09:41:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:41:18 localhost openstack_network_exporter[240658]: ERROR 09:41:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 04:41:18 localhost openstack_network_exporter[240658]: ERROR 09:41:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 04:41:18 localhost openstack_network_exporter[240658]: Nov 28 04:41:18 localhost openstack_network_exporter[240658]: ERROR 09:41:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 04:41:18 localhost openstack_network_exporter[240658]: Nov 28 04:41:18 localhost python3.9[257244]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/dnsmasq-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True 
get_selinux_context=False Nov 28 04:41:18 localhost python3.9[257348]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/dnsmasq-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764322877.7987523-629-127520689412943/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:41:20 localhost python3.9[257456]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:41:20 localhost nova_compute[228333]: 2025-11-28 09:41:20.540 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:41:20 localhost python3.9[257568]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:41:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. 
Nov 28 04:41:21 localhost podman[257679]: 2025-11-28 09:41:21.474649704 +0000 UTC m=+0.083098658 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 28 04:41:21 localhost podman[257679]: 2025-11-28 09:41:21.482681155 +0000 UTC m=+0.091130159 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 04:41:21 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 04:41:21 localhost python3.9[257678]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:41:22 localhost python3.9[257758]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:41:22 localhost nova_compute[228333]: 2025-11-28 09:41:22.562 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:41:22 localhost python3.9[257868]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:41:23 localhost python3.9[257925]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:41:23 localhost python3.9[258035]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:41:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:41:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:41:24 localhost podman[258146]: 2025-11-28 09:41:24.559183002 +0000 UTC m=+0.082307431 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 28 04:41:24 localhost podman[258147]: 2025-11-28 09:41:24.629496754 +0000 UTC m=+0.145790952 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 04:41:24 localhost podman[258146]: 2025-11-28 09:41:24.639010684 +0000 UTC m=+0.162135123 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0) Nov 28 04:41:24 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: 
Deactivated successfully. Nov 28 04:41:24 localhost podman[258147]: 2025-11-28 09:41:24.660351616 +0000 UTC m=+0.176645754 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2) Nov 28 04:41:24 localhost python3.9[258145]: ansible-ansible.legacy.stat Invoked 
with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:41:24 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:41:25 localhost python3.9[258246]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:41:25 localhost nova_compute[228333]: 2025-11-28 09:41:25.543 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:41:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. 
Nov 28 04:41:25 localhost python3.9[258356]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:41:25 localhost podman[258357]: 2025-11-28 09:41:25.85812528 +0000 UTC m=+0.086672024 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:41:25 localhost podman[258357]: 2025-11-28 09:41:25.89849027 +0000 UTC m=+0.127037014 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:41:25 
localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. Nov 28 04:41:26 localhost python3.9[258432]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:41:27 localhost python3.9[258542]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:41:27 localhost systemd[1]: Reloading. Nov 28 04:41:27 localhost systemd-rc-local-generator[258564]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:41:27 localhost systemd-sysv-generator[258567]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 28 04:41:27 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:27 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:27 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:27 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:27 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:41:27 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:27 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:27 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:27 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:27 localhost nova_compute[228333]: 2025-11-28 09:41:27.601 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:41:28 localhost python3.9[258690]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:41:28 localhost python3.9[258747]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service 
_original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:41:29 localhost python3.9[258857]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:41:29 localhost python3.9[258914]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:41:30 localhost nova_compute[228333]: 2025-11-28 09:41:30.579 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:41:30 localhost python3.9[259024]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:41:30 localhost systemd[1]: Reloading. Nov 28 04:41:30 localhost systemd-rc-local-generator[259046]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:41:30 localhost systemd-sysv-generator[259053]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:41:30 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:30 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:30 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:30 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:41:30 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:30 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:30 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:30 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:31 localhost systemd[1]: Starting Create netns directory... Nov 28 04:41:31 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Nov 28 04:41:31 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Nov 28 04:41:31 localhost systemd[1]: Finished Create netns directory. 
Nov 28 04:41:32 localhost python3.9[259176]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:41:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41788 DF PROTO=TCP SPT=43694 DPT=9102 SEQ=689482158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2588040000000001030307) Nov 28 04:41:32 localhost nova_compute[228333]: 2025-11-28 09:41:32.632 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:41:32 localhost python3.9[259286]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_dhcp_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:41:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 04:41:33 localhost systemd[1]: tmp-crun.yUZ8UQ.mount: Deactivated successfully. 
Nov 28 04:41:33 localhost podman[259375]: 2025-11-28 09:41:33.370997439 +0000 UTC m=+0.094041953 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 04:41:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41789 DF PROTO=TCP SPT=43694 DPT=9102 SEQ=689482158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB258C020000000001030307) Nov 28 04:41:33 localhost podman[259375]: 2025-11-28 09:41:33.419499463 +0000 UTC m=+0.142543927 container exec_died 
49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 04:41:33 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 04:41:33 localhost python3.9[259374]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_dhcp_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764322892.4152684-1073-276583804516189/.source.json _original_basename=.v9_fvik_ follow=False checksum=c62829c98c0f9e788d62f52aa71fba276cd98270 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:41:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40561 DF PROTO=TCP SPT=33708 DPT=9102 SEQ=2153988774 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB258F820000000001030307) Nov 28 04:41:34 localhost python3.9[259506]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_dhcp state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:41:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41790 DF PROTO=TCP SPT=43694 DPT=9102 SEQ=689482158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2594020000000001030307) Nov 28 04:41:35 localhost nova_compute[228333]: 2025-11-28 09:41:35.604 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:41:35 localhost systemd[1]: Started /usr/bin/podman 
healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 04:41:35 localhost systemd[1]: tmp-crun.seBCVf.mount: Deactivated successfully. Nov 28 04:41:35 localhost podman[259705]: 2025-11-28 09:41:35.712491561 +0000 UTC m=+0.083478479 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_id=multipathd, 
io.buildah.version=1.41.3) Nov 28 04:41:35 localhost podman[259705]: 2025-11-28 09:41:35.748727307 +0000 UTC m=+0.119714205 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible) Nov 28 04:41:35 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 04:41:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22070 DF PROTO=TCP SPT=59194 DPT=9102 SEQ=4294502352 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2597820000000001030307) Nov 28 04:41:36 localhost python3.9[259833]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_pattern=*.json debug=False Nov 28 04:41:37 localhost nova_compute[228333]: 2025-11-28 09:41:37.644 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:41:37 localhost python3.9[259943]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 28 04:41:38 localhost python3.9[260053]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Nov 28 04:41:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41791 DF PROTO=TCP SPT=43694 DPT=9102 SEQ=689482158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB25A3C20000000001030307) Nov 28 04:41:40 localhost podman[238687]: time="2025-11-28T09:41:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 04:41:40 localhost podman[238687]: @ - - [28/Nov/2025:09:41:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146307 "" "Go-http-client/1.1" Nov 28 04:41:40 localhost podman[238687]: @ - - [28/Nov/2025:09:41:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16788 "" "Go-http-client/1.1" Nov 28 04:41:40 localhost nova_compute[228333]: 
2025-11-28 09:41:40.633 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:41:42 localhost nova_compute[228333]: 2025-11-28 09:41:42.659 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:41:44 localhost python3[260190]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_id=neutron_dhcp config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False Nov 28 04:41:44 localhost podman[260229]: Nov 28 04:41:44 localhost podman[260229]: 2025-11-28 09:41:44.698335756 +0000 UTC m=+0.094912111 container create 41dbdfa347a213d049fb00ac3c34299901fd76ab2d5dd16b179c3895a0118a85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, config_id=neutron_dhcp, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5020394c74ae3012d879e5ff3e11d0579506779b70afd59ced150226597d9373'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, container_name=neutron_dhcp_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2) Nov 28 04:41:44 localhost podman[260229]: 2025-11-28 09:41:44.648277841 +0000 UTC m=+0.044854296 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 04:41:44 localhost python3[260190]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_dhcp_agent --cgroupns=host --conmon-pidfile /run/neutron_dhcp_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=5020394c74ae3012d879e5ff3e11d0579506779b70afd59ced150226597d9373 --label config_id=neutron_dhcp --label container_name=neutron_dhcp_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5020394c74ae3012d879e5ff3e11d0579506779b70afd59ced150226597d9373'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/netns:/run/netns:shared --volume /var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 04:41:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 04:41:45 localhost podman[260375]: 2025-11-28 09:41:45.461351819 +0000 UTC m=+0.083847851 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., version=9.6, distribution-scope=public, name=ubi9-minimal, release=1755695350, vcs-type=git, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter) Nov 28 04:41:45 localhost podman[260375]: 2025-11-28 09:41:45.474806946 +0000 UTC m=+0.097302968 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, name=ubi9-minimal, release=1755695350, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter) Nov 28 04:41:45 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 04:41:45 localhost python3.9[260376]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:41:45 localhost nova_compute[228333]: 2025-11-28 09:41:45.636 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:41:46 localhost python3.9[260507]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:41:46 localhost nova_compute[228333]: 2025-11-28 09:41:46.680 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:41:46 localhost python3.9[260562]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:41:47 localhost python3.9[260671]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764322906.910816-1337-43812824345335/source dest=/etc/systemd/system/edpm_neutron_dhcp_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 
04:41:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41792 DF PROTO=TCP SPT=43694 DPT=9102 SEQ=689482158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB25C3820000000001030307) Nov 28 04:41:47 localhost nova_compute[228333]: 2025-11-28 09:41:47.687 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:41:48 localhost openstack_network_exporter[240658]: ERROR 09:41:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 04:41:48 localhost openstack_network_exporter[240658]: ERROR 09:41:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:41:48 localhost openstack_network_exporter[240658]: ERROR 09:41:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:41:48 localhost openstack_network_exporter[240658]: ERROR 09:41:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 04:41:48 localhost openstack_network_exporter[240658]: Nov 28 04:41:48 localhost openstack_network_exporter[240658]: ERROR 09:41:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 04:41:48 localhost openstack_network_exporter[240658]: Nov 28 04:41:48 localhost python3.9[260726]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 28 04:41:48 localhost systemd[1]: Reloading. Nov 28 04:41:48 localhost systemd-rc-local-generator[260748]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 28 04:41:48 localhost systemd-sysv-generator[260753]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:41:48 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:48 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:48 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:48 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:48 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:41:48 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:48 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:48 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:48 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:49 localhost python3.9[260817]: ansible-systemd Invoked with state=restarted name=edpm_neutron_dhcp_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:41:49 localhost systemd[1]: Reloading. 
Nov 28 04:41:49 localhost systemd-rc-local-generator[260845]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:41:49 localhost systemd-sysv-generator[260850]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:41:49 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:49 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:49 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:49 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:49 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:41:49 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:49 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:49 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:49 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:41:49 localhost systemd[1]: Starting neutron_dhcp_agent container... Nov 28 04:41:49 localhost systemd[1]: tmp-crun.XmGJKY.mount: Deactivated successfully. Nov 28 04:41:49 localhost systemd[1]: Started libcrun container. 
Nov 28 04:41:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83921fe5428134accaa21c01a209a23cffcc43ff5cc885ed88771872ecf2a1dd/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Nov 28 04:41:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83921fe5428134accaa21c01a209a23cffcc43ff5cc885ed88771872ecf2a1dd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 04:41:49 localhost podman[260859]: 2025-11-28 09:41:49.672295966 +0000 UTC m=+0.129017508 container init 41dbdfa347a213d049fb00ac3c34299901fd76ab2d5dd16b179c3895a0118a85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5020394c74ae3012d879e5ff3e11d0579506779b70afd59ced150226597d9373'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.vendor=CentOS, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=neutron_dhcp_agent, config_id=neutron_dhcp, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: + sudo -E kolla_set_configs Nov 28 04:41:49 localhost podman[260859]: 2025-11-28 09:41:49.688652716 +0000 UTC m=+0.145374258 container start 41dbdfa347a213d049fb00ac3c34299901fd76ab2d5dd16b179c3895a0118a85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=neutron_dhcp, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5020394c74ae3012d879e5ff3e11d0579506779b70afd59ced150226597d9373'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_dhcp_agent) Nov 28 
04:41:49 localhost podman[260859]: neutron_dhcp_agent Nov 28 04:41:49 localhost systemd[1]: Started neutron_dhcp_agent container. Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: INFO:__main__:Validating config file Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: INFO:__main__:Copying service configuration files Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: INFO:__main__:Writing out command to execute Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/.cache Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/external Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for 
/var/lib/neutron/metadata_proxy Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934 Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/a99cc267a5c3ade03c88b3bb0a43299c9bb62825df6f4ca0c30c03cccfac55c1 Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/40d5da59-6201-424a-8380-80ecc3d67c7e.pid.haproxy Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/40d5da59-6201-424a-8380-80ecc3d67c7e.conf Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: ++ cat /run_command Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: + CMD=/usr/bin/neutron-dhcp-agent Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: + ARGS= Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: + sudo kolla_copy_cacerts Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: + [[ ! -n '' ]] Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: + . 
kolla_extend_start Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\''' Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: Running command: '/usr/bin/neutron-dhcp-agent' Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: + umask 0022 Nov 28 04:41:49 localhost neutron_dhcp_agent[260873]: + exec /usr/bin/neutron-dhcp-agent Nov 28 04:41:50 localhost python3.9[260997]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_dhcp_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 28 04:41:50 localhost systemd[1]: Stopping neutron_dhcp_agent container... Nov 28 04:41:50 localhost systemd[1]: tmp-crun.ZIc1A9.mount: Deactivated successfully. Nov 28 04:41:50 localhost nova_compute[228333]: 2025-11-28 09:41:50.680 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:41:50 localhost nova_compute[228333]: 2025-11-28 09:41:50.693 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:41:50 localhost systemd[1]: libpod-41dbdfa347a213d049fb00ac3c34299901fd76ab2d5dd16b179c3895a0118a85.scope: Deactivated successfully. Nov 28 04:41:50 localhost systemd[1]: libpod-41dbdfa347a213d049fb00ac3c34299901fd76ab2d5dd16b179c3895a0118a85.scope: Consumed 1.010s CPU time. 
Nov 28 04:41:50 localhost podman[261001]: 2025-11-28 09:41:50.703578996 +0000 UTC m=+0.119188020 container died 41dbdfa347a213d049fb00ac3c34299901fd76ab2d5dd16b179c3895a0118a85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, container_name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5020394c74ae3012d879e5ff3e11d0579506779b70afd59ced150226597d9373'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.schema-version=1.0, config_id=neutron_dhcp, org.label-schema.vendor=CentOS) Nov 28 04:41:50 localhost nova_compute[228333]: 2025-11-28 09:41:50.716 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:41:50 localhost nova_compute[228333]: 2025-11-28 09:41:50.716 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:41:50 localhost nova_compute[228333]: 2025-11-28 09:41:50.716 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:41:50 localhost nova_compute[228333]: 2025-11-28 09:41:50.717 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 04:41:50 localhost nova_compute[228333]: 2025-11-28 09:41:50.717 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:41:50 localhost podman[261001]: 2025-11-28 09:41:50.799208299 +0000 UTC m=+0.214817323 container cleanup 41dbdfa347a213d049fb00ac3c34299901fd76ab2d5dd16b179c3895a0118a85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, 
tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5020394c74ae3012d879e5ff3e11d0579506779b70afd59ced150226597d9373'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=neutron_dhcp, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=neutron_dhcp_agent) Nov 28 04:41:50 localhost podman[261001]: neutron_dhcp_agent Nov 28 04:41:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:41:50.818 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:41:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:41:50.818 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:41:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:41:50.819 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:41:50 localhost podman[261065]: error opening file `/run/crun/41dbdfa347a213d049fb00ac3c34299901fd76ab2d5dd16b179c3895a0118a85/status`: No such file or directory Nov 28 04:41:50 localhost podman[261035]: 2025-11-28 09:41:50.912798325 +0000 UTC m=+0.070798328 container cleanup 41dbdfa347a213d049fb00ac3c34299901fd76ab2d5dd16b179c3895a0118a85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, container_name=neutron_dhcp_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=neutron_dhcp, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5020394c74ae3012d879e5ff3e11d0579506779b70afd59ced150226597d9373'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}) Nov 28 04:41:50 localhost podman[261035]: neutron_dhcp_agent Nov 28 04:41:50 localhost systemd[1]: edpm_neutron_dhcp_agent.service: Deactivated successfully. Nov 28 04:41:50 localhost systemd[1]: Stopped neutron_dhcp_agent container. Nov 28 04:41:50 localhost systemd[1]: Starting neutron_dhcp_agent container... Nov 28 04:41:51 localhost systemd[1]: Started libcrun container. Nov 28 04:41:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83921fe5428134accaa21c01a209a23cffcc43ff5cc885ed88771872ecf2a1dd/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Nov 28 04:41:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83921fe5428134accaa21c01a209a23cffcc43ff5cc885ed88771872ecf2a1dd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 04:41:51 localhost podman[261067]: 2025-11-28 09:41:51.063743484 +0000 UTC m=+0.112593645 container init 41dbdfa347a213d049fb00ac3c34299901fd76ab2d5dd16b179c3895a0118a85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=neutron_dhcp_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5020394c74ae3012d879e5ff3e11d0579506779b70afd59ced150226597d9373'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 
'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=neutron_dhcp, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:41:51 localhost podman[261067]: 2025-11-28 09:41:51.072659733 +0000 UTC m=+0.121509854 container start 41dbdfa347a213d049fb00ac3c34299901fd76ab2d5dd16b179c3895a0118a85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=neutron_dhcp_agent, org.label-schema.build-date=20251125, tcib_managed=true, config_id=neutron_dhcp, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5020394c74ae3012d879e5ff3e11d0579506779b70afd59ced150226597d9373'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', 
'/var/lib/config-data/ansible-generated/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}) Nov 28 04:41:51 localhost podman[261067]: neutron_dhcp_agent Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: + sudo -E kolla_set_configs Nov 28 04:41:51 localhost systemd[1]: Started neutron_dhcp_agent container. Nov 28 04:41:51 localhost nova_compute[228333]: 2025-11-28 09:41:51.143 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: INFO:__main__:Validating config file Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: INFO:__main__:Copying service configuration files Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf Nov 28 04:41:51 localhost 
neutron_dhcp_agent[261080]: INFO:__main__:Writing out command to execute Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/.cache Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/external Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934 Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: INFO:__main__:Setting 
permission for /var/lib/neutron/.cache/python-entrypoints/a99cc267a5c3ade03c88b3bb0a43299c9bb62825df6f4ca0c30c03cccfac55c1 Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/40d5da59-6201-424a-8380-80ecc3d67c7e.pid.haproxy Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/40d5da59-6201-424a-8380-80ecc3d67c7e.conf Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: ++ cat /run_command Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: + CMD=/usr/bin/neutron-dhcp-agent Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: + ARGS= Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: + sudo kolla_copy_cacerts Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: + [[ ! -n '' ]] Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: + . 
kolla_extend_start Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: Running command: '/usr/bin/neutron-dhcp-agent' Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\''' Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: + umask 0022 Nov 28 04:41:51 localhost neutron_dhcp_agent[261080]: + exec /usr/bin/neutron-dhcp-agent Nov 28 04:41:51 localhost nova_compute[228333]: 2025-11-28 09:41:51.213 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:41:51 localhost nova_compute[228333]: 2025-11-28 09:41:51.213 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:41:51 localhost nova_compute[228333]: 2025-11-28 09:41:51.384 228337 WARNING nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 04:41:51 localhost nova_compute[228333]: 2025-11-28 09:41:51.385 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=12201MB free_disk=41.837093353271484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", 
"product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 04:41:51 localhost nova_compute[228333]: 2025-11-28 09:41:51.385 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:41:51 localhost nova_compute[228333]: 2025-11-28 09:41:51.385 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:41:51 localhost nova_compute[228333]: 2025-11-28 09:41:51.459 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 04:41:51 localhost nova_compute[228333]: 2025-11-28 09:41:51.459 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 04:41:51 localhost nova_compute[228333]: 2025-11-28 09:41:51.460 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 04:41:51 localhost nova_compute[228333]: 2025-11-28 09:41:51.492 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:41:51 localhost systemd[1]: session-58.scope: Deactivated successfully. Nov 28 04:41:51 localhost systemd[1]: session-58.scope: Consumed 33.814s CPU time. Nov 28 04:41:51 localhost systemd-logind[764]: Session 58 logged out. Waiting for processes to exit. Nov 28 04:41:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 04:41:51 localhost systemd-logind[764]: Removed session 58. Nov 28 04:41:51 localhost systemd[1]: tmp-crun.h8bdlP.mount: Deactivated successfully. 
Nov 28 04:41:51 localhost podman[261114]: 2025-11-28 09:41:51.646525618 +0000 UTC m=+0.090058583 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 04:41:51 localhost podman[261114]: 2025-11-28 09:41:51.657511195 +0000 UTC m=+0.101044200 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 04:41:51 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 04:41:51 localhost nova_compute[228333]: 2025-11-28 09:41:51.949 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:41:51 localhost nova_compute[228333]: 2025-11-28 09:41:51.955 228337 DEBUG nova.compute.provider_tree [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 04:41:51 localhost nova_compute[228333]: 2025-11-28 09:41:51.972 228337 DEBUG nova.scheduler.client.report [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 04:41:51 localhost nova_compute[228333]: 2025-11-28 09:41:51.975 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Compute_service record updated for 
np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 04:41:51 localhost nova_compute[228333]: 2025-11-28 09:41:51.975 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.590s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:41:52 localhost neutron_dhcp_agent[261080]: 2025-11-28 09:41:52.344 261084 INFO neutron.common.config [-] Logging enabled!#033[00m Nov 28 04:41:52 localhost neutron_dhcp_agent[261080]: 2025-11-28 09:41:52.344 261084 INFO neutron.common.config [-] /usr/bin/neutron-dhcp-agent version 22.2.2.dev43#033[00m Nov 28 04:41:52 localhost neutron_dhcp_agent[261080]: 2025-11-28 09:41:52.704 261084 INFO neutron.agent.dhcp.agent [-] Synchronizing state#033[00m Nov 28 04:41:52 localhost nova_compute[228333]: 2025-11-28 09:41:52.734 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:41:52 localhost nova_compute[228333]: 2025-11-28 09:41:52.976 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:41:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 09:41:53.592 261084 INFO neutron.agent.dhcp.agent [None req-bd797fc6-33fd-41fe-a146-138b235f669c - - - - - -] All active networks have been fetched through RPC.#033[00m Nov 28 04:41:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 09:41:53.592 261084 INFO neutron.agent.dhcp.agent [None req-bd797fc6-33fd-41fe-a146-138b235f669c - - - - - -] Synchronizing state complete#033[00m Nov 28 04:41:53 
localhost neutron_dhcp_agent[261080]: 2025-11-28 09:41:53.644 261084 INFO neutron.agent.dhcp.agent [None req-bd797fc6-33fd-41fe-a146-138b235f669c - - - - - -] DHCP agent started#033[00m Nov 28 04:41:53 localhost nova_compute[228333]: 2025-11-28 09:41:53.677 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:41:54 localhost ovn_metadata_agent[158125]: 2025-11-28 09:41:54.258 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 04:41:54 localhost ovn_metadata_agent[158125]: 2025-11-28 09:41:54.259 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 28 04:41:54 localhost ovn_metadata_agent[158125]: 2025-11-28 09:41:54.261 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 04:41:54 localhost nova_compute[228333]: 2025-11-28 09:41:54.293 228337 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:41:54 localhost nova_compute[228333]: 2025-11-28 09:41:54.681 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:41:54 localhost nova_compute[228333]: 2025-11-28 09:41:54.681 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 28 04:41:54 localhost nova_compute[228333]: 2025-11-28 09:41:54.682 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 28 04:41:54 localhost nova_compute[228333]: 2025-11-28 09:41:54.744 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 04:41:54 localhost nova_compute[228333]: 2025-11-28 09:41:54.744 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 04:41:54 localhost nova_compute[228333]: 2025-11-28 09:41:54.745 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 28 04:41:54 localhost nova_compute[228333]: 2025-11-28 09:41:54.745 228337 DEBUG nova.objects.instance [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 04:41:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:41:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:41:54 localhost podman[261161]: 2025-11-28 09:41:54.838270007 +0000 UTC m=+0.075929176 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, 
org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS) Nov 28 04:41:54 localhost podman[261162]: 2025-11-28 09:41:54.894203522 +0000 UTC m=+0.127740607 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:41:54 localhost podman[261162]: 2025-11-28 09:41:54.898406018 +0000 UTC m=+0.131943123 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251125) Nov 28 04:41:54 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:41:54 localhost podman[261161]: 2025-11-28 09:41:54.950144857 +0000 UTC m=+0.187804036 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3) Nov 28 04:41:54 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 04:41:55 localhost nova_compute[228333]: 2025-11-28 09:41:55.695 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:41:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 04:41:56 localhost nova_compute[228333]: 2025-11-28 09:41:56.791 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 04:41:56 localhost nova_compute[228333]: 2025-11-28 09:41:56.818 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Releasing 
lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 04:41:56 localhost nova_compute[228333]: 2025-11-28 09:41:56.819 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 28 04:41:56 localhost nova_compute[228333]: 2025-11-28 09:41:56.819 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:41:56 localhost nova_compute[228333]: 2025-11-28 09:41:56.820 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:41:56 localhost nova_compute[228333]: 2025-11-28 09:41:56.820 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:41:56 localhost nova_compute[228333]: 2025-11-28 09:41:56.820 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 28 04:41:56 localhost podman[261205]: 2025-11-28 09:41:56.851286978 +0000 UTC m=+0.083880383 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm) Nov 28 04:41:56 localhost podman[261205]: 
2025-11-28 09:41:56.860181647 +0000 UTC m=+0.092775032 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 28 04:41:56 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 04:41:57 localhost nova_compute[228333]: 2025-11-28 09:41:57.738 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.669 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.670 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 28 04:42:00 localhost nova_compute[228333]: 2025-11-28 09:42:00.682 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.685 12 DEBUG 
ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.685 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8397e059-e03f-4c2c-b93c-7efc7b229807', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:42:00.670801', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 
'7d12023a-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.842700574, 'message_signature': '7272ee69b6c3ad5790e25d857bf8f3f6cc3fb6d0bcaa1fb6cb6ab0b44bfb9c16'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:42:00.670801', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7d1215e0-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.842700574, 'message_signature': '459e264602b02946342039ad59db6f28e22fcbafbc4631d6c0a4381332bd65ae'}]}, 'timestamp': '2025-11-28 09:42:00.686344', '_unique_id': '191fd58ed7b14f6abdb80306ca840bc2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 
04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.687 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.688 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 28 04:42:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.692 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7940f4ba-5fd7-47f7-bb44-59f587d6da61', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:42:00.689045', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '7d1318be-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.860945475, 
'message_signature': 'a276e1fefc3092284babd1c473b142e802014933671d22ca806e0a9960554599'}]}, 'timestamp': '2025-11-28 09:42:00.692994', '_unique_id': 'e9df2e1c49904e73b2becdc2e2fd2a9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.693 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.695 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 28 04:42:00 localhost nova_compute[228333]: 2025-11-28 09:42:00.725 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.756 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.756 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '6c25dc02-7029-4cc6-8760-2e6978bee3b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:42:00.695276', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7d1ccc60-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.867168707, 'message_signature': '19bca1fb4f2f6a3b8fdabd527f6867aec17f712198346aa8c99f8bf82f090d93'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:42:00.695276', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7d1cdd18-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.867168707, 'message_signature': 'dfc4e801fb6f2ed7745d687affcd094e865b1b0c94c65bd32c7f7489dbf31e90'}]}, 'timestamp': '2025-11-28 09:42:00.756967', '_unique_id': 'da07a98f340f4a07b990431baaa92efc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.757 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.759 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.759 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 89 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'adcbcd3d-e864-43ca-949b-7446bb6d6d71', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 89, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:42:00.759214', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '7d1d45be-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.860945475, 'message_signature': '12f318c3b9463c4b16a303c4dd4b2703d84ab1979855008e9c69e231d380b3f2'}]}, 'timestamp': '2025-11-28 09:42:00.759677', '_unique_id': '444249e46bb24d1b82a5ed885966ef20'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:42:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.760 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.762 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.762 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.762 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b248eb55-a3dc-493d-afe1-a19c9eba1827', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:42:00.762177', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7d1db92c-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.867168707, 'message_signature': '9a3d065680b2c9102d520b2075b996404173d4d8cc76ef6f4b016f2eef1ba69d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:42:00.762177', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7d1dc944-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.867168707, 'message_signature': '645de6d23ac77140be779fb1bc8039a0986d4af52387a37c2c6a4eadf8aec074'}]}, 'timestamp': '2025-11-28 09:42:00.763010', '_unique_id': '22bceb57404f4ac1881119ffc8ef9277'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.763 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.765 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.765 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba1bb1fe-174f-4e8f-add4-39db5d61820f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:42:00.765383', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '7d1e36ea-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.860945475, 'message_signature': '37f2c2f172b0568edb7fff26cdcbd9d69ad80fd770c7d78df269aca6e8b89fe1'}]}, 'timestamp': '2025-11-28 09:42:00.765852', '_unique_id': '491b698a778f481ab38deec370663bc8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.766 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.767 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.768 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1313024378 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.768 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 168520223 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae62639f-9bb6-43ae-af7c-9a49824ce051', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1313024378, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:42:00.768083', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7d1ea0da-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.867168707, 'message_signature': '3e87c5c1b76b133f6b668ddf6ae1b4827d41c8c65d52e1aded3c79078d535598'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 168520223, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:42:00.768083', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7d1eb16a-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.867168707, 'message_signature': '25d7ed23ff8ba5d911fed983dbe68ddc66310d43056d62a27fd60cbd1c4f5dec'}]}, 'timestamp': '2025-11-28 09:42:00.768957', '_unique_id': '49706601d5554413b1cca1bda2482302'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.769 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.771 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.771 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.771 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6758c450-1ec0-4d51-a1d0-0ed94312bd40', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:42:00.771405', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '7d1f2276-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.860945475, 'message_signature': '9aad6023e07961ed1e05a3bddfe45581b3f733d616aa1632b8974e1fba53474f'}]}, 'timestamp': '2025-11-28 09:42:00.771912', '_unique_id': '2cca8b4ad6e847428515e5fb7dd1f7ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28
09:42:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.772 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.773 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.774 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.774 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b30325f2-55ba-41dd-a08e-fbf478b8574e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:42:00.774089', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7d1f8a5e-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.842700574, 'message_signature': '6cd83d7bfeca754f996321fe9539723382d4098c8944f3d02252cb2100d66dc6'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:42:00.774089', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 
'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7d1f9a08-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.842700574, 'message_signature': '586db6e4d3271d181e4021788f149eeddee41f96535f7f0518691a16b17e34f9'}]}, 'timestamp': '2025-11-28 09:42:00.774909', '_unique_id': '2a29994f9fcc482f8a85a6541344e36e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:42:00.775 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:42:00.775 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.775 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.777 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.777 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 266 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '64cc9f67-46d6-4347-b9d4-d1f9f472db5d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 266, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:42:00.777176', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '7d200312-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.860945475, 'message_signature': 'a2df22ad6e6c948c653535a1a268bfd9e70a172e3eba4228880586f8d9cd74a9'}]}, 'timestamp': '2025-11-28 09:42:00.777627', '_unique_id': '149bc3c9a80549249689902f64043eb0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:42:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.778 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.779 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.799 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 49930000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '21b4c679-f126-4515-b78d-f131268bbb0a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 49930000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:42:00.779756', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '7d236d86-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.971296987, 
'message_signature': '63d0ad2f14cdc76fde1ad58e9c7f56fb1539bfa45a8e0ab7576f5936f57f5445'}]}, 'timestamp': '2025-11-28 09:42:00.800128', '_unique_id': '502f7c8b5afc4f389ee2d2ab9b864b7e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.801 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.802 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.802 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.802 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 73981952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.802 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b91abf40-df2e-40e3-9785-ed2537c4af2e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73981952, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:42:00.802455', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7d23dec4-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.867168707, 'message_signature': '1f5b2327931a5ed5d67c3bbcda184724c6847584b3c8a8e2c1468c8eacf807a7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:42:00.802455', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7d23efb8-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.867168707, 'message_signature': '94c0db7d1b9e243c931870539d6e77a08309418cea8ce5a561e56472cd962e7c'}]}, 'timestamp': '2025-11-28 09:42:00.803317', '_unique_id': '60f00974a46741fc8632f0335d759007'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.804 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.805 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.805 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a0801ef0-a08f-4721-866c-77ddf55bee09', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:42:00.805468', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '7d245444-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.860945475, 'message_signature': 'c05071d6702786a21acfce26e912183e9aa8354f506752f21f6f3de4f0f16134'}]}, 'timestamp': '2025-11-28 09:42:00.805918', '_unique_id': '4e4cf6ff60a64bb0884ae4bcc5bf2597'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:42:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.806 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.807 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.808 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.808 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '258ea80d-c571-4cd0-9b78-48efa0006e5f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:42:00.807981', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7d24b754-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.842700574, 'message_signature': 'eca16226cca8aefb6db81ba34beaf0aa136400d32fb766710725de53e110a3ca'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:42:00.807981', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7d24c71c-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.842700574, 'message_signature': '075b1d9e137f1fc91a796d761bccb5e6583fb32cb2f18c9eae8dc8c28e639874'}]}, 'timestamp': '2025-11-28 09:42:00.808832', '_unique_id': '786a9e33d5e045979cf832bfc77aecfa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.809 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.810 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.811 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18bfa783-aaaa-4d84-ad3c-4c0ae47aa724', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:42:00.811062', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '7d252ef0-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.860945475, 'message_signature': '7604c64c510f31a6e92ebf85951f6a037eae55f3bd44cb897f25344068675bd7'}]}, 'timestamp': '2025-11-28 09:42:00.811520', '_unique_id': '28bbb5aa682f486cbae6f7d86874fd68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.812 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.813 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.813 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 566 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.814 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7f32572f-b918-402b-be1b-5c7116282906', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 566, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:42:00.813797', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7d259ac0-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.867168707, 'message_signature': 'e9bff0c12ca938a667dd9907508ed7728c053746447e3cc097118ef3b14f49b9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:42:00.813797', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7d25ac9a-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.867168707, 'message_signature': '97d0010aad7cbc4f1cbe3c18902e26c67624e9738286584eabb7b6f422cbc507'}]}, 'timestamp': '2025-11-28 09:42:00.814713', '_unique_id': '67040096a6f143048b27e29cba94d3e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.815 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.816 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.816 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 52.40234375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '971a4b5b-4e4e-4ca9-beaa-0602f4a2408b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.40234375, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:42:00.816888', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '7d26170c-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.971296987, 'message_signature': 'fe36d818741872586e848023535130a1086e8ad8b7fc677acc3604e930d05885'}]}, 'timestamp': '2025-11-28 09:42:00.817462', '_unique_id': '5bab9f13f0d34873981b6b300cdc74a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 
04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.818 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.819 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.819 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.819 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 9441 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85cfd3aa-b732-46db-813e-74119f7f5860', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9441, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:42:00.819795', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '7d268764-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.860945475, 'message_signature': 'b7fa53f9ef7963ba14cae7a6cb5247485f3d1c4f48aebd5680ad761b1d618703'}]}, 'timestamp': '2025-11-28 09:42:00.820387', '_unique_id': '0e3ad8da8cc94bf9bd33021015f6d9f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.821 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.822 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.822 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 12742 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '07886e67-231e-49dd-8abf-98ed5deea5de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 12742, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:42:00.822801', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '7d26fe1a-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.860945475, 'message_signature': '8557ff7eeb67763164e885aedf0d152eb40d6dda3548a930f699df87bc767518'}]}, 'timestamp': '2025-11-28 09:42:00.823382', '_unique_id': '8a232ed842984347b982adab06537e3e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.824 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.825 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.825 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 144 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3037b332-8fcf-4c4f-9cfe-3728a863a9e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 144, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:42:00.825476', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '7d2761ac-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.860945475, 'message_signature': '01499f8464dc08e6c146e4e655092e02fb49227e182e825c28230ba22bed284f'}]}, 'timestamp': '2025-11-28 09:42:00.825926', '_unique_id': '2511ebb1bffa4f7ab3fbcb1b01de7943'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging   File
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.826 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.827 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.828 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 305908425 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.828 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 30452399 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e943ed3b-6a6c-464a-96b0-75a34a704d31', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 305908425, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:42:00.828001', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7d27c5ca-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.867168707, 'message_signature': 'e45a48065a5fd2429393d25101914bade81280059035ac398066e454c6c7d509'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 30452399, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:42:00.828001', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7d27d574-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10754.867168707, 'message_signature': '54a318662de78279463810fd77e0d9584ee284c2191b132dba314091f7e49e74'}]}, 'timestamp': '2025-11-28 09:42:00.828858', '_unique_id': 'ffd126483f7a486ba521a608156f6c3b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:42:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:42:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:42:00.829 12 ERROR oslo_messaging.notify.messaging Nov 28 04:42:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24392 DF PROTO=TCP SPT=57376 DPT=9102 SEQ=2455724692 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB25FD350000000001030307) Nov 28 04:42:02 localhost nova_compute[228333]: 2025-11-28 09:42:02.772 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:42:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24393 DF PROTO=TCP SPT=57376 DPT=9102 SEQ=2455724692 ACK=0 WINDOW=32640 
RES=0x00 SYN URGP=0 OPT (020405500402080ACB2601420000000001030307) Nov 28 04:42:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 04:42:03 localhost podman[261226]: 2025-11-28 09:42:03.848982719 +0000 UTC m=+0.077224477 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 04:42:03 localhost podman[261226]: 2025-11-28 09:42:03.862434366 +0000 UTC m=+0.090676164 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 04:42:03 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 04:42:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41793 DF PROTO=TCP SPT=43694 DPT=9102 SEQ=689482158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2603820000000001030307) Nov 28 04:42:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24394 DF PROTO=TCP SPT=57376 DPT=9102 SEQ=2455724692 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2609430000000001030307) Nov 28 04:42:05 localhost nova_compute[228333]: 2025-11-28 09:42:05.729 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:42:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40562 DF PROTO=TCP SPT=33708 DPT=9102 SEQ=2153988774 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB260D820000000001030307) Nov 28 04:42:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. 
Nov 28 04:42:06 localhost podman[261250]: 2025-11-28 09:42:06.837494262 +0000 UTC m=+0.077079574 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:42:06 localhost podman[261250]: 2025-11-28 09:42:06.847897889 +0000 UTC m=+0.087483211 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 28 04:42:06 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 04:42:07 localhost nova_compute[228333]: 2025-11-28 09:42:07.775 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:42:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24395 DF PROTO=TCP SPT=57376 DPT=9102 SEQ=2455724692 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2619020000000001030307) Nov 28 04:42:10 localhost podman[238687]: time="2025-11-28T09:42:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 04:42:10 localhost podman[238687]: @ - - [28/Nov/2025:09:42:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1" Nov 28 04:42:10 localhost podman[238687]: @ - - [28/Nov/2025:09:42:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17224 "" "Go-http-client/1.1" Nov 28 04:42:10 localhost nova_compute[228333]: 2025-11-28 09:42:10.760 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:42:12 localhost nova_compute[228333]: 2025-11-28 09:42:12.815 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:42:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. 
Nov 28 04:42:15 localhost nova_compute[228333]: 2025-11-28 09:42:15.764 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:42:15 localhost podman[261270]: 2025-11-28 09:42:15.845633973 +0000 UTC m=+0.079104339 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, config_id=edpm, maintainer=Red Hat, Inc., release=1755695350, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, distribution-scope=public, container_name=openstack_network_exporter, vendor=Red Hat, Inc.) 
Nov 28 04:42:15 localhost podman[261270]: 2025-11-28 09:42:15.86032962 +0000 UTC m=+0.093800026 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, name=ubi9-minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers) Nov 28 04:42:15 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 04:42:17 localhost nova_compute[228333]: 2025-11-28 09:42:17.817 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:42:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24396 DF PROTO=TCP SPT=57376 DPT=9102 SEQ=2455724692 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2639820000000001030307) Nov 28 04:42:18 localhost openstack_network_exporter[240658]: ERROR 09:42:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 04:42:18 localhost openstack_network_exporter[240658]: ERROR 09:42:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:42:18 localhost openstack_network_exporter[240658]: ERROR 09:42:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:42:18 localhost openstack_network_exporter[240658]: ERROR 09:42:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 04:42:18 localhost openstack_network_exporter[240658]: Nov 28 04:42:18 localhost openstack_network_exporter[240658]: ERROR 09:42:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 04:42:18 localhost openstack_network_exporter[240658]: Nov 28 04:42:20 localhost nova_compute[228333]: 2025-11-28 09:42:20.794 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:42:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. 
Nov 28 04:42:21 localhost podman[261434]: 2025-11-28 09:42:21.843291626 +0000 UTC m=+0.081878319 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 04:42:21 localhost podman[261434]: 2025-11-28 09:42:21.856979249 +0000 UTC m=+0.095565922 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 04:42:21 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 04:42:22 localhost nova_compute[228333]: 2025-11-28 09:42:22.865 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:42:24 localhost ovn_controller[152322]: 2025-11-28T09:42:24Z|00048|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory Nov 28 04:42:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:42:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:42:25 localhost nova_compute[228333]: 2025-11-28 09:42:25.797 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:42:25 localhost podman[261456]: 2025-11-28 09:42:25.851377438 +0000 UTC m=+0.084276366 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:42:25 localhost podman[261457]: 2025-11-28 09:42:25.907287673 +0000 UTC m=+0.135935582 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:42:25 localhost podman[261457]: 2025-11-28 09:42:25.913592697 +0000 UTC m=+0.142240586 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent) Nov 28 04:42:25 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:42:25 localhost podman[261456]: 2025-11-28 09:42:25.963996343 +0000 UTC m=+0.196895241 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3) Nov 28 04:42:25 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. Nov 28 04:42:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 04:42:27 localhost podman[261498]: 2025-11-28 09:42:27.842013384 +0000 UTC m=+0.079830542 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:42:27 localhost podman[261498]: 2025-11-28 09:42:27.857487466 +0000 UTC m=+0.095304664 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, 
org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:42:27 localhost nova_compute[228333]: 2025-11-28 09:42:27.867 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:42:27 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. Nov 28 04:42:30 localhost nova_compute[228333]: 2025-11-28 09:42:30.829 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:42:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46571 DF PROTO=TCP SPT=50088 DPT=9102 SEQ=3884471758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2672650000000001030307) Nov 28 04:42:32 localhost nova_compute[228333]: 2025-11-28 09:42:32.895 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:42:33 localhost sshd[261517]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:42:33 localhost systemd-logind[764]: New session 59 of user zuul. Nov 28 04:42:33 localhost systemd[1]: Started Session 59 of User zuul. 
Nov 28 04:42:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46572 DF PROTO=TCP SPT=50088 DPT=9102 SEQ=3884471758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2676820000000001030307) Nov 28 04:42:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24397 DF PROTO=TCP SPT=57376 DPT=9102 SEQ=2455724692 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2679820000000001030307) Nov 28 04:42:34 localhost python3.9[261628]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 28 04:42:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. 
Nov 28 04:42:34 localhost podman[261650]: 2025-11-28 09:42:34.854173122 +0000 UTC m=+0.090893271 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 04:42:34 localhost podman[261650]: 2025-11-28 09:42:34.866343397 +0000 UTC m=+0.103063526 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 04:42:34 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 04:42:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46573 DF PROTO=TCP SPT=50088 DPT=9102 SEQ=3884471758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB267E820000000001030307) Nov 28 04:42:35 localhost nova_compute[228333]: 2025-11-28 09:42:35.831 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:42:36 localhost python3.9[261764]: ansible-ansible.builtin.service_facts Invoked Nov 28 04:42:36 localhost network[261781]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 28 04:42:36 localhost network[261782]: 'network-scripts' will be removed from distribution in near future. Nov 28 04:42:36 localhost network[261783]: It is advised to switch to 'NetworkManager' instead for network management. Nov 28 04:42:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41794 DF PROTO=TCP SPT=43694 DPT=9102 SEQ=689482158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2681820000000001030307) Nov 28 04:42:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 04:42:36 localhost systemd[1]: tmp-crun.HffRkA.mount: Deactivated successfully. 
Nov 28 04:42:36 localhost podman[261792]: 2025-11-28 09:42:36.973251217 +0000 UTC m=+0.086735866 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_managed=true, managed_by=edpm_ansible) Nov 28 04:42:36 localhost podman[261792]: 2025-11-28 09:42:36.988413199 +0000 UTC m=+0.101897848 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 28 04:42:37 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. Nov 28 04:42:37 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Nov 28 04:42:37 localhost nova_compute[228333]: 2025-11-28 09:42:37.898 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:42:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46574 DF PROTO=TCP SPT=50088 DPT=9102 SEQ=3884471758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB268E420000000001030307) Nov 28 04:42:40 localhost podman[238687]: time="2025-11-28T09:42:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 04:42:40 localhost podman[238687]: @ - - [28/Nov/2025:09:42:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1" Nov 28 04:42:40 localhost podman[238687]: @ - - [28/Nov/2025:09:42:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17226 "" "Go-http-client/1.1" Nov 28 04:42:40 localhost python3.9[262036]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Nov 28 04:42:40 localhost nova_compute[228333]: 2025-11-28 09:42:40.872 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:42:41 localhost python3.9[262099]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False 
update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 28 04:42:42 localhost nova_compute[228333]: 2025-11-28 09:42:42.938 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:42:45 localhost python3.9[262211]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:42:45 localhost nova_compute[228333]: 2025-11-28 09:42:45.681 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:42:45 localhost nova_compute[228333]: 2025-11-28 09:42:45.874 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:42:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. 
Nov 28 04:42:46 localhost podman[262322]: 2025-11-28 09:42:46.052212502 +0000 UTC m=+0.087073162 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-type=git, name=ubi9-minimal, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Nov 28 04:42:46 localhost podman[262322]: 2025-11-28 09:42:46.066933694 +0000 UTC m=+0.101794374 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': 
True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.expose-services=, vendor=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers) Nov 28 04:42:46 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. Nov 28 04:42:46 localhost python3.9[262321]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:42:46 localhost nova_compute[228333]: 2025-11-28 09:42:46.709 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:42:47 localhost python3.9[262452]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:42:47 localhost nova_compute[228333]: 2025-11-28 09:42:47.681 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:42:47 localhost nova_compute[228333]: 2025-11-28 09:42:47.681 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Nov 28 04:42:47 localhost nova_compute[228333]: 2025-11-28 09:42:47.710 
228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Nov 28 04:42:47 localhost nova_compute[228333]: 2025-11-28 09:42:47.940 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:42:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46575 DF PROTO=TCP SPT=50088 DPT=9102 SEQ=3884471758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB26AF820000000001030307) Nov 28 04:42:48 localhost openstack_network_exporter[240658]: ERROR 09:42:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 04:42:48 localhost openstack_network_exporter[240658]: ERROR 09:42:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:42:48 localhost openstack_network_exporter[240658]: ERROR 09:42:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:42:48 localhost openstack_network_exporter[240658]: ERROR 09:42:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 04:42:48 localhost openstack_network_exporter[240658]: Nov 28 04:42:48 localhost openstack_network_exporter[240658]: ERROR 09:42:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 04:42:48 localhost openstack_network_exporter[240658]: Nov 28 04:42:48 localhost python3.9[262564]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs 
state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:42:50 localhost python3.9[262674]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:42:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:42:50.819 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:42:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:42:50.820 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:42:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:42:50.821 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:42:50 localhost nova_compute[228333]: 2025-11-28 09:42:50.906 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:42:51 localhost python3.9[262786]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:42:51 localhost nova_compute[228333]: 2025-11-28 
09:42:51.711 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:42:51 localhost nova_compute[228333]: 2025-11-28 09:42:51.739 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:42:51 localhost nova_compute[228333]: 2025-11-28 09:42:51.739 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:42:51 localhost nova_compute[228333]: 2025-11-28 09:42:51.739 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:42:51 localhost nova_compute[228333]: 2025-11-28 09:42:51.740 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 04:42:51 localhost nova_compute[228333]: 2025-11-28 09:42:51.740 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df 
--format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:42:52 localhost nova_compute[228333]: 2025-11-28 09:42:52.203 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:42:52 localhost nova_compute[228333]: 2025-11-28 09:42:52.267 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:42:52 localhost nova_compute[228333]: 2025-11-28 09:42:52.267 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:42:52 localhost nova_compute[228333]: 2025-11-28 09:42:52.428 228337 WARNING nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 04:42:52 localhost nova_compute[228333]: 2025-11-28 09:42:52.429 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=12074MB free_disk=41.837093353271484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", 
"product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 04:42:52 localhost nova_compute[228333]: 2025-11-28 09:42:52.429 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:42:52 localhost nova_compute[228333]: 2025-11-28 09:42:52.429 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:42:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. 
Nov 28 04:42:52 localhost podman[262811]: 2025-11-28 09:42:52.550229459 +0000 UTC m=+0.078423179 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 04:42:52 localhost podman[262811]: 2025-11-28 09:42:52.557771125 +0000 UTC m=+0.085964815 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 04:42:52 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 04:42:52 localhost nova_compute[228333]: 2025-11-28 09:42:52.749 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 04:42:52 localhost nova_compute[228333]: 2025-11-28 09:42:52.750 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 04:42:52 localhost nova_compute[228333]: 2025-11-28 09:42:52.750 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 04:42:52 localhost nova_compute[228333]: 2025-11-28 09:42:52.833 228337 DEBUG nova.scheduler.client.report [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Refreshing inventories for resource provider 35fead26-0bad-4950-b646-987079d58a17 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Nov 28 04:42:52 localhost nova_compute[228333]: 2025-11-28 09:42:52.852 228337 DEBUG nova.scheduler.client.report [None 
req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Updating ProviderTree inventory for provider 35fead26-0bad-4950-b646-987079d58a17 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Nov 28 04:42:52 localhost nova_compute[228333]: 2025-11-28 09:42:52.852 228337 DEBUG nova.compute.provider_tree [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Updating inventory in ProviderTree for provider 35fead26-0bad-4950-b646-987079d58a17 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 28 04:42:52 localhost nova_compute[228333]: 2025-11-28 09:42:52.864 228337 DEBUG nova.scheduler.client.report [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Refreshing aggregate associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Nov 28 04:42:52 localhost nova_compute[228333]: 2025-11-28 09:42:52.886 228337 DEBUG nova.scheduler.client.report [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Refreshing trait associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, traits: 
COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_BMI2,HW_CPU_X86_FMA3,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NODE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE4A,COMPUTE_TRUSTED_CERTS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AVX2,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_CLMUL,HW_CPU_X86_ABM,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE41,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_F16C,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_USB _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Nov 28 04:42:52 localhost nova_compute[228333]: 2025-11-28 09:42:52.979 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:42:52 localhost nova_compute[228333]: 2025-11-28 09:42:52.991 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf 
execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:42:53 localhost nova_compute[228333]: 2025-11-28 09:42:53.445 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:42:53 localhost nova_compute[228333]: 2025-11-28 09:42:53.451 228337 DEBUG nova.compute.provider_tree [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 04:42:53 localhost nova_compute[228333]: 2025-11-28 09:42:53.471 228337 DEBUG nova.scheduler.client.report [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 04:42:53 localhost nova_compute[228333]: 2025-11-28 09:42:53.473 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 04:42:53 localhost nova_compute[228333]: 2025-11-28 09:42:53.474 228337 DEBUG oslo_concurrency.lockutils 
[None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.044s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:42:53 localhost python3.9[262962]: ansible-ansible.builtin.service_facts Invoked Nov 28 04:42:53 localhost network[262981]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 28 04:42:53 localhost network[262982]: 'network-scripts' will be removed from distribution in near future. Nov 28 04:42:53 localhost network[262983]: It is advised to switch to 'NetworkManager' instead for network management. Nov 28 04:42:53 localhost nova_compute[228333]: 2025-11-28 09:42:53.880 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:42:53 localhost nova_compute[228333]: 2025-11-28 09:42:53.881 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:42:53 localhost nova_compute[228333]: 2025-11-28 09:42:53.907 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:42:53 localhost nova_compute[228333]: 2025-11-28 09:42:53.907 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:42:53 localhost nova_compute[228333]: 2025-11-28 09:42:53.931 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Triggering sync for uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Nov 28 04:42:53 localhost nova_compute[228333]: 2025-11-28 09:42:53.932 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:42:53 localhost nova_compute[228333]: 2025-11-28 09:42:53.933 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:42:53 localhost nova_compute[228333]: 2025-11-28 09:42:53.984 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.052s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:42:54 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 28 04:42:55 localhost nova_compute[228333]: 2025-11-28 09:42:55.707 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:42:55 localhost nova_compute[228333]: 2025-11-28 09:42:55.707 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 28 04:42:55 localhost nova_compute[228333]: 2025-11-28 09:42:55.707 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 28 04:42:55 localhost nova_compute[228333]: 2025-11-28 09:42:55.908 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 04:42:55 localhost nova_compute[228333]: 2025-11-28 09:42:55.909 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 04:42:55 localhost nova_compute[228333]: 2025-11-28 09:42:55.910 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 28 04:42:55 localhost nova_compute[228333]: 2025-11-28 09:42:55.910 
228337 DEBUG nova.objects.instance [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 04:42:55 localhost nova_compute[228333]: 2025-11-28 09:42:55.912 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:42:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:42:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:42:56 localhost podman[263089]: 2025-11-28 09:42:56.092688567 +0000 UTC m=+0.087720364 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible) Nov 28 04:42:56 localhost podman[263078]: 2025-11-28 09:42:56.0659089 +0000 UTC m=+0.096297785 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:42:56 localhost podman[263089]: 2025-11-28 09:42:56.132681267 +0000 UTC m=+0.127713124 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Nov 28 04:42:56 localhost podman[263078]: 2025-11-28 09:42:56.146765018 +0000 UTC m=+0.177153913 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 
(image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:42:56 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. Nov 28 04:42:56 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 04:42:56 localhost nova_compute[228333]: 2025-11-28 09:42:56.377 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 04:42:56 localhost nova_compute[228333]: 2025-11-28 09:42:56.399 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 04:42:56 localhost nova_compute[228333]: 2025-11-28 09:42:56.399 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] 
Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 28 04:42:56 localhost nova_compute[228333]: 2025-11-28 09:42:56.400 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:42:56 localhost nova_compute[228333]: 2025-11-28 09:42:56.400 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:42:56 localhost nova_compute[228333]: 2025-11-28 09:42:56.681 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:42:56 localhost nova_compute[228333]: 2025-11-28 09:42:56.681 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 28 04:42:57 localhost nova_compute[228333]: 2025-11-28 09:42:57.681 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:42:57 localhost nova_compute[228333]: 2025-11-28 09:42:57.681 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Nov 28 04:42:57 localhost nova_compute[228333]: 2025-11-28 09:42:57.980 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:42:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 04:42:58 localhost systemd[1]: tmp-crun.CFCK2O.mount: Deactivated successfully. 
Nov 28 04:42:58 localhost podman[263261]: 2025-11-28 09:42:58.542715681 +0000 UTC m=+0.082145621 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, container_name=ceilometer_agent_compute) Nov 28 04:42:58 localhost podman[263261]: 2025-11-28 09:42:58.557299618 +0000 UTC m=+0.096729608 container exec_died 
1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 28 04:42:58 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 04:42:58 localhost python3.9[263262]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Nov 28 04:42:59 localhost python3.9[263390]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled Nov 28 04:43:00 localhost python3.9[263500]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:43:00 localhost nova_compute[228333]: 2025-11-28 09:43:00.696 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:43:00 localhost python3.9[263557]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/dm-multipath.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/dm-multipath.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:43:00 localhost nova_compute[228333]: 2025-11-28 09:43:00.946 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:43:01 localhost python3.9[263667]: ansible-ansible.builtin.lineinfile Invoked 
with create=True dest=/etc/modules line=dm-multipath mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:43:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62567 DF PROTO=TCP SPT=54772 DPT=9102 SEQ=1163547485 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB26E7940000000001030307) Nov 28 04:43:02 localhost python3.9[263777]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:43:02 localhost nova_compute[228333]: 2025-11-28 09:43:02.993 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:43:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62568 DF PROTO=TCP SPT=54772 DPT=9102 SEQ=1163547485 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB26EB830000000001030307) Nov 28 04:43:04 localhost python3.9[263887]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:43:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c 
MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46576 DF PROTO=TCP SPT=50088 DPT=9102 SEQ=3884471758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB26EF830000000001030307) Nov 28 04:43:05 localhost python3.9[263999]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:43:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 04:43:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62569 DF PROTO=TCP SPT=54772 DPT=9102 SEQ=1163547485 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB26F3820000000001030307) Nov 28 04:43:05 localhost systemd[1]: tmp-crun.eEca3y.mount: Deactivated successfully. 
Nov 28 04:43:05 localhost podman[264057]: 2025-11-28 09:43:05.49427505 +0000 UTC m=+0.085378507 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 04:43:05 localhost podman[264057]: 2025-11-28 09:43:05.532346157 +0000 UTC m=+0.123449574 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 04:43:05 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 04:43:05 localhost python3.9[264134]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:43:05 localhost nova_compute[228333]: 2025-11-28 09:43:05.949 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:43:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24398 DF PROTO=TCP SPT=57376 DPT=9102 SEQ=2455724692 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB26F7820000000001030307) Nov 28 04:43:06 localhost python3.9[264245]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:43:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 04:43:07 localhost systemd[1]: tmp-crun.lZmpIu.mount: Deactivated successfully. 
Nov 28 04:43:07 localhost podman[264356]: 2025-11-28 09:43:07.663398434 +0000 UTC m=+0.091735525 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible) Nov 28 04:43:07 localhost podman[264356]: 2025-11-28 09:43:07.675293443 +0000 UTC m=+0.103630534 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Nov 28 04:43:07 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 04:43:07 localhost python3.9[264355]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:43:07 localhost nova_compute[228333]: 2025-11-28 09:43:07.996 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:43:08 localhost python3.9[264484]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:43:08 localhost python3.9[264594]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:43:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62570 DF PROTO=TCP SPT=54772 DPT=9102 SEQ=1163547485 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2703420000000001030307) Nov 28 04:43:09 localhost python3.9[264704]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True 
insertafter=^defaults line= user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:43:10 localhost podman[238687]: time="2025-11-28T09:43:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 04:43:10 localhost podman[238687]: @ - - [28/Nov/2025:09:43:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1" Nov 28 04:43:10 localhost podman[238687]: @ - - [28/Nov/2025:09:43:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17228 "" "Go-http-client/1.1" Nov 28 04:43:10 localhost python3.9[264814]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:43:10 localhost nova_compute[228333]: 2025-11-28 09:43:10.986 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:43:11 localhost python3.9[264926]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:43:12 localhost python3.9[265036]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False 
checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:43:13 localhost nova_compute[228333]: 2025-11-28 09:43:13.039 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:43:13 localhost python3.9[265093]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:43:14 localhost python3.9[265203]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:43:14 localhost python3.9[265260]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:43:15 localhost python3.9[265370]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None 
modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:43:15 localhost nova_compute[228333]: 2025-11-28 09:43:15.988 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:43:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 04:43:16 localhost podman[265481]: 2025-11-28 09:43:16.372088176 +0000 UTC m=+0.080080622 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vcs-type=git, managed_by=edpm_ansible, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public) Nov 28 04:43:16 localhost podman[265481]: 2025-11-28 09:43:16.388351869 +0000 UTC m=+0.096344255 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., version=9.6, config_id=edpm) Nov 28 04:43:16 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 04:43:16 localhost python3.9[265480]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:43:16 localhost python3.9[265556]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:43:17 localhost python3.9[265666]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:43:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62571 DF PROTO=TCP SPT=54772 DPT=9102 SEQ=1163547485 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2723830000000001030307) Nov 28 04:43:18 localhost nova_compute[228333]: 2025-11-28 09:43:18.041 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:43:18 localhost openstack_network_exporter[240658]: ERROR 09:43:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 04:43:18 localhost openstack_network_exporter[240658]: ERROR 09:43:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for 
ovn-northd Nov 28 04:43:18 localhost openstack_network_exporter[240658]: ERROR 09:43:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:43:18 localhost openstack_network_exporter[240658]: ERROR 09:43:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 04:43:18 localhost openstack_network_exporter[240658]: Nov 28 04:43:18 localhost openstack_network_exporter[240658]: ERROR 09:43:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 04:43:18 localhost openstack_network_exporter[240658]: Nov 28 04:43:18 localhost python3.9[265723]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:43:19 localhost python3.9[265833]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:43:19 localhost systemd[1]: Reloading. Nov 28 04:43:19 localhost systemd-sysv-generator[265859]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:43:19 localhost systemd-rc-local-generator[265855]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 28 04:43:19 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:43:19 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:43:19 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:43:19 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:43:19 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:43:19 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:43:19 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:43:19 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:43:19 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:43:20 localhost python3.9[265981]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:43:20 localhost python3.9[266038]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:43:21 localhost nova_compute[228333]: 2025-11-28 09:43:21.019 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:43:21 localhost python3.9[266205]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:43:22 localhost python3.9[266274]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:43:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 04:43:22 localhost systemd[1]: tmp-crun.E8SDXQ.mount: Deactivated successfully. 
Nov 28 04:43:22 localhost podman[266382]: 2025-11-28 09:43:22.863505367 +0000 UTC m=+0.087513897 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 04:43:22 localhost podman[266382]: 2025-11-28 09:43:22.874410494 +0000 UTC m=+0.098419024 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 28 04:43:22 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 04:43:23 localhost nova_compute[228333]: 2025-11-28 09:43:23.081 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:43:23 localhost python3.9[266414]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:43:23 localhost systemd[1]: Reloading. Nov 28 04:43:23 localhost systemd-sysv-generator[266454]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:43:23 localhost systemd-rc-local-generator[266451]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:43:23 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:43:23 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:43:23 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:43:23 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:43:23 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 28 04:43:23 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:43:23 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:43:23 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:43:23 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:43:23 localhost systemd[1]: Starting Create netns directory... Nov 28 04:43:23 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Nov 28 04:43:23 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Nov 28 04:43:23 localhost systemd[1]: Finished Create netns directory. Nov 28 04:43:24 localhost python3.9[266577]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:43:25 localhost python3.9[266687]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:43:25 localhost python3.9[266744]: ansible-ansible.legacy.file Invoked with group=zuul mode=0700 owner=zuul setype=container_file_t dest=/var/lib/openstack/healthchecks/multipathd/ _original_basename=healthcheck recurse=False state=file path=/var/lib/openstack/healthchecks/multipathd/ force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:43:26 localhost nova_compute[228333]: 2025-11-28 09:43:26.021 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:43:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:43:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:43:26 localhost systemd[1]: tmp-crun.UdwR2W.mount: Deactivated successfully. Nov 28 04:43:26 localhost podman[266855]: 2025-11-28 09:43:26.571046561 +0000 UTC m=+0.131262480 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 28 04:43:26 localhost podman[266856]: 2025-11-28 09:43:26.542562458 +0000 UTC m=+0.102182838 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Nov 28 04:43:26 localhost podman[266856]: 2025-11-28 09:43:26.623402736 +0000 UTC m=+0.183023086 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 28 04:43:26 localhost podman[266855]: 2025-11-28 09:43:26.63331345 +0000 UTC m=+0.193529369 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 28 04:43:26 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:43:26 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 04:43:26 localhost python3.9[266854]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:43:27 localhost python3.9[267004]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:43:27 localhost python3.9[267061]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/multipathd.json _original_basename=.3pc5sw5c recurse=False state=file path=/var/lib/kolla/config_files/multipathd.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:43:28 localhost nova_compute[228333]: 2025-11-28 09:43:28.082 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:43:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. 
Nov 28 04:43:28 localhost python3.9[267171]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:43:28 localhost podman[267172]: 2025-11-28 09:43:28.859333407 +0000 UTC m=+0.094830687 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:43:28 localhost podman[267172]: 2025-11-28 09:43:28.868216268 +0000 UTC m=+0.103713538 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 28 04:43:28 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. Nov 28 04:43:31 localhost nova_compute[228333]: 2025-11-28 09:43:31.046 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:43:31 localhost python3.9[267467]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False Nov 28 04:43:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5634 DF PROTO=TCP SPT=45426 DPT=9102 SEQ=3421532680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB275CC50000000001030307) Nov 28 04:43:32 localhost python3.9[267577]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 28 04:43:33 localhost nova_compute[228333]: 2025-11-28 09:43:33.117 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:43:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5635 DF PROTO=TCP SPT=45426 DPT=9102 SEQ=3421532680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2760C20000000001030307) Nov 28 04:43:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 
ID=62572 DF PROTO=TCP SPT=54772 DPT=9102 SEQ=1163547485 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2763820000000001030307) Nov 28 04:43:34 localhost python3.9[267687]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Nov 28 04:43:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5636 DF PROTO=TCP SPT=45426 DPT=9102 SEQ=3421532680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2768C20000000001030307) Nov 28 04:43:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 04:43:35 localhost systemd[1]: tmp-crun.Ob0Jyl.mount: Deactivated successfully. Nov 28 04:43:35 localhost podman[267731]: 2025-11-28 09:43:35.857541205 +0000 UTC m=+0.089245304 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 
'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 04:43:35 localhost podman[267731]: 2025-11-28 09:43:35.896580833 +0000 UTC m=+0.128284992 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 04:43:35 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: 
Deactivated successfully. Nov 28 04:43:36 localhost nova_compute[228333]: 2025-11-28 09:43:36.049 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:43:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46577 DF PROTO=TCP SPT=50088 DPT=9102 SEQ=3884471758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB276D830000000001030307) Nov 28 04:43:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 04:43:37 localhost podman[267755]: 2025-11-28 09:43:37.84986616 +0000 UTC m=+0.086187724 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 04:43:37 localhost podman[267755]: 2025-11-28 09:43:37.89046252 +0000 UTC m=+0.126784064 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Nov 28 04:43:37 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. Nov 28 04:43:38 localhost nova_compute[228333]: 2025-11-28 09:43:38.119 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:43:39 localhost python3[267865]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False Nov 28 04:43:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5637 DF PROTO=TCP SPT=45426 DPT=9102 SEQ=3421532680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2778820000000001030307) Nov 28 04:43:39 localhost python3[267865]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "f275b8d168f7f57f31e3da49224019f39f95c80a833f083696a964527b07b54f",#012 "Digest": "sha256:6296d2d95faaeb90443ee98443b39aa81b5152414f9542335d72711bb15fefdd",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-multipathd:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6296d2d95faaeb90443ee98443b39aa81b5152414f9542335d72711bb15fefdd"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-11-26T06:12:42.268223466Z",#012 "Config": {#012 "User": 
"root",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251125",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 249482220,#012 "VirtualSize": 249482220,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/da9f726a106a4f4af24ed404443eca5cd50a43c6e5c864c256f158761c28e938/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/da9f726a106a4f4af24ed404443eca5cd50a43c6e5c864c256f158761c28e938/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",#012 "sha256:1e3477d3ea795ca64b46f28aa9428ba791c4250e0fd05e173a4b9c0fb0bdee23",#012 "sha256:135e1f5eea0bd6ac73fc43c122f58d5ed97cb8a56365c4a958c72d470055986b"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251125",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 
"org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "root",#012 "History": [#012 {#012 "created": "2025-11-25T04:02:36.223494528Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-25T04:02:36.223562059Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251125\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-25T04:02:39.054452717Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-11-26T06:10:57.55004106Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550061231Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550071761Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550082711Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550094371Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550104472Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.937139683Z",#012 "created_by": 
"/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:11:33.845342269Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:11:37.752912815Z",#012 "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:11:38.066850603Z",#012 Nov 28 04:43:40 localhost podman[238687]: time="2025-11-28T09:43:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 04:43:40 localhost podman[238687]: @ - - [28/Nov/2025:09:43:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1" Nov 28 04:43:40 localhost podman[238687]: @ - - [28/Nov/2025:09:43:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17219 "" "Go-http-client/1.1" Nov 28 04:43:40 localhost python3.9[268037]: ansible-ansible.builtin.stat 
Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:43:41 localhost nova_compute[228333]: 2025-11-28 09:43:41.084 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:43:41 localhost python3.9[268149]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:43:41 localhost python3.9[268204]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:43:42 localhost python3.9[268313]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764323021.6238728-1364-218007772854828/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:43:43 localhost nova_compute[228333]: 2025-11-28 09:43:43.155 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:43:43 localhost python3.9[268368]: ansible-systemd Invoked with state=started name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None 
Nov 28 04:43:44 localhost python3.9[268478]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:43:46 localhost python3.9[268588]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:43:46 localhost nova_compute[228333]: 2025-11-28 09:43:46.087 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:43:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. 
Nov 28 04:43:46 localhost podman[268606]: 2025-11-28 09:43:46.844170222 +0000 UTC m=+0.082466192 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7) Nov 28 04:43:46 localhost podman[268606]: 2025-11-28 09:43:46.889524477 +0000 UTC m=+0.127820447 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, version=9.6, config_id=edpm, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.openshift.expose-services=) Nov 28 04:43:46 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 04:43:47 localhost python3.9[268716]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Nov 28 04:43:47 localhost nova_compute[228333]: 2025-11-28 09:43:47.681 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:43:47 localhost python3.9[268826]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled Nov 28 04:43:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5638 DF PROTO=TCP SPT=45426 DPT=9102 SEQ=3421532680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2799820000000001030307) Nov 28 04:43:48 localhost openstack_network_exporter[240658]: ERROR 09:43:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:43:48 localhost openstack_network_exporter[240658]: ERROR 09:43:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:43:48 localhost openstack_network_exporter[240658]: ERROR 09:43:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 04:43:48 localhost openstack_network_exporter[240658]: ERROR 09:43:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 04:43:48 localhost 
openstack_network_exporter[240658]: Nov 28 04:43:48 localhost openstack_network_exporter[240658]: ERROR 09:43:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 04:43:48 localhost openstack_network_exporter[240658]: Nov 28 04:43:48 localhost nova_compute[228333]: 2025-11-28 09:43:48.156 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:43:48 localhost python3.9[268936]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:43:49 localhost python3.9[268993]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/nvme-fabrics.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/nvme-fabrics.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:43:49 localhost python3.9[269103]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:43:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:43:50.820 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 
28 04:43:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:43:50.821 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:43:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:43:50.823 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:43:50 localhost python3.9[269213]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Nov 28 04:43:51 localhost nova_compute[228333]: 2025-11-28 09:43:51.125 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:43:52 localhost nova_compute[228333]: 2025-11-28 09:43:52.683 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:43:52 localhost nova_compute[228333]: 2025-11-28 09:43:52.707 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring 
lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:43:52 localhost nova_compute[228333]: 2025-11-28 09:43:52.708 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:43:52 localhost nova_compute[228333]: 2025-11-28 09:43:52.709 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:43:52 localhost nova_compute[228333]: 2025-11-28 09:43:52.709 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 04:43:52 localhost nova_compute[228333]: 2025-11-28 09:43:52.710 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:43:53 localhost nova_compute[228333]: 2025-11-28 09:43:53.100 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.391s execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:43:53 localhost nova_compute[228333]: 2025-11-28 09:43:53.184 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:43:53 localhost nova_compute[228333]: 2025-11-28 09:43:53.336 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:43:53 localhost nova_compute[228333]: 2025-11-28 09:43:53.337 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:43:53 localhost nova_compute[228333]: 2025-11-28 09:43:53.538 228337 WARNING nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 04:43:53 localhost nova_compute[228333]: 2025-11-28 09:43:53.540 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11972MB free_disk=41.837093353271484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", 
"product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 04:43:53 localhost nova_compute[228333]: 2025-11-28 09:43:53.540 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:43:53 localhost nova_compute[228333]: 2025-11-28 09:43:53.540 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:43:53 localhost nova_compute[228333]: 2025-11-28 09:43:53.748 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 04:43:53 localhost nova_compute[228333]: 2025-11-28 09:43:53.748 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 04:43:53 localhost nova_compute[228333]: 2025-11-28 09:43:53.749 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 04:43:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. 
Nov 28 04:43:53 localhost nova_compute[228333]: 2025-11-28 09:43:53.807 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:43:53 localhost podman[269238]: 2025-11-28 09:43:53.843144754 +0000 UTC m=+0.077918653 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 04:43:53 localhost podman[269238]: 2025-11-28 09:43:53.87937512 +0000 UTC m=+0.114148999 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 04:43:53 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 04:43:54 localhost nova_compute[228333]: 2025-11-28 09:43:54.247 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:43:54 localhost nova_compute[228333]: 2025-11-28 09:43:54.254 228337 DEBUG nova.compute.provider_tree [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 04:43:54 localhost nova_compute[228333]: 2025-11-28 09:43:54.272 228337 DEBUG nova.scheduler.client.report [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider 
/usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 04:43:54 localhost nova_compute[228333]: 2025-11-28 09:43:54.274 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 04:43:54 localhost nova_compute[228333]: 2025-11-28 09:43:54.275 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.734s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:43:54 localhost python3.9[269389]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Nov 28 04:43:55 localhost nova_compute[228333]: 2025-11-28 09:43:55.270 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:43:55 localhost nova_compute[228333]: 2025-11-28 09:43:55.271 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:43:55 localhost nova_compute[228333]: 2025-11-28 09:43:55.681 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 
28 04:43:55 localhost python3.9[269503]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:43:56 localhost nova_compute[228333]: 2025-11-28 09:43:56.128 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:43:56 localhost nova_compute[228333]: 2025-11-28 09:43:56.681 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:43:56 localhost nova_compute[228333]: 2025-11-28 09:43:56.681 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 28 04:43:56 localhost nova_compute[228333]: 2025-11-28 09:43:56.682 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 28 04:43:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:43:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. 
Nov 28 04:43:56 localhost podman[269615]: 2025-11-28 09:43:56.807929615 +0000 UTC m=+0.086026718 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 28 04:43:56 localhost systemd[1]: tmp-crun.q1NC72.mount: Deactivated 
successfully. Nov 28 04:43:56 localhost podman[269614]: 2025-11-28 09:43:56.871794766 +0000 UTC m=+0.150375525 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3) Nov 28 04:43:56 localhost podman[269615]: 2025-11-28 09:43:56.889541658 +0000 UTC m=+0.167638781 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Nov 28 04:43:56 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 04:43:56 localhost podman[269614]: 2025-11-28 09:43:56.914403392 +0000 UTC m=+0.192984151 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible) Nov 28 04:43:56 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. Nov 28 04:43:57 localhost python3.9[269613]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 28 04:43:57 localhost systemd[1]: Reloading. Nov 28 04:43:57 localhost systemd-rc-local-generator[269680]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 28 04:43:57 localhost systemd-sysv-generator[269684]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:43:57 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:43:57 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:43:57 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:43:57 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:43:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 28 04:43:57 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:43:57 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:43:57 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:43:57 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:43:57 localhost nova_compute[228333]: 2025-11-28 09:43:57.528 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 04:43:57 localhost nova_compute[228333]: 2025-11-28 09:43:57.529 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 04:43:57 localhost nova_compute[228333]: 2025-11-28 09:43:57.529 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 28 04:43:57 localhost nova_compute[228333]: 2025-11-28 09:43:57.529 228337 DEBUG nova.objects.instance [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 04:43:57 localhost nova_compute[228333]: 2025-11-28 09:43:57.958 228337 DEBUG 
nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 04:43:57 localhost nova_compute[228333]: 2025-11-28 09:43:57.972 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 04:43:57 localhost nova_compute[228333]: 2025-11-28 09:43:57.973 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 28 04:43:57 localhost nova_compute[228333]: 2025-11-28 09:43:57.973 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:43:58 localhost python3.9[269799]: ansible-ansible.builtin.service_facts Invoked Nov 28 04:43:58 localhost network[269816]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Nov 28 04:43:58 localhost network[269817]: 'network-scripts' will be removed from distribution in near future. Nov 28 04:43:58 localhost network[269818]: It is advised to switch to 'NetworkManager' instead for network management. Nov 28 04:43:58 localhost nova_compute[228333]: 2025-11-28 09:43:58.187 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:43:58 localhost nova_compute[228333]: 2025-11-28 09:43:58.682 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:43:58 localhost nova_compute[228333]: 2025-11-28 09:43:58.682 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 28 04:43:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. 
Nov 28 04:43:59 localhost podman[269829]: 2025-11-28 09:43:59.857791881 +0000 UTC m=+0.091619571 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:43:59 localhost podman[269829]: 2025-11-28 09:43:59.8714893 +0000 UTC m=+0.105316940 container exec_died 
1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 28 04:43:59 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. Nov 28 04:44:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.669 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.670 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.674 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 180 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f3681520-0443-42ae-82ef-63f5ef41d4b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 180, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:44:00.670358', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'c496ea30-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.842253182, 'message_signature': '2bf10f0406032fad95778c1ac565f464c12c5ac702761631d8f46331caf0d6a4'}]}, 'timestamp': '2025-11-28 09:44:00.675199', '_unique_id': 'cd6d995c4b374e58ae10f8ceef0c5160'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:44:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:44:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.676 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.678 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.722 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 305908425 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.723 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 30452399 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f894e56-a5ac-438d-96e3-a95720e37c28', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 305908425, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:44:00.678433', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c49e563a-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.850346147, 'message_signature': '2db20201ee58e9682a4f779ab651af0414031bf04b8f8e603082388c9b423305'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 30452399, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:44:00.678433', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c49e6d78-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.850346147, 'message_signature': '4c80a147b5bb079b7d3014cbf35c10d0bc2bccbd47ca5af4894b530589361b94'}]}, 'timestamp': '2025-11-28 09:44:00.724330', '_unique_id': 'df23fb4d5a1a4875bdc76b9936d65352'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:44:00.725 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:44:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:44:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.725 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.726 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.727 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '77a679b7-0ab1-49ee-81bd-cc57e3b0db30', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:44:00.726995', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'c49ee8fc-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.842253182, 'message_signature': '9b208bad587d4a983038230e97acc9e6fdcd2dbc1cf5b3fd8956c357b9c04080'}]}, 'timestamp': '2025-11-28 09:44:00.727511', '_unique_id': '35e1ba9d9d144d94b8a07afd8877f76d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:44:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:44:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:44:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.728 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.729 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.729 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 12742 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '895c0806-d43b-4a0c-8543-ed065502dc38', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 12742, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:44:00.729672', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'c49f4fcc-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.842253182, 'message_signature': '7b34a2303377bccfc263410b743681b83e8bc1b8ffb01bfd2676b34cccb62910'}]}, 'timestamp': '2025-11-28 09:44:00.730187', '_unique_id': '7fb96e37c74540de9a785b5ac460939c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:44:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:44:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.731 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.732 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.732 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '33c92913-16ec-4256-815d-b51b8a6ce7fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:44:00.732357', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'c49fb8ae-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.842253182, 'message_signature': '3428b4045689ee37f72fb148f80f0cfd0d0002b9b059d58383318c9fe806ca50'}]}, 'timestamp': '2025-11-28 09:44:00.732825', '_unique_id': 'ab618f8657bd48c6935c623cfed63301'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:44:00.733 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:44:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:44:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.733 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.734 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.759 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 50880000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2ccfbca5-e3cd-4ec6-8e20-e49134b203dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 50880000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:44:00.734975', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'c4a3d6dc-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.93104975, 'message_signature': '29f5d648590cd5575c613da41d5eced1a1982a4fd818c17d82cb583f78dfdc16'}]}, 'timestamp': '2025-11-28 09:44:00.759801', '_unique_id': 'c1ce1fd95c10456a83eab74aa6af75f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 
04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:44:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.760 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.761 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 28 04:44:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.775 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.775 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8014d84f-e943-45cc-834d-8ab1c3f3e4a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:44:00.761953', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 
'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c4a64372-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.933881413, 'message_signature': '9bc840e56cb787671ebfbdc75ad440039245138affe36ffe8026a2ee7ef4303d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:44:00.761953', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c4a6542a-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.933881413, 'message_signature': '1d582f3400f71ec6bdfecf08533dfc41bf78f61e14b3e0a364d7f6d7e2d28e04'}]}, 'timestamp': '2025-11-28 09:44:00.776132', '_unique_id': '7cf8530e20ae443080eb33d6f7f52b58'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:44:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.777 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.778 12 INFO ceilometer.polling.manager [-] Polling pollster 
disk.device.usage in the context of pollsters Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.778 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.778 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1fcb7df0-183f-4084-a34d-2c611cbf1963', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:44:00.778308', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c4a6bb4a-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.933881413, 'message_signature': '346cc0d6612f0c9022e63f5dcd32b870c8aa9299ac6ada1a0ae5c69f2fac9da4'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:44:00.778308', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c4a6ccd4-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.933881413, 'message_signature': 'a7a66c0fc4fc9e43d058926b41613adf731443e88de922664f021768ce02d365'}]}, 'timestamp': '2025-11-28 09:44:00.779212', '_unique_id': 'dc174723af8341bd91833ffc53c25dcb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:44:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:44:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.780 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.781 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.781 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.781 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a27fc27-ad17-492b-b445-1697f21675ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:44:00.781358', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c4a7326e-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.933881413, 'message_signature': 'e4bf4d54afa355ac42b92746eb103d7b4d85bae70ae8bd70a40b764dbcf3fedf'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:44:00.781358', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c4a7427c-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.933881413, 'message_signature': 'e94353dc43081b4398a6daeeba92ebc4e8bd8a1e179ea1a97821cf2a37024f14'}]}, 'timestamp': '2025-11-28 09:44:00.782224', '_unique_id': 'cf38a6f63fb9457ebc1da9b9d46cff1e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:44:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:44:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.783 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.784 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.784 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1313024378 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.784 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 168520223 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2307aff1-52c3-4b8b-a252-b22dced9a7be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1313024378, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:44:00.784416', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c4a7a9e2-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.850346147, 'message_signature': 'c1d3fe4eb9f6690f60b197d273222152af4223b41f25b15fb00f81607321baad'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 168520223, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:44:00.784416', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c4a7bb26-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.850346147, 'message_signature': 'c3b440ac4d74a826f387aac76af8bbcedef6e47c19d00fba13d58dcb11f687f0'}]}, 'timestamp': '2025-11-28 09:44:00.785289', '_unique_id': 'd89a09cde2834e929e027ff5d0928a8b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:44:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.786 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.787 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.787 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 144 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '67d19e6b-7141-4213-9c8c-c1c74bf08f6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 144, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:44:00.787639', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'c4a8280e-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.842253182, 'message_signature': '11ec7822c0fe2d089030de5ad6125f2bb6007e3f78447cc2c6375bf6bcf13b93'}]}, 'timestamp': '2025-11-28 09:44:00.788135', '_unique_id': '7b02fab328994d72be9a2cd0cb81df7c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.789 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.790 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.790 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 91 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d80f67c-bede-417d-8843-cf16e965d476', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 91, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:44:00.790222', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'c4a88ccc-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.842253182, 'message_signature': 'ab2cb6e580e42562e73a5b3355baa9cb0566fd1a77fad24c262c53a748d15c7c'}]}, 'timestamp': '2025-11-28 09:44:00.790876', '_unique_id': '7c4e7316010145888c8bf582b8f48ac9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.791 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.792 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.793 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.793 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18029ac1-c6c6-4c8b-8cd3-6f610542eeba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:44:00.792979', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c4a8f9be-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.850346147, 'message_signature': 'b434e829766485e4610c43fffe5db840a02ff1403b99c5122e22a405183370df'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:44:00.792979', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c4a909ae-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.850346147, 'message_signature': 'a944736ffa854ccbe0c4a8508a05f167ce23aa6ba6d1c96ae688aaa02c37abd8'}]}, 'timestamp': '2025-11-28 09:44:00.793845', '_unique_id': '8685079441354bddbef2f8ad7eb7e59d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98,
in get Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:44:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.794 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.795 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.796 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '72548d52-276c-4487-bb5c-01b28b0d321a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:44:00.796110', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'c4a972e0-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.842253182, 'message_signature': '184c7775665b02b1910a3df2cd920b4253bfa74d83c4d02bb067302ceb32a0f0'}]}, 'timestamp': '2025-11-28 09:44:00.796570', '_unique_id': '3fc641693b0543aa9f4e48398d4b695d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:44:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:44:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.797 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.798 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.798 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '16bd6754-9c90-4d27-bbc3-43220673c3c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:44:00.798680', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'c4a9d712-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.842253182, 'message_signature': 'c4662313073a4f387bab679b0dcdaa82fd4eef2b69073243d996774d74089365'}]}, 'timestamp': '2025-11-28 09:44:00.799165', '_unique_id': 'c5f4f20f278a406685df177e70ab521a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:44:00.800 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:44:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:44:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.800 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.801 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.801 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 73981952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.801 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '37df9261-6ca9-4e15-be7e-d603ea7e6db8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73981952, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:44:00.801274', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c4aa3c52-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.850346147, 'message_signature': 'b80ae72e00a2b9d664f172ce2cdf43f7521ed0bbdf1b0dbf1628b64bbc68fc3b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:44:00.801274', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c4aa4c7e-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.850346147, 'message_signature': 'de43b4de1ec1ae929898eb12c0758de7fee6b4adce2a585ea7b798a90ee73b3a'}]}, 'timestamp': '2025-11-28 09:44:00.802139', '_unique_id': 'd69416f51e88481f9fd2b299281d4aa9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.803 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.804 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.804 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.804 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 52.40234375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f640eb14-bea8-4dae-9ae0-1919acf3087c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.40234375, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:44:00.804790', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'c4aac6ea-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.93104975, 'message_signature': 'a71728d0aeb4b58e2bb30facddf52f1f963f9789aaf31612fc0bc122cfd05070'}]}, 'timestamp': '2025-11-28 09:44:00.805281', '_unique_id': 'a0119a9f9190435fbed9dfe4985f2c4b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.806 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.807 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.807 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.807 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0377595-d671-4240-8423-d02f3a110586', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:44:00.807739', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'c4ab38dc-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.842253182, 'message_signature': '885cebf6fd11564aeb74dd5ec3b964d9e41765762e42fa1ee341dbd50187dfdc'}]}, 'timestamp': '2025-11-28 09:44:00.808223', '_unique_id': '1a3cfa32cf9e4ca6b304f3edc4f3cbe8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.809 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.810 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.810 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 9621 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a3c0342d-f1df-4b28-af36-ba265779a962', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9621, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:44:00.810546', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'c4abaa38-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.842253182, 'message_signature': '9b05a9f42070b59a6fc09b0acc65b35835d5ec29f105d7250eaf15c0f8d6982c'}]}, 'timestamp': '2025-11-28 09:44:00.811131', '_unique_id': 'f940f747287344c087d2f3e2fcbbf917'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:44:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:44:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.812 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.813 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.813 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 566 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.813 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '963003f6-262d-4102-a394-56131c7f2126', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 566, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:44:00.813329', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c4ac13a6-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.850346147, 'message_signature': 'bee82a325234a6a44693dfecebbd74c49e2bddbccfe43bdd533aaba7e0112273'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:44:00.813329', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c4ac247c-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.850346147, 'message_signature': '3caaea39f62aedef229bc5e121f850f3f1d19be6d56cf35fb83a4ce3164fdbe6'}]}, 'timestamp': '2025-11-28 09:44:00.814250', '_unique_id': '05533cea7204499e846d7540ad24dd7e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:44:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:44:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.815 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.816 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.816 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'da07d8b8-7d3f-4b1b-91f0-ec2ac8b5f334', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:44:00.816115', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c4ac7d28-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.850346147, 'message_signature': '52c1be4b61344d0711412091919fb607831dc4a023f90d0049c362cf07e7a91e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:44:00.816115', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c4ac898a-cc3e-11f0-a370-fa163eb02593', 'monotonic_time': 10874.850346147, 'message_signature': 'cbb09714ffe6c0ea4c5f61a4770d27407666f57dbbdb97e28cbba7b4025f04d6'}]}, 'timestamp': '2025-11-28 09:44:00.816711', '_unique_id': 'a45cdde572f541a1b0ee2ff53ff91933'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:44:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:44:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 ERROR oslo_messaging.notify.messaging Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.817 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 04:44:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:44:00.818 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 04:44:01 localhost nova_compute[228333]: 2025-11-28 09:44:01.159 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:44:01 localhost 
nova_compute[228333]: 2025-11-28 09:44:01.682 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:44:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15675 DF PROTO=TCP SPT=38844 DPT=9102 SEQ=3389022923 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB27D1F50000000001030307) Nov 28 04:44:03 localhost nova_compute[228333]: 2025-11-28 09:44:03.231 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:44:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15676 DF PROTO=TCP SPT=38844 DPT=9102 SEQ=3389022923 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB27D6020000000001030307) Nov 28 04:44:04 localhost python3.9[270069]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:44:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5639 DF PROTO=TCP SPT=45426 DPT=9102 SEQ=3421532680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB27D9820000000001030307) Nov 28 04:44:04 localhost python3.9[270180]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system 
no_block=False force=None masked=None Nov 28 04:44:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15677 DF PROTO=TCP SPT=38844 DPT=9102 SEQ=3389022923 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB27DE030000000001030307) Nov 28 04:44:05 localhost python3.9[270291]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:44:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 04:44:06 localhost nova_compute[228333]: 2025-11-28 09:44:06.162 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:44:06 localhost systemd[1]: tmp-crun.LxB72V.mount: Deactivated successfully. 
Nov 28 04:44:06 localhost podman[270403]: 2025-11-28 09:44:06.237050689 +0000 UTC m=+0.103965086 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 04:44:06 localhost podman[270403]: 2025-11-28 09:44:06.245903329 +0000 UTC m=+0.112817726 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 04:44:06 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 04:44:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62573 DF PROTO=TCP SPT=54772 DPT=9102 SEQ=1163547485 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB27E1820000000001030307) Nov 28 04:44:06 localhost python3.9[270402]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:44:07 localhost python3.9[270537]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:44:08 localhost python3.9[270648]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:44:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. 
Nov 28 04:44:08 localhost podman[270650]: 2025-11-28 09:44:08.158549034 +0000 UTC m=+0.083476435 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125) Nov 28 04:44:08 localhost podman[270650]: 2025-11-28 09:44:08.17339242 +0000 UTC m=+0.098319851 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.schema-version=1.0) Nov 28 04:44:08 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 04:44:08 localhost nova_compute[228333]: 2025-11-28 09:44:08.234 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:44:08 localhost python3.9[270778]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:44:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15678 DF PROTO=TCP SPT=38844 DPT=9102 SEQ=3389022923 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB27EDC20000000001030307) Nov 28 04:44:09 localhost python3.9[270889]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:44:10 localhost podman[238687]: time="2025-11-28T09:44:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 04:44:10 localhost podman[238687]: @ - - [28/Nov/2025:09:44:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1" Nov 28 04:44:10 localhost podman[238687]: @ - - [28/Nov/2025:09:44:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17216 "" "Go-http-client/1.1" Nov 28 04:44:11 localhost nova_compute[228333]: 2025-11-28 09:44:11.204 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:44:13 localhost nova_compute[228333]: 2025-11-28 09:44:13.274 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:44:13 localhost python3.9[271000]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:44:14 localhost python3.9[271110]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:44:14 localhost python3.9[271220]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:44:15 localhost python3.9[271330]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:44:16 localhost 
nova_compute[228333]: 2025-11-28 09:44:16.205 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:44:16 localhost python3.9[271440]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:44:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 04:44:17 localhost podman[271551]: 2025-11-28 09:44:17.185793117 +0000 UTC m=+0.079047800 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.) 
Nov 28 04:44:17 localhost podman[271551]: 2025-11-28 09:44:17.202301798 +0000 UTC m=+0.095556511 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, release=1755695350, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, vendor=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Nov 28 04:44:17 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. Nov 28 04:44:17 localhost python3.9[271550]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:44:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15679 DF PROTO=TCP SPT=38844 DPT=9102 SEQ=3389022923 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB280D830000000001030307) Nov 28 04:44:18 localhost openstack_network_exporter[240658]: ERROR 09:44:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 04:44:18 localhost openstack_network_exporter[240658]: ERROR 09:44:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for 
ovn-northd Nov 28 04:44:18 localhost openstack_network_exporter[240658]: ERROR 09:44:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:44:18 localhost openstack_network_exporter[240658]: ERROR 09:44:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 04:44:18 localhost openstack_network_exporter[240658]: Nov 28 04:44:18 localhost openstack_network_exporter[240658]: ERROR 09:44:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 04:44:18 localhost openstack_network_exporter[240658]: Nov 28 04:44:18 localhost nova_compute[228333]: 2025-11-28 09:44:18.274 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:44:18 localhost python3.9[271681]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:44:19 localhost python3.9[271791]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:44:19 localhost python3.9[271901]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:44:20 localhost python3.9[272011]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:44:21 localhost python3.9[272121]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:44:21 localhost nova_compute[228333]: 2025-11-28 09:44:21.229 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:44:21 localhost python3.9[272231]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:44:22 localhost python3.9[272341]: ansible-ansible.builtin.file Invoked with 
path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:44:22 localhost python3.9[272487]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:44:23 localhost nova_compute[228333]: 2025-11-28 09:44:23.327 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:44:23 localhost podman[272666]: 2025-11-28 09:44:23.335748204 +0000 UTC m=+0.134199885 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, RELEASE=main, version=7, vendor=Red Hat, Inc., GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, name=rhceph, 
com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, distribution-scope=public, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 28 04:44:23 localhost python3.9[272679]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:44:23 localhost podman[272666]: 2025-11-28 09:44:23.441407015 +0000 UTC m=+0.239858706 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, com.redhat.component=rhceph-container, distribution-scope=public, CEPH_POINT_RELEASE=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, name=rhceph, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, 
GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12) Nov 28 04:44:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 04:44:24 localhost podman[272882]: 2025-11-28 09:44:24.041784616 +0000 UTC m=+0.075356479 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 28 04:44:24 localhost podman[272882]: 2025-11-28 09:44:24.057266373 +0000 UTC m=+0.090838236 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 
'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 04:44:24 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 04:44:24 localhost python3.9[272881]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:44:24 localhost python3.9[273045]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:44:26 localhost nova_compute[228333]: 2025-11-28 09:44:26.232 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:44:26 localhost python3.9[273173]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False 
checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Nov 28 04:44:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:44:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:44:27 localhost podman[273261]: 2025-11-28 09:44:27.856574502 +0000 UTC m=+0.080169757 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:44:27 localhost podman[273263]: 2025-11-28 09:44:27.945778023 +0000 UTC 
m=+0.166328188 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Nov 28 04:44:27 localhost podman[273261]: 2025-11-28 09:44:27.953486885 +0000 UTC m=+0.177082130 container exec_died 
9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Nov 28 04:44:27 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 04:44:28 localhost podman[273263]: 2025-11-28 09:44:28.004520477 +0000 UTC m=+0.225070612 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true) Nov 28 04:44:28 localhost systemd[1]: 
ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:44:28 localhost python3.9[273311]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Nov 28 04:44:28 localhost systemd[1]: Reloading. Nov 28 04:44:28 localhost systemd-rc-local-generator[273346]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:44:28 localhost systemd-sysv-generator[273356]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:44:28 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:44:28 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:44:28 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:44:28 localhost nova_compute[228333]: 2025-11-28 09:44:28.330 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:44:28 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:44:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 28 04:44:28 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:44:28 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:44:28 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:44:28 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:44:29 localhost python3.9[273471]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:44:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 04:44:30 localhost systemd[1]: tmp-crun.0f2JIp.mount: Deactivated successfully. 
Nov 28 04:44:30 localhost podman[273583]: 2025-11-28 09:44:30.26952364 +0000 UTC m=+0.092708297 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Nov 28 04:44:30 localhost podman[273583]: 2025-11-28 09:44:30.283434256 +0000 UTC m=+0.106618923 container exec_died 
1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:44:30 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 04:44:30 localhost python3.9[273582]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:44:31 localhost python3.9[273715]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:44:31 localhost nova_compute[228333]: 2025-11-28 09:44:31.260 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:44:31 localhost python3.9[273826]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:44:32 localhost python3.9[273937]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:44:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7598 DF PROTO=TCP SPT=33082 DPT=9102 SEQ=3782079456 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2847250000000001030307) Nov 28 04:44:32 localhost python3.9[274048]: ansible-ansible.legacy.command Invoked with 
cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:44:33 localhost nova_compute[228333]: 2025-11-28 09:44:33.371 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:44:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7599 DF PROTO=TCP SPT=33082 DPT=9102 SEQ=3782079456 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB284B420000000001030307) Nov 28 04:44:33 localhost python3.9[274159]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:44:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15680 DF PROTO=TCP SPT=38844 DPT=9102 SEQ=3389022923 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB284D820000000001030307) Nov 28 04:44:34 localhost python3.9[274270]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:44:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 
ID=7600 DF PROTO=TCP SPT=33082 DPT=9102 SEQ=3782079456 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2853420000000001030307) Nov 28 04:44:36 localhost nova_compute[228333]: 2025-11-28 09:44:36.264 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:44:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5640 DF PROTO=TCP SPT=45426 DPT=9102 SEQ=3421532680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2857830000000001030307) Nov 28 04:44:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 04:44:36 localhost podman[274343]: 2025-11-28 09:44:36.86779723 +0000 UTC m=+0.105020370 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 
'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 04:44:36 localhost podman[274343]: 2025-11-28 09:44:36.90840491 +0000 UTC m=+0.145628000 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 04:44:36 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated 
successfully. Nov 28 04:44:37 localhost python3.9[274402]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:44:37 localhost python3.9[274512]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:44:38 localhost python3.9[274622]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:44:38 localhost nova_compute[228333]: 2025-11-28 09:44:38.373 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:44:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. 
Nov 28 04:44:38 localhost podman[274732]: 2025-11-28 09:44:38.852140984 +0000 UTC m=+0.084053614 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Nov 28 04:44:38 localhost podman[274732]: 2025-11-28 09:44:38.861228801 +0000 UTC m=+0.093141391 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=multipathd) Nov 28 04:44:38 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 04:44:38 localhost python3.9[274741]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:44:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7601 DF PROTO=TCP SPT=33082 DPT=9102 SEQ=3782079456 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2863020000000001030307) Nov 28 04:44:40 localhost podman[238687]: time="2025-11-28T09:44:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 04:44:40 localhost podman[238687]: @ - - [28/Nov/2025:09:44:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1" Nov 28 04:44:40 localhost podman[238687]: @ - - [28/Nov/2025:09:44:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17223 "" "Go-http-client/1.1" Nov 28 04:44:40 localhost python3.9[274861]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:44:41 localhost nova_compute[228333]: 2025-11-28 09:44:41.301 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:44:41 localhost python3.9[274971]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:44:42 localhost python3.9[275081]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Nov 28 04:44:43 localhost nova_compute[228333]: 2025-11-28 09:44:43.403 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:44:43 localhost python3.9[275191]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Nov 28 04:44:44 localhost python3.9[275301]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None 
access_time=None mode=None seuser=None serole=None selevel=None attributes=None Nov 28 04:44:44 localhost python3.9[275411]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Nov 28 04:44:46 localhost nova_compute[228333]: 2025-11-28 09:44:46.303 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:44:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 04:44:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7602 DF PROTO=TCP SPT=33082 DPT=9102 SEQ=3782079456 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2883820000000001030307) Nov 28 04:44:47 localhost podman[275429]: 2025-11-28 09:44:47.852191507 +0000 UTC m=+0.080073003 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, release=1755695350, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., summary=Provides the 
latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, name=ubi9-minimal, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=) Nov 28 04:44:47 localhost podman[275429]: 2025-11-28 09:44:47.894581596 +0000 UTC m=+0.122463102 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41) Nov 28 04:44:47 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. Nov 28 04:44:48 localhost openstack_network_exporter[240658]: ERROR 09:44:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 04:44:48 localhost openstack_network_exporter[240658]: ERROR 09:44:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:44:48 localhost openstack_network_exporter[240658]: ERROR 09:44:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:44:48 localhost openstack_network_exporter[240658]: ERROR 09:44:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 04:44:48 localhost openstack_network_exporter[240658]: Nov 28 04:44:48 localhost openstack_network_exporter[240658]: ERROR 09:44:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 04:44:48 localhost openstack_network_exporter[240658]: Nov 28 04:44:48 localhost nova_compute[228333]: 2025-11-28 09:44:48.404 228337 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:44:49 localhost nova_compute[228333]: 2025-11-28 09:44:49.681 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:44:50 localhost python3.9[275541]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None Nov 28 04:44:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:44:50.823 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:44:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:44:50.824 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:44:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:44:50.826 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:44:51 localhost nova_compute[228333]: 2025-11-28 09:44:51.334 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:44:51 localhost sshd[275560]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:44:51 localhost systemd-logind[764]: New session 60 of user zuul. 
Nov 28 04:44:51 localhost systemd[1]: Started Session 60 of User zuul. Nov 28 04:44:51 localhost systemd[1]: session-60.scope: Deactivated successfully. Nov 28 04:44:51 localhost systemd-logind[764]: Session 60 logged out. Waiting for processes to exit. Nov 28 04:44:51 localhost systemd-logind[764]: Removed session 60. Nov 28 04:44:53 localhost nova_compute[228333]: 2025-11-28 09:44:53.448 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:44:53 localhost python3.9[275671]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:44:53 localhost nova_compute[228333]: 2025-11-28 09:44:53.676 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:44:54 localhost python3.9[275757]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764323093.0572686-3037-135414968884635/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:44:54 localhost nova_compute[228333]: 2025-11-28 09:44:54.681 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:44:54 localhost nova_compute[228333]: 2025-11-28 09:44:54.700 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:44:54 localhost nova_compute[228333]: 2025-11-28 09:44:54.700 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:44:54 localhost nova_compute[228333]: 2025-11-28 09:44:54.700 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:44:54 localhost nova_compute[228333]: 2025-11-28 09:44:54.701 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 04:44:54 localhost nova_compute[228333]: 2025-11-28 09:44:54.702 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:44:54 localhost systemd[1]: Started /usr/bin/podman 
healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 04:44:54 localhost python3.9[275865]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:44:54 localhost podman[275867]: 2025-11-28 09:44:54.84593431 +0000 UTC m=+0.078574899 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 28 04:44:54 localhost podman[275867]: 2025-11-28 09:44:54.879087446 +0000 UTC m=+0.111727995 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': 
['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 28 04:44:54 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 04:44:55 localhost nova_compute[228333]: 2025-11-28 09:44:55.164 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:44:55 localhost nova_compute[228333]: 2025-11-28 09:44:55.250 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:44:55 localhost nova_compute[228333]: 2025-11-28 09:44:55.251 228337 DEBUG nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:44:55 localhost nova_compute[228333]: 2025-11-28 09:44:55.449 228337 WARNING nova.virt.libvirt.driver [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 04:44:55 localhost nova_compute[228333]: 2025-11-28 09:44:55.451 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=12117MB free_disk=41.837093353271484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", 
"product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 04:44:55 localhost nova_compute[228333]: 2025-11-28 09:44:55.451 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:44:55 localhost nova_compute[228333]: 2025-11-28 09:44:55.451 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:44:55 localhost nova_compute[228333]: 2025-11-28 09:44:55.542 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 04:44:55 localhost nova_compute[228333]: 2025-11-28 09:44:55.543 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 04:44:55 localhost nova_compute[228333]: 2025-11-28 09:44:55.543 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 04:44:55 localhost nova_compute[228333]: 2025-11-28 09:44:55.587 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:44:55 localhost python3.9[275965]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:44:56 localhost nova_compute[228333]: 2025-11-28 09:44:56.080 228337 DEBUG oslo_concurrency.processutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CMD "ceph df --format=json --id openstack --conf 
/etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:44:56 localhost nova_compute[228333]: 2025-11-28 09:44:56.087 228337 DEBUG nova.compute.provider_tree [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 04:44:56 localhost nova_compute[228333]: 2025-11-28 09:44:56.111 228337 DEBUG nova.scheduler.client.report [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 04:44:56 localhost nova_compute[228333]: 2025-11-28 09:44:56.114 228337 DEBUG nova.compute.resource_tracker [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 04:44:56 localhost nova_compute[228333]: 2025-11-28 09:44:56.114 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.663s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:44:56 localhost 
python3.9[276093]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:44:56 localhost nova_compute[228333]: 2025-11-28 09:44:56.336 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:44:56 localhost python3.9[276181]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764323095.7662373-3037-63201533351861/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:44:57 localhost nova_compute[228333]: 2025-11-28 09:44:57.110 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:44:57 localhost nova_compute[228333]: 2025-11-28 09:44:57.143 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:44:57 localhost python3.9[276289]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:44:57 localhost 
nova_compute[228333]: 2025-11-28 09:44:57.680 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:44:57 localhost nova_compute[228333]: 2025-11-28 09:44:57.681 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 28 04:44:57 localhost nova_compute[228333]: 2025-11-28 09:44:57.681 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 28 04:44:57 localhost python3.9[276375]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764323096.9453437-3037-205330200693193/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=534005c01c7af821d962fad87e973f668cecbdc9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:44:58 localhost nova_compute[228333]: 2025-11-28 09:44:58.452 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:44:58 localhost python3.9[276483]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False 
Nov 28 04:44:58 localhost nova_compute[228333]: 2025-11-28 09:44:58.675 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 04:44:58 localhost nova_compute[228333]: 2025-11-28 09:44:58.675 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 04:44:58 localhost nova_compute[228333]: 2025-11-28 09:44:58.676 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 28 04:44:58 localhost nova_compute[228333]: 2025-11-28 09:44:58.676 228337 DEBUG nova.objects.instance [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 04:44:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:44:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. 
Nov 28 04:44:58 localhost podman[276521]: 2025-11-28 09:44:58.863817047 +0000 UTC m=+0.091729832 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Nov 28 04:44:58 localhost podman[276523]: 2025-11-28 09:44:58.917070409 +0000 UTC m=+0.141234449 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible) Nov 28 04:44:58 localhost podman[276523]: 2025-11-28 09:44:58.926282281 +0000 UTC m=+0.150446291 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 
'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 04:44:58 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 04:44:58 localhost podman[276521]: 2025-11-28 09:44:58.982857485 +0000 UTC m=+0.210770250 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 28 04:44:58 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 04:44:59 localhost python3.9[276609]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764323098.084483-3037-210506789598648/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:44:59 localhost nova_compute[228333]: 2025-11-28 09:44:59.230 228337 DEBUG nova.network.neutron [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 04:44:59 localhost nova_compute[228333]: 2025-11-28 09:44:59.260 228337 DEBUG oslo_concurrency.lockutils [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 04:44:59 localhost nova_compute[228333]: 2025-11-28 09:44:59.260 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 28 04:44:59 localhost nova_compute[228333]: 2025-11-28 09:44:59.261 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:44:59 localhost nova_compute[228333]: 2025-11-28 09:44:59.681 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:44:59 localhost nova_compute[228333]: 2025-11-28 09:44:59.681 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:44:59 localhost nova_compute[228333]: 2025-11-28 09:44:59.682 228337 DEBUG nova.compute.manager [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 28 04:44:59 localhost python3.9[276717]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:45:00 localhost python3.9[276803]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764323099.291147-3037-86133503458409/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:45:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. 
Nov 28 04:45:00 localhost podman[276875]: 2025-11-28 09:45:00.846276859 +0000 UTC m=+0.083178440 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm) Nov 28 04:45:00 localhost podman[276875]: 2025-11-28 09:45:00.861271038 +0000 UTC m=+0.098172589 container exec_died 
1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:45:00 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 04:45:01 localhost python3.9[276932]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:45:01 localhost nova_compute[228333]: 2025-11-28 09:45:01.396 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:45:01 localhost python3.9[277042]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:45:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2452 DF PROTO=TCP SPT=34262 DPT=9102 SEQ=3361221625 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB28BC550000000001030307) Nov 28 04:45:02 localhost python3.9[277152]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:45:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2453 DF PROTO=TCP SPT=34262 DPT=9102 SEQ=3361221625 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080ACB28C0430000000001030307) Nov 28 04:45:03 localhost nova_compute[228333]: 2025-11-28 09:45:03.478 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:45:03 localhost python3.9[277264]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:45:03 localhost nova_compute[228333]: 2025-11-28 09:45:03.682 228337 DEBUG oslo_service.periodic_task [None req-2d7d294a-25a6-498f-a454-50761933b8d0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:45:04 localhost python3.9[277372]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:45:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7603 DF PROTO=TCP SPT=33082 DPT=9102 SEQ=3782079456 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB28C3820000000001030307) Nov 28 04:45:04 localhost python3.9[277482]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:45:05 localhost python3.9[277537]: ansible-ansible.legacy.file Invoked with mode=0644 
setype=container_file_t dest=/var/lib/openstack/config/containers/nova_compute.json _original_basename=nova_compute.json.j2 recurse=False state=file path=/var/lib/openstack/config/containers/nova_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:45:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2454 DF PROTO=TCP SPT=34262 DPT=9102 SEQ=3361221625 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB28C8420000000001030307) Nov 28 04:45:05 localhost python3.9[277645]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Nov 28 04:45:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15681 DF PROTO=TCP SPT=38844 DPT=9102 SEQ=3389022923 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB28CB820000000001030307) Nov 28 04:45:06 localhost nova_compute[228333]: 2025-11-28 09:45:06.398 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:45:06 localhost python3.9[277700]: ansible-ansible.legacy.file Invoked with mode=0700 setype=container_file_t dest=/var/lib/openstack/config/containers/nova_compute_init.json _original_basename=nova_compute_init.json.j2 recurse=False state=file path=/var/lib/openstack/config/containers/nova_compute_init.json force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Nov 28 04:45:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 04:45:07 localhost podman[277811]: 2025-11-28 09:45:07.281444301 +0000 UTC m=+0.086959405 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 04:45:07 localhost podman[277811]: 2025-11-28 09:45:07.290181609 +0000 UTC 
m=+0.095696673 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 04:45:07 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 04:45:07 localhost python3.9[277810]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False Nov 28 04:45:08 localhost python3.9[277942]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 28 04:45:08 localhost nova_compute[228333]: 2025-11-28 09:45:08.481 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:45:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 04:45:09 localhost systemd[1]: tmp-crun.kG4RNn.mount: Deactivated successfully. Nov 28 04:45:09 localhost podman[278052]: 2025-11-28 09:45:09.053741534 +0000 UTC m=+0.068292864 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:45:09 localhost podman[278052]: 2025-11-28 09:45:09.06242013 +0000 UTC m=+0.076971500 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 28 04:45:09 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. Nov 28 04:45:09 localhost python3[278053]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False Nov 28 04:45:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2455 DF PROTO=TCP SPT=34262 DPT=9102 SEQ=3361221625 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB28D8020000000001030307) Nov 28 04:45:09 localhost python3[278053]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a",#012 "Digest": "sha256:647f1d5dc1b70ffa3e1832199619d57bfaeceac8823ff53ece64b8e42cc9688e",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:647f1d5dc1b70ffa3e1832199619d57bfaeceac8823ff53ece64b8e42cc9688e"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-11-26T06:36:07.10279245Z",#012 "Config": {#012 "User": "nova",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 
"TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251125",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 1211782527,#012 "VirtualSize": 1211782527,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309/diff:/var/lib/containers/storage/overlay/f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a/diff:/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",#012 "sha256:1e3477d3ea795ca64b46f28aa9428ba791c4250e0fd05e173a4b9c0fb0bdee23",#012 "sha256:c136b33417f134a3b932677bcf7a2df089c29f20eca250129eafd2132d4708bb",#012 "sha256:7913bde445307e7f24767d9149b2e7f498930793ac9f073ccec69b608c009d31",#012 "sha256:084b2323a717fe711217b0ec21da61f4804f7a0d506adae935888421b80809cf"#012 ]#012 },#012 "Labels": 
{#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251125",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "nova",#012 "History": [#012 {#012 "created": "2025-11-25T04:02:36.223494528Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-25T04:02:36.223562059Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251125\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-25T04:02:39.054452717Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-11-26T06:10:57.55004106Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550061231Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550071761Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550082711Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550094371Z",#012 "created_by": "/bin/sh -c #(nop) ENV 
container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550104472Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.937139683Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:11:33.845342269Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 Nov 28 04:45:10 localhost podman[238687]: time="2025-11-28T09:45:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 04:45:10 localhost podman[238687]: @ - - [28/Nov/2025:09:45:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1" Nov 28 04:45:10 localhost podman[238687]: @ - - [28/Nov/2025:09:45:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17227 "" "Go-http-client/1.1" Nov 28 04:45:10 localhost python3.9[278241]: ansible-ansible.builtin.stat Invoked with 
path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:45:11 localhost nova_compute[228333]: 2025-11-28 09:45:11.444 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:45:11 localhost python3.9[278353]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False Nov 28 04:45:12 localhost python3.9[278463]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data Nov 28 04:45:13 localhost python3[278573]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False Nov 28 04:45:13 localhost nova_compute[228333]: 2025-11-28 09:45:13.512 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:45:13 localhost python3[278573]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a",#012 "Digest": "sha256:647f1d5dc1b70ffa3e1832199619d57bfaeceac8823ff53ece64b8e42cc9688e",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:647f1d5dc1b70ffa3e1832199619d57bfaeceac8823ff53ece64b8e42cc9688e"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-11-26T06:36:07.10279245Z",#012 "Config": {#012 "User": "nova",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 
],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251125",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 1211782527,#012 "VirtualSize": 1211782527,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/c3914bdda39f47c0c497a56396d11c84b489b87df2bfd019b00ddced1e1ae309/diff:/var/lib/containers/storage/overlay/f20c3ba929bbb53a84e323dddb8c0eaf3ca74b6729310e964e1fa9eee119e43a/diff:/var/lib/containers/storage/overlay/06a1fa74af6494e3f3865876d25e5a11b62fb12ede8164b96bce734f8d084c66/diff:/var/lib/containers/storage/overlay/cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/f7726cecd9e8969401979ecd2369f385c53efc762aea19175eca5dfbffa00449/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:cf752d9babba20815c6849e3dd587209dffdfbbc56c600ddbc26d05721943ffa",#012 "sha256:1e3477d3ea795ca64b46f28aa9428ba791c4250e0fd05e173a4b9c0fb0bdee23",#012 "sha256:c136b33417f134a3b932677bcf7a2df089c29f20eca250129eafd2132d4708bb",#012 "sha256:7913bde445307e7f24767d9149b2e7f498930793ac9f073ccec69b608c009d31",#012 "sha256:084b2323a717fe711217b0ec21da61f4804f7a0d506adae935888421b80809cf"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": 
"1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251125",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "1f5c0439f2433cb462b222a5bb23e629",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "nova",#012 "History": [#012 {#012 "created": "2025-11-25T04:02:36.223494528Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:cacf1a97b4abfca5db2db22f7ddbca8fd7daa5076a559639c109f09aaf55871d in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-25T04:02:36.223562059Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251125\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-25T04:02:39.054452717Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-11-26T06:10:57.55004106Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550061231Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550071761Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550082711Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550094371Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": 
true#012 },#012 {#012 "created": "2025-11-26T06:10:57.550104472Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:10:57.937139683Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-11-26T06:11:33.845342269Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 Nov 28 04:45:14 localhost python3.9[278747]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:45:15 localhost python3.9[278859]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:45:16 localhost 
python3.9[278968]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764323115.443625-3715-193584405947285/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:45:16 localhost nova_compute[228333]: 2025-11-28 09:45:16.444 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:45:16 localhost python3.9[279023]: ansible-systemd Invoked with state=started name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:45:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2456 DF PROTO=TCP SPT=34262 DPT=9102 SEQ=3361221625 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB28F7830000000001030307) Nov 28 04:45:17 localhost python3.9[279133]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:45:18 localhost openstack_network_exporter[240658]: ERROR 09:45:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:45:18 localhost openstack_network_exporter[240658]: ERROR 09:45:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 04:45:18 localhost openstack_network_exporter[240658]: ERROR 09:45:18 appctl.go:144: Failed to get PID for ovn-northd: no 
control socket files found for ovn-northd Nov 28 04:45:18 localhost openstack_network_exporter[240658]: ERROR 09:45:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 04:45:18 localhost openstack_network_exporter[240658]: Nov 28 04:45:18 localhost openstack_network_exporter[240658]: ERROR 09:45:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 04:45:18 localhost openstack_network_exporter[240658]: Nov 28 04:45:18 localhost nova_compute[228333]: 2025-11-28 09:45:18.514 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:45:18 localhost python3.9[279241]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:45:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. 
Nov 28 04:45:18 localhost podman[279259]: 2025-11-28 09:45:18.847239295 +0000 UTC m=+0.080381535 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, release=1755695350, distribution-scope=public, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Nov 28 04:45:18 localhost podman[279259]: 2025-11-28 09:45:18.865966018 +0000 UTC m=+0.099108288 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, distribution-scope=public, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Nov 28 04:45:18 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 04:45:19 localhost python3.9[279369]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Nov 28 04:45:20 localhost python3.9[279479]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None 
network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Nov 28 04:45:20 localhost systemd-journald[47227]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 119.8 (399 of 333 items), suggesting rotation. Nov 28 04:45:20 localhost systemd-journald[47227]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating. Nov 28 04:45:20 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 28 04:45:20 localhost rsyslogd[759]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 28 04:45:21 localhost nova_compute[228333]: 2025-11-28 09:45:21.446 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:45:21 localhost python3.9[279613]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Nov 28 04:45:22 localhost systemd[1]: Stopping nova_compute container... Nov 28 04:45:22 localhost nova_compute[228333]: 2025-11-28 09:45:22.826 228337 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170#033[00m Nov 28 04:45:23 localhost nova_compute[228333]: 2025-11-28 09:45:23.546 228337 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:45:24 localhost nova_compute[228333]: 2025-11-28 09:45:24.143 228337 WARNING amqp [-] Received method (60, 30) during closing channel 1. 
This method will be ignored#033[00m Nov 28 04:45:24 localhost nova_compute[228333]: 2025-11-28 09:45:24.145 228337 DEBUG oslo_concurrency.lockutils [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 04:45:24 localhost nova_compute[228333]: 2025-11-28 09:45:24.146 228337 DEBUG oslo_concurrency.lockutils [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 04:45:24 localhost nova_compute[228333]: 2025-11-28 09:45:24.146 228337 DEBUG oslo_concurrency.lockutils [None req-f9875077-f19e-49c5-85f3-e0d361fc86d6 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 04:45:24 localhost systemd[1]: libpod-11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf.scope: Deactivated successfully. Nov 28 04:45:24 localhost journal[201490]: End of file while reading data: Input/output error Nov 28 04:45:24 localhost systemd[1]: libpod-11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf.scope: Consumed 20.296s CPU time. 
Nov 28 04:45:24 localhost podman[279617]: 2025-11-28 09:45:24.513381954 +0000 UTC m=+1.757115059 container died 11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, container_name=nova_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:45:24 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf-userdata-shm.mount: Deactivated successfully. Nov 28 04:45:24 localhost systemd[1]: var-lib-containers-storage-overlay-3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c-merged.mount: Deactivated successfully. 
Nov 28 04:45:24 localhost podman[279617]: 2025-11-28 09:45:24.676226093 +0000 UTC m=+1.919959158 container cleanup 11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:45:24 localhost podman[279617]: nova_compute Nov 28 04:45:24 localhost podman[279656]: error opening file `/run/crun/11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf/status`: No such file or directory Nov 28 04:45:24 localhost podman[279643]: 2025-11-28 09:45:24.762132955 +0000 UTC m=+0.055909874 container cleanup 
11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:45:24 localhost podman[279643]: nova_compute Nov 28 04:45:24 localhost systemd[1]: edpm_nova_compute.service: Deactivated successfully. Nov 28 04:45:24 localhost systemd[1]: Stopped nova_compute container. Nov 28 04:45:24 localhost systemd[1]: Starting nova_compute container... Nov 28 04:45:24 localhost systemd[1]: Started libcrun container. 
Nov 28 04:45:24 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Nov 28 04:45:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 04:45:24 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c/merged/etc/nvme supports timestamps until 2038 (0x7fffffff) Nov 28 04:45:24 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Nov 28 04:45:24 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Nov 28 04:45:24 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e6c13305c390aa375458ab2cfcebed7330dd8de5c53d8dba2a4cc295c02b63c/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 28 04:45:24 localhost podman[279658]: 2025-11-28 09:45:24.913878615 +0000 UTC m=+0.120021669 container init 11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', 
'/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 28 04:45:24 localhost podman[279658]: 2025-11-28 09:45:24.924568392 +0000 UTC m=+0.130711426 container start 11fe6a68c81046d9ea285b6d86287149d1dae86b8e949cdf4d5a0683258b28cf (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', 
'/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 28 04:45:24 localhost podman[279658]: nova_compute Nov 28 04:45:24 localhost nova_compute[279673]: + sudo -E kolla_set_configs Nov 28 04:45:24 localhost systemd[1]: Started nova_compute container. Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Validating config file Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Copying service configuration files Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Deleting /etc/nova/nova.conf Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Setting permission for /etc/nova/nova.conf Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Setting permission for 
/etc/nova/nova.conf.d/01-nova.conf Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Nov 28 04:45:24 localhost podman[279676]: 2025-11-28 09:45:24.98355735 +0000 UTC m=+0.069747108 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Nov 28 04:45:24 
localhost nova_compute[279673]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Deleting /etc/ceph Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Creating directory /etc/ceph Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Setting permission for /etc/ceph Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Nov 28 
04:45:24 localhost nova_compute[279673]: INFO:__main__:Deleting /var/lib/nova/.ssh/config Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Deleting /usr/sbin/iscsiadm Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Writing out command to execute Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Nov 28 04:45:24 localhost nova_compute[279673]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Nov 28 04:45:24 localhost nova_compute[279673]: ++ cat /run_command Nov 28 04:45:24 localhost nova_compute[279673]: + CMD=nova-compute Nov 28 04:45:24 localhost nova_compute[279673]: + ARGS= Nov 28 04:45:24 localhost nova_compute[279673]: + sudo kolla_copy_cacerts Nov 28 04:45:25 localhost nova_compute[279673]: + [[ ! -n '' ]] Nov 28 04:45:25 localhost nova_compute[279673]: + . 
kolla_extend_start Nov 28 04:45:25 localhost nova_compute[279673]: + echo 'Running command: '\''nova-compute'\''' Nov 28 04:45:25 localhost nova_compute[279673]: Running command: 'nova-compute' Nov 28 04:45:25 localhost nova_compute[279673]: + umask 0022 Nov 28 04:45:25 localhost nova_compute[279673]: + exec nova-compute Nov 28 04:45:25 localhost podman[279676]: 2025-11-28 09:45:25.01946105 +0000 UTC m=+0.105650868 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 04:45:25 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. 
Nov 28 04:45:26 localhost nova_compute[279673]: 2025-11-28 09:45:26.610 279685 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Nov 28 04:45:26 localhost nova_compute[279673]: 2025-11-28 09:45:26.611 279685 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Nov 28 04:45:26 localhost nova_compute[279673]: 2025-11-28 09:45:26.611 279685 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Nov 28 04:45:26 localhost nova_compute[279673]: 2025-11-28 09:45:26.611 279685 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m Nov 28 04:45:26 localhost nova_compute[279673]: 2025-11-28 09:45:26.738 279685 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:45:26 localhost nova_compute[279673]: 2025-11-28 09:45:26.762 279685 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:45:26 localhost nova_compute[279673]: 2025-11-28 09:45:26.762 279685 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. 
execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.163 279685 INFO nova.virt.driver [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.295 279685 INFO nova.compute.provider_config [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.305 279685 DEBUG oslo_concurrency.lockutils [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.306 279685 DEBUG oslo_concurrency.lockutils [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.306 279685 DEBUG oslo_concurrency.lockutils [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.306 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.307 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ******************************************************************************** log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.307 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.307 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.307 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.307 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.307 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.308 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] arq_binding_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.308 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] backdoor_port = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.308 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] backdoor_socket = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.308 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.308 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.308 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cert = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.309 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.309 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.309 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] config_dir = ['/etc/nova/nova.conf.d'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.309 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.309 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.310 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.310 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] console_host = np0005538513.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.310 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] control_exchange = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.310 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cpu_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.310 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] daemon = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.310 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.311 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.311 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.311 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] default_ephemeral_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.311 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 
localhost nova_compute[279673]: 2025-11-28 09:45:27.311 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] default_schedule_zone = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.312 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] disk_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.312 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] enable_new_services = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.312 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.312 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] enabled_ssl_apis = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.312 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] flat_injected = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.312 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] force_config_drive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.313 279685 DEBUG 
oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] force_raw_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.313 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.313 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.313 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] host = np0005538513.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.313 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] initial_cpu_allocation_ratio = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.314 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] initial_disk_allocation_ratio = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.314 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] initial_ram_allocation_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.314 279685 DEBUG oslo_service.service [None 
req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] injected_network_template = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.314 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.314 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.315 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.315 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] instance_name_template = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.315 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] instance_usage_audit = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.315 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] instance_usage_audit_period = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.315 279685 DEBUG 
oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.316 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.316 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.316 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.316 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] live_migration_retry_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.316 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.316 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.317 279685 DEBUG oslo_service.service [None 
req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.317 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.317 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.317 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.317 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.317 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] log_rotation_type = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.318 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.318 279685 DEBUG 
oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.318 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.318 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.318 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.319 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.319 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.319 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] 
max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.319 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.319 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.319 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] max_logfile_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.320 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] max_logfile_size_mb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.320 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.320 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.320 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] metadata_listen_port = 8775 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.320 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] metadata_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.320 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] migrate_max_retries = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.321 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] mkisofs_cmd = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.321 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] my_block_storage_ip = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.321 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] my_ip = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.321 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] network_allocate_retries = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.321 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.322 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.322 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] osapi_compute_listen_port = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.322 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.322 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] osapi_compute_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.322 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] password_length = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.322 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] periodic_enable = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.323 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 
04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.323 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] pointer_model = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.323 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] preallocate_images = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.323 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.323 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] pybasedir = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.324 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ram_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.324 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.324 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.324 279685 
DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.324 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.324 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.325 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] record = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.325 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] reimage_timeout_per_gb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.325 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] report_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.325 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.325 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] reserved_host_cpus = 0 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.325 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.326 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] reserved_host_memory_mb = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.326 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.326 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] resize_confirm_window = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.326 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.326 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.327 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.327 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] rpc_response_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.327 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.327 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.327 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.328 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.328 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.328 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] service_down_time = 60 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.328 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.328 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.328 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.329 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.329 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.329 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ssl_only = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.329 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] state_path = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost 
nova_compute[279673]: 2025-11-28 09:45:27.329 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] sync_power_state_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.329 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.330 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.330 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] tempdir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.330 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.330 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.330 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.331 279685 DEBUG oslo_service.service [None 
req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] use_cow_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.331 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.331 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.331 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.331 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] use_rootwrap_daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.331 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.332 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.332 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vcpu_pin_set = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.332 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.332 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.332 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.332 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.333 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.333 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.333 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 
04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.333 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_concurrency.lock_path = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.333 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.334 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.334 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.334 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.334 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.334 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.auth_strategy = keystone 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.335 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.335 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.335 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.dhcp_domain = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.335 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.enable_instance_password = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.335 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.335 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.336 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] 
api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.336 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.336 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.336 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.336 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.max_limit = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.337 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.337 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.337 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] 
api.use_forwarded_for = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.337 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.337 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.337 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.338 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.338 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.338 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.338 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] 
api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.338 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.339 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.backend = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.339 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.339 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.config_prefix = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.339 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.dead_timeout = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.339 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.339 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.enable_retry_client = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.340 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.enable_socket_keepalive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.340 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.340 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.340 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.340 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.hashclient_retry_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.341 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.341 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.memcache_password = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.341 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.341 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.341 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.341 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.342 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.memcache_sasl_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.342 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.memcache_servers = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.342 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.memcache_socket_timeout = 1.0 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.342 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.memcache_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.342 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.343 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.343 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.retry_delay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.343 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.socket_keepalive_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.343 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.socket_keepalive_idle = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.343 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.343 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.344 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.344 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.344 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.344 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.344 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.345 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cinder.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.345 279685 DEBUG 
oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.345 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cinder.catalog_info = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.345 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.345 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.345 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.346 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cinder.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.346 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.346 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf 
- - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.346 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cinder.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.346 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.346 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cinder.os_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.347 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cinder.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.347 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.347 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.347 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] compute.cpu_dedicated_set = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.347 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] compute.cpu_shared_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.348 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.348 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.348 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.348 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.348 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.348 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] compute.provider_config_location = 
/etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.349 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.349 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.349 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.349 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] conductor.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.349 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.350 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.350 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] 
console.ssl_minimum_version = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.350 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.350 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.350 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.350 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.351 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.351 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.351 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.endpoint_override = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.351 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.351 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.352 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.352 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.352 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.352 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.352 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost 
nova_compute[279673]: 2025-11-28 09:45:27.352 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.353 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.353 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.353 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.353 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.353 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] cyborg.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.354 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.354 279685 
DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.354 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.354 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.354 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.354 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.355 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.355 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.355 279685 DEBUG oslo_service.service [None 
req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.355 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.355 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.356 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.356 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.356 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.356 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.356 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] 
database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.356 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.357 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.357 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.357 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.357 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.357 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.358 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.connection_debug = 0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.358 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.358 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.358 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.358 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.358 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.359 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.359 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.db_retry_interval = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.359 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.359 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.359 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.359 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.360 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.360 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.360 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.pool_timeout = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.360 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.360 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.361 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.361 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] devices.enabled_mdev_types = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.361 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.361 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.362 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ephemeral_storage_encryption.key_size = 512 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.362 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.api_servers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.362 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.362 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.362 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.362 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.363 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.363 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 
04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.363 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.363 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.363 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.364 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.364 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.364 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.364 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 
2025-11-28 09:45:27.364 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.364 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.365 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.365 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.365 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.365 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.365 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.365 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - 
- - - - -] glance.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.366 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.service_type = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.366 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.366 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.366 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.366 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.367 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.367 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.verify_glance_signatures = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.367 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] glance.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.367 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.367 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.367 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.368 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.368 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.368 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.enable_remotefx = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.368 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.instances_path_share = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.368 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.369 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.369 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.369 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.369 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.369 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.369 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.370 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.use_multipath_io = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.370 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.370 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.370 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.370 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.371 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] mks.enabled = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.371 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.371 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.371 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.371 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.372 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.372 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.372 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] 
image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.372 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.api_max_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.372 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.373 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.373 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.373 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.373 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.373 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.collect_timing = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.373 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.374 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.374 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.374 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.374 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.374 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.375 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 
localhost nova_compute[279673]: 2025-11-28 09:45:27.375 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.375 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.peer_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.375 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.375 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.375 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.376 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.service_type = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.376 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.376 279685 DEBUG 
oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.376 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.376 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.377 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.377 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.377 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] key_manager.backend = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.377 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] key_manager.fixed_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.377 279685 DEBUG oslo_service.service [None 
req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.auth_endpoint = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.377 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.378 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.barbican_endpoint = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.378 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.378 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.barbican_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.378 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.378 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.379 279685 DEBUG oslo_service.service [None 
req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.379 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.379 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.379 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.379 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.retry_delay = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.379 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.380 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.380 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] 
barbican.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.380 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.verify_ssl = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.380 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.380 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.381 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.381 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican_service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.381 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.381 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican_service_user.collect_timing = 
False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.381 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.381 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican_service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.382 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.382 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] barbican_service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.383 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.approle_role_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.383 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.approle_secret_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.383 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.cafile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.383 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.383 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.384 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.384 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.384 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.384 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.384 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost 
nova_compute[279673]: 2025-11-28 09:45:27.385 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.root_token_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.385 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.385 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.385 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.385 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.385 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vault.vault_url = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.386 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.386 279685 DEBUG oslo_service.service [None 
req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.386 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.386 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.386 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.387 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.387 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.387 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.387 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] 
keystone.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.387 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.387 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.388 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.388 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.service_type = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.388 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.388 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.388 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.status_code_retry_delay = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.388 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.389 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.389 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] keystone.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.389 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.connection_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.389 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.cpu_mode = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.389 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.390 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.390 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.390 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.390 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.cpu_power_management = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.390 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.390 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.391 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.device_detach_timeout = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.391 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.disk_cachemodes = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.391 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.disk_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.391 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.391 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.file_backed_memory = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.392 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.gid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.392 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.hw_disk_discard = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.392 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.hw_machine_type = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.392 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.images_rbd_ceph_conf = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost 
nova_compute[279673]: 2025-11-28 09:45:27.392 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.392 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.393 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.393 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.images_rbd_pool = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.393 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.images_type = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.393 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.images_volume_group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.393 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost 
nova_compute[279673]: 2025-11-28 09:45:27.394 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.inject_partition = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.394 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.394 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.iscsi_iface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.394 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.iser_use_multipath = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.394 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.394 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.395 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 
2025-11-28 09:45:27.395 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.395 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.395 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.395 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.395 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.396 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.396 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.396 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.396 279685 WARNING oslo_config.cfg [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 28 04:45:27 localhost nova_compute[279673]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 28 04:45:27 localhost nova_compute[279673]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 28 04:45:27 localhost nova_compute[279673]: and ``live_migration_inbound_addr`` respectively.
Nov 28 04:45:27 localhost nova_compute[279673]: ). Its value may be silently ignored in the future.
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.397 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.live_migration_uri = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.397 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.397 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.397 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.397 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.397 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.398 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.398 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.398 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.398 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.399 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.num_pcie_ports = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.399 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.399 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.pmem_namespaces = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.399 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.quobyte_client_cfg = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.399 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.400 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.400 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.400 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.400 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.rbd_secret_uuid = 2c5417c9-00eb-57d5-a565-ddecbc7995c1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.400 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.rbd_user = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.400 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.401 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.401 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.401 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.rescue_kernel_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.401 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.rescue_ramdisk_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.401 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.402 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.rx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.402 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.smbfs_mount_options = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.402 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.402 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.402 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.403 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.403 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.403 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.swtpm_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.403 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.swtpm_group = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.403 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.403 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.404 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.tx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.404 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.uid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.404 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.404 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.404 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.volume_clear = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.405 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.volume_clear_size = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.405 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.volume_use_multipath = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.405 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.405 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.405 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.405 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.406 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.vzstorage_mount_perms = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.406 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.406 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.406 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.406 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.407 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.407 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.407 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.407 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.407 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.407 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.408 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.default_floating_pool = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.408 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.408 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.408 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.408 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.409 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.409 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.409 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.409 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.409 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.409 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.410 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.410 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.410 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.410 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.410 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.411 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.411 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.411 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.411 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.411 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] neutron.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.411 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.412 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] notifications.default_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.412 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.412 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.412 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.412 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] pci.alias = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.413 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] pci.device_spec = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.413 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] pci.report_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.413 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.413 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.413 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.413 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.414 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.414 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.414 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.417 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.417 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.417 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.418 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.418 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.418 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.419 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.419 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.419 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.419 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.420 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.420 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.420 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.421 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.421 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.421 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.422 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.422 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.service_type = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.422 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.422 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.423 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.423 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.423 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.424 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.424 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.424 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.424 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.425 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.425 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.425 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] placement.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.426 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] quota.cores = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.426 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.426 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] quota.driver = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.427 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.427 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.427 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] quota.injected_files = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.427 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] quota.instances = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.428 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] quota.key_pairs = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.428 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] quota.metadata_items = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.428 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] quota.ram = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.429 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] quota.recheck_quota = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.429 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] quota.server_group_members = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.429 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] quota.server_groups = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost
nova_compute[279673]: 2025-11-28 09:45:27.430 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] rdp.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.430 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.430 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.431 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.431 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.431 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.432 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] scheduler.max_attempts = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 
04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.432 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.432 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.433 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.433 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.433 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.434 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] scheduler.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.434 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = 
None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.434 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.434 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.435 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.435 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.435 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.436 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.436 279685 
DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.436 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.437 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.437 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.437 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.438 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.438 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.max_instances_per_host = 
50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.438 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.438 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.439 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.439 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.439 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.440 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.440 279685 DEBUG oslo_service.service [None 
req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.440 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.440 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.441 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.441 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] metrics.required = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.441 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.442 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 
2025-11-28 09:45:27.442 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] metrics.weight_setting = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.443 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] serial_console.base_url = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.443 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.443 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.444 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.444 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.444 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 
2025-11-28 09:45:27.444 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.445 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] service_user.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.445 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.445 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.446 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.446 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.446 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.447 279685 DEBUG oslo_service.service 
[None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.447 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.447 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.447 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.448 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] spice.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.448 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.449 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] spice.html5proxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.449 279685 DEBUG oslo_service.service [None 
req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] spice.html5proxy_port = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.449 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] spice.image_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.449 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] spice.jpeg_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.450 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] spice.playback_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.450 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.450 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.451 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] spice.streaming_mode = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.451 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - 
- -] spice.zlib_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.451 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] upgrade_levels.baseapi = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.451 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] upgrade_levels.cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.452 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] upgrade_levels.compute = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.452 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.452 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.453 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.453 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vendordata_dynamic_auth.auth_type = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.453 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.454 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.454 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.454 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.455 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.455 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.455 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] 
vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.455 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.456 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.456 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.456 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.457 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.457 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.console_delay_seconds = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.457 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.datastore_regex = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.458 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.host_ip = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.458 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.458 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.458 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.459 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.459 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.459 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.maximum_objects = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.460 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.pbm_default_policy = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.460 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.460 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.460 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.serial_log_dir = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.461 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.461 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.461 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.462 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.462 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.462 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.vnc_port = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.463 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.463 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.463 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.464 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vnc.novncproxy_base_url = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.464 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.464 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vnc.novncproxy_port = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.465 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vnc.server_listen = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.465 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vnc.server_proxyclient_address = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.466 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.466 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.466 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.466 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.467 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.467 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.467 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.468 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.468 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.468 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.469 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.469 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.469 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.469 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.470 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.470 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.470 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.470 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.470 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.470 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.471 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.471 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.471 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.471 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.471 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] wsgi.api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.472 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.472 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.472 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.472 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.472 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.473 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.473 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.473 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.473 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] wsgi.tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.473 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.473 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.474 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.474 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.474 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.474 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.474 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_policy.enforce_scope = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.475 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.475 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.475 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.475 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.475 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.476 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.476 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.476 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.476 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.476 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.477 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.477 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] remote_debug.port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.477 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.477 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.477 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.477 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.478 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.478 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.478 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.478 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.478 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.479 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.479 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.479 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.479 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.479 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.480 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.480 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.480 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.480 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.480 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.480 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.481 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.481 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.481 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.481 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.481 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.482 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.482 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.482 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.482 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.482 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.483 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.483 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.483 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.483 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.483 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.483 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.484 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.484 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.484 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.484 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.484 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.485 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.485 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.485 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.485 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.485 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.485 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.486 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.endpoint_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.486 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.486 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.486 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.486 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.487 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.487 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.487 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.487 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.487 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.487 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.project_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.488 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.488 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.488 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.488 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.488 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.489 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.489 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.system_scope = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.489 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.489 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.489 279685 DEBUG
oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.489 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.490 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.490 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.490 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.490 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_limit.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.490 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.491 279685 DEBUG oslo_service.service [None 
req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.491 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] oslo_reports.log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.491 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.491 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.491 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.492 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.492 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost 
nova_compute[279673]: 2025-11-28 09:45:27.492 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.492 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.492 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vif_plug_ovs_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.493 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.493 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.493 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.493 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] vif_plug_ovs_privileged.user = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.493 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.493 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.494 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_vif_linux_bridge.iptables_bottom_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.494 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.494 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_vif_linux_bridge.iptables_top_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.494 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.494 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] 
os_vif_linux_bridge.use_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.495 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.495 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_vif_ovs.isolate_vif = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.495 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_vif_ovs.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.495 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_vif_ovs.ovs_vsctl_timeout = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.495 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.495 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_vif_ovs.ovsdb_interface = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.496 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] 
os_vif_ovs.per_port_bridge = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.496 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_brick.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.496 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.496 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.496 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] privsep_osbrick.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.497 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] privsep_osbrick.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.497 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.497 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] privsep_osbrick.logger_name = 
os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.497 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.497 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] privsep_osbrick.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.497 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.498 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] nova_sys_admin.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.498 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] nova_sys_admin.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.498 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] nova_sys_admin.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.498 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] 
nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.498 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] nova_sys_admin.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.499 279685 DEBUG oslo_service.service [None req-cf884f0a-6145-4f87-8ac7-be9cc0de7faf - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.500 279685 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.517 279685 INFO nova.virt.node [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Determined node identity 35fead26-0bad-4950-b646-987079d58a17 from /var/lib/nova/compute_id#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.518 279685 DEBUG nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.518 279685 DEBUG nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.519 279685 DEBUG nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Starting connection event dispatch thread initialize 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.519 279685 DEBUG nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.532 279685 DEBUG nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Registering for lifecycle events _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.536 279685 DEBUG nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Registering for connection events: _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.537 279685 INFO nova.virt.libvirt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Connection event '1' reason 'None'#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.542 279685 INFO nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Libvirt host capabilities
[libvirt host capabilities XML; element markup lost in capture, syslog continuation prefixes stripped. Recoverable values: host UUID eb468aed-e0e9-4528-988f-9267a3530b7a; arch x86_64; CPU EPYC-Rome-v4, vendor AMD; migration transports tcp, rdma; memory 16116612 KiB (4029153 pages); security models selinux (doi 0, baselabels system_u:system_r:svirt_t:s0 and system_u:system_r:svirt_tcg_t:s0) and dac (doi 0, baselabels +107:+107); guest archs hvm i686 (wordsize 32) and x86_64 (wordsize 64), emulator /usr/libexec/qemu-kvm, machine types pc-i440fx-rhel7.6.0 (pc), pc-q35-rhel9.8.0 (q35), pc-q35-rhel9.6.0, pc-q35-rhel9.4.0, pc-q35-rhel9.2.0, pc-q35-rhel9.0.0, pc-q35-rhel8.6.0, pc-q35-rhel8.5.0, pc-q35-rhel8.4.0, pc-q35-rhel8.3.0, pc-q35-rhel8.2.0, pc-q35-rhel8.1.0, pc-q35-rhel8.0.0, pc-q35-rhel7.6.0]#033[00m
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.549 279685 DEBUG nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.554 279685 DEBUG nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
[libvirt domain capabilities XML; element markup lost in capture, syslog continuation prefixes stripped. Recoverable values: path /usr/libexec/qemu-kvm; domain kvm; machine pc-q35-rhel9.8.0; arch i686; loader /usr/share/OVMF/OVMF_CODE.secboot.fd, loader types rom and pflash; CPU EPYC-Rome, vendor AMD; named CPU models include 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1, Broadwell-v2, Broadwell-v3, Broadwell-v4, Cascadelake-Server (listing truncated in capture)]
nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Cascadelake-Server-noTSX Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Cascadelake-Server-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Cascadelake-Server-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost 
nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Cascadelake-Server-v3 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Cascadelake-Server-v4 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Cascadelake-Server-v5 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost 
nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Conroe Nov 28 04:45:27 localhost nova_compute[279673]: Conroe-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Cooperlake Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Cooperlake-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 
localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Cooperlake-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Denverton Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Denverton-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Denverton-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost 
nova_compute[279673]: Denverton-v3 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Dhyana Nov 28 04:45:27 localhost nova_compute[279673]: Dhyana-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Dhyana-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Genoa Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: 
Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Genoa-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-IBPB Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Milan Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost 
nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Milan-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Milan-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Rome Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Rome-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Rome-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Rome-v3 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost 
nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Rome-v4 Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-v1 Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-v2 Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-v3 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-v4 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: GraniteRapids Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost 
nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: GraniteRapids-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost 
nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: GraniteRapids-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost 
nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 
04:45:27 localhost nova_compute[279673]: Haswell Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Haswell-IBRS Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Haswell-noTSX Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Haswell-noTSX-IBRS Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Haswell-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Haswell-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 
04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Haswell-v3 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Haswell-v4 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Icelake-Server Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Icelake-Server-noTSX Nov 28 04:45:27 localhost 
Nov 28 04:45:27 localhost nova_compute[279673]: [libvirt domain-capabilities output; the XML markup was lost in the log capture and every element was interleaved with a repeated syslog prefix. Recoverable element text, grouped by the domcapabilities enums they most likely belong to:]
    CPU models: Icelake-Server-v1, Icelake-Server-v2, Icelake-Server-v3, Icelake-Server-v4, Icelake-Server-v5, Icelake-Server-v6, Icelake-Server-v7, IvyBridge, IvyBridge-IBRS, IvyBridge-v1, IvyBridge-v2, KnightsMill, KnightsMill-v1, Nehalem, Nehalem-IBRS, Nehalem-v1, Nehalem-v2, Opteron_G1, Opteron_G1-v1, Opteron_G2, Opteron_G2-v1, Opteron_G3, Opteron_G3-v1, Opteron_G4, Opteron_G4-v1, Opteron_G5, Opteron_G5-v1, Penryn, Penryn-v1, SandyBridge, SandyBridge-IBRS, SandyBridge-v1, SandyBridge-v2, SapphireRapids, SapphireRapids-v1, SapphireRapids-v2, SapphireRapids-v3, SierraForest, SierraForest-v1, Skylake-Client, Skylake-Client-IBRS, Skylake-Client-noTSX-IBRS, Skylake-Client-v1, Skylake-Client-v2, Skylake-Client-v3, Skylake-Client-v4, Skylake-Server, Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4, Skylake-Server-v5, Snowridge, Snowridge-v1, Snowridge-v2, Snowridge-v3, Snowridge-v4, Westmere, Westmere-IBRS, Westmere-v1, Westmere-v2, athlon, athlon-v1, core2duo, core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270, n270-v1, pentium, pentium-v1, pentium2, pentium2-v1, pentium3, pentium3-v1, phenom, phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1
    memory backing source types: file, anonymous, memfd
    disk device types: disk, cdrom, floppy, lun
    disk bus types: fdc, scsi, virtio, usb, sata
    disk models: virtio, virtio-transitional, virtio-non-transitional
    graphics types: vnc, egl-headless, dbus
    hostdev mode: subsystem
    hostdev startup policies: default, mandatory, requisite, optional
    hostdev subsystem types: usb, pci, scsi
    rng device models: virtio, virtio-transitional, virtio-non-transitional
    rng backend models: random, egd, builtin
    filesystem driver types: path, handle, virtiofs
Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: tpm-tis Nov 28 04:45:27 localhost nova_compute[279673]: tpm-crb Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: emulator Nov 28 04:45:27 localhost nova_compute[279673]: external Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: 2.0 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: usb Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: pty Nov 28 04:45:27 localhost nova_compute[279673]: unix Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: qemu Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: builtin Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: default Nov 28 04:45:27 localhost nova_compute[279673]: passt Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: 
Nov 28 04:45:27 localhost nova_compute[279673]: isa Nov 28 04:45:27 localhost nova_compute[279673]: hyperv Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: null Nov 28 04:45:27 localhost nova_compute[279673]: vc Nov 28 04:45:27 localhost nova_compute[279673]: pty Nov 28 04:45:27 localhost nova_compute[279673]: dev Nov 28 04:45:27 localhost nova_compute[279673]: file Nov 28 04:45:27 localhost nova_compute[279673]: pipe Nov 28 04:45:27 localhost nova_compute[279673]: stdio Nov 28 04:45:27 localhost nova_compute[279673]: udp Nov 28 04:45:27 localhost nova_compute[279673]: tcp Nov 28 04:45:27 localhost nova_compute[279673]: unix Nov 28 04:45:27 localhost nova_compute[279673]: qemu-vdagent Nov 28 04:45:27 localhost nova_compute[279673]: dbus Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: relaxed Nov 28 04:45:27 localhost nova_compute[279673]: vapic Nov 28 04:45:27 localhost nova_compute[279673]: spinlocks Nov 28 04:45:27 localhost nova_compute[279673]: vpindex Nov 28 04:45:27 localhost nova_compute[279673]: runtime Nov 28 04:45:27 localhost nova_compute[279673]: synic Nov 28 
04:45:27 localhost nova_compute[279673]: stimer Nov 28 04:45:27 localhost nova_compute[279673]: reset Nov 28 04:45:27 localhost nova_compute[279673]: vendor_id Nov 28 04:45:27 localhost nova_compute[279673]: frequencies Nov 28 04:45:27 localhost nova_compute[279673]: reenlightenment Nov 28 04:45:27 localhost nova_compute[279673]: tlbflush Nov 28 04:45:27 localhost nova_compute[279673]: ipi Nov 28 04:45:27 localhost nova_compute[279673]: avic Nov 28 04:45:27 localhost nova_compute[279673]: emsr_bitmap Nov 28 04:45:27 localhost nova_compute[279673]: xmm_input Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: 4095 Nov 28 04:45:27 localhost nova_compute[279673]: on Nov 28 04:45:27 localhost nova_compute[279673]: off Nov 28 04:45:27 localhost nova_compute[279673]: off Nov 28 04:45:27 localhost nova_compute[279673]: Linux KVM Hv Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: tdx Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.560 279685 DEBUG nova.virt.libvirt.volume.mount [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.564 279685 DEBUG nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Libvirt host 
hypervisor capabilities for arch=i686 and machine_type=pc: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: /usr/libexec/qemu-kvm Nov 28 04:45:27 localhost nova_compute[279673]: kvm Nov 28 04:45:27 localhost nova_compute[279673]: pc-i440fx-rhel7.6.0 Nov 28 04:45:27 localhost nova_compute[279673]: i686 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: /usr/share/OVMF/OVMF_CODE.secboot.fd Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: rom Nov 28 04:45:27 localhost nova_compute[279673]: pflash Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: yes Nov 28 04:45:27 localhost nova_compute[279673]: no Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: no Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: on Nov 28 04:45:27 localhost nova_compute[279673]: off Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: on Nov 28 04:45:27 localhost nova_compute[279673]: off Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost 
nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Rome Nov 28 04:45:27 localhost nova_compute[279673]: AMD Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: 486 Nov 28 04:45:27 localhost nova_compute[279673]: 486-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Broadwell Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Broadwell-IBRS Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 
04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Broadwell-noTSX Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Broadwell-noTSX-IBRS Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Broadwell-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Broadwell-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Broadwell-v3 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Broadwell-v4 Nov 28 04:45:27 localhost 
nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Cascadelake-Server Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Cascadelake-Server-noTSX Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Cascadelake-Server-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost 
nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Cascadelake-Server-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Cascadelake-Server-v3 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Cascadelake-Server-v4 Nov 28 04:45:27 localhost 
nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Cascadelake-Server-v5 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Conroe Nov 28 04:45:27 localhost nova_compute[279673]: Conroe-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Cooperlake Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 
04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Cooperlake-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Cooperlake-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 
04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Denverton Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Denverton-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Denverton-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Denverton-v3 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Dhyana Nov 28 04:45:27 localhost nova_compute[279673]: Dhyana-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Dhyana-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Genoa Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost 
nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Genoa-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost 
nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-IBPB Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Milan Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Milan-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Milan-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 
Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Rome EPYC-Rome-v1 EPYC-Rome-v2 EPYC-Rome-v3 EPYC-Rome-v4 EPYC-v1 EPYC-v2 EPYC-v3 EPYC-v4
Nov 28 04:45:27 localhost nova_compute[279673]: GraniteRapids GraniteRapids-v1 GraniteRapids-v2
Nov 28 04:45:27 localhost nova_compute[279673]: Haswell Haswell-IBRS Haswell-noTSX Haswell-noTSX-IBRS Haswell-v1 Haswell-v2 Haswell-v3 Haswell-v4
Nov 28 04:45:27 localhost nova_compute[279673]: Icelake-Server Icelake-Server-noTSX Icelake-Server-v1 Icelake-Server-v2 Icelake-Server-v3 Icelake-Server-v4 Icelake-Server-v5 Icelake-Server-v6 Icelake-Server-v7
Nov 28 04:45:27 localhost nova_compute[279673]: IvyBridge IvyBridge-IBRS IvyBridge-v1 IvyBridge-v2
Nov 28 04:45:27 localhost nova_compute[279673]: KnightsMill KnightsMill-v1
Nov 28 04:45:27 localhost nova_compute[279673]: Nehalem Nehalem-IBRS Nehalem-v1 Nehalem-v2
Nov 28 04:45:27 localhost nova_compute[279673]: Opteron_G1 Opteron_G1-v1 Opteron_G2 Opteron_G2-v1 Opteron_G3 Opteron_G3-v1 Opteron_G4 Opteron_G4-v1 Opteron_G5 Opteron_G5-v1
Nov 28 04:45:27 localhost nova_compute[279673]: Penryn Penryn-v1
Nov 28 04:45:27 localhost nova_compute[279673]: SandyBridge SandyBridge-IBRS SandyBridge-v1 SandyBridge-v2
Nov 28 04:45:27 localhost nova_compute[279673]: SapphireRapids SapphireRapids-v1 SapphireRapids-v2 SapphireRapids-v3
Nov 28 04:45:27 localhost nova_compute[279673]: SierraForest SierraForest-v1
Nov 28 04:45:27 localhost nova_compute[279673]: Skylake-Client Skylake-Client-IBRS Skylake-Client-noTSX-IBRS Skylake-Client-v1 Skylake-Client-v2 Skylake-Client-v3 Skylake-Client-v4
Nov 28 04:45:27 localhost nova_compute[279673]: Skylake-Server Skylake-Server-IBRS Skylake-Server-noTSX-IBRS Skylake-Server-v1
nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Skylake-Server-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Skylake-Server-v3 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Skylake-Server-v4 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 
28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Skylake-Server-v5 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Snowridge Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Snowridge-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost 
nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Snowridge-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Snowridge-v3 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Snowridge-v4 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Westmere Nov 28 04:45:27 localhost nova_compute[279673]: Westmere-IBRS Nov 28 04:45:27 localhost nova_compute[279673]: Westmere-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Westmere-v2 Nov 28 04:45:27 localhost nova_compute[279673]: athlon Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 
04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: athlon-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: core2duo Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: core2duo-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: coreduo Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: coreduo-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: kvm32 Nov 28 04:45:27 localhost nova_compute[279673]: kvm32-v1 Nov 28 04:45:27 localhost nova_compute[279673]: kvm64 Nov 28 04:45:27 localhost nova_compute[279673]: kvm64-v1 Nov 28 04:45:27 localhost nova_compute[279673]: n270 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: n270-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: pentium Nov 28 04:45:27 localhost nova_compute[279673]: pentium-v1 Nov 28 04:45:27 localhost nova_compute[279673]: pentium2 Nov 28 
04:45:27 localhost nova_compute[279673]: pentium2-v1 Nov 28 04:45:27 localhost nova_compute[279673]: pentium3 Nov 28 04:45:27 localhost nova_compute[279673]: pentium3-v1 Nov 28 04:45:27 localhost nova_compute[279673]: phenom Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: phenom-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: qemu32 Nov 28 04:45:27 localhost nova_compute[279673]: qemu32-v1 Nov 28 04:45:27 localhost nova_compute[279673]: qemu64 Nov 28 04:45:27 localhost nova_compute[279673]: qemu64-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: file Nov 28 04:45:27 localhost nova_compute[279673]: anonymous Nov 28 04:45:27 localhost nova_compute[279673]: memfd Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: disk Nov 28 04:45:27 localhost nova_compute[279673]: cdrom Nov 28 04:45:27 localhost nova_compute[279673]: floppy Nov 28 04:45:27 localhost nova_compute[279673]: lun Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: ide Nov 28 04:45:27 localhost nova_compute[279673]: fdc Nov 28 04:45:27 localhost nova_compute[279673]: scsi Nov 28 04:45:27 localhost 
nova_compute[279673]: virtio Nov 28 04:45:27 localhost nova_compute[279673]: usb Nov 28 04:45:27 localhost nova_compute[279673]: sata Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: virtio Nov 28 04:45:27 localhost nova_compute[279673]: virtio-transitional Nov 28 04:45:27 localhost nova_compute[279673]: virtio-non-transitional Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: vnc Nov 28 04:45:27 localhost nova_compute[279673]: egl-headless Nov 28 04:45:27 localhost nova_compute[279673]: dbus Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: subsystem Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: default Nov 28 04:45:27 localhost nova_compute[279673]: mandatory Nov 28 04:45:27 localhost nova_compute[279673]: requisite Nov 28 04:45:27 localhost nova_compute[279673]: optional Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: usb Nov 28 04:45:27 localhost nova_compute[279673]: pci Nov 28 04:45:27 localhost nova_compute[279673]: scsi Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: 
virtio Nov 28 04:45:27 localhost nova_compute[279673]: virtio-transitional Nov 28 04:45:27 localhost nova_compute[279673]: virtio-non-transitional Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: random Nov 28 04:45:27 localhost nova_compute[279673]: egd Nov 28 04:45:27 localhost nova_compute[279673]: builtin Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: path Nov 28 04:45:27 localhost nova_compute[279673]: handle Nov 28 04:45:27 localhost nova_compute[279673]: virtiofs Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: tpm-tis Nov 28 04:45:27 localhost nova_compute[279673]: tpm-crb Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: emulator Nov 28 04:45:27 localhost nova_compute[279673]: external Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: 2.0 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: usb Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: pty Nov 28 04:45:27 localhost nova_compute[279673]: unix Nov 28 04:45:27 localhost 
nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: qemu Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: builtin Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: default Nov 28 04:45:27 localhost nova_compute[279673]: passt Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: isa Nov 28 04:45:27 localhost nova_compute[279673]: hyperv Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: null Nov 28 04:45:27 localhost nova_compute[279673]: vc Nov 28 04:45:27 localhost nova_compute[279673]: pty Nov 28 04:45:27 localhost nova_compute[279673]: dev Nov 28 04:45:27 localhost nova_compute[279673]: file Nov 28 04:45:27 localhost nova_compute[279673]: pipe Nov 28 04:45:27 localhost nova_compute[279673]: stdio Nov 28 04:45:27 localhost nova_compute[279673]: udp Nov 28 04:45:27 localhost nova_compute[279673]: tcp Nov 28 04:45:27 localhost nova_compute[279673]: unix Nov 28 04:45:27 localhost nova_compute[279673]: qemu-vdagent Nov 28 04:45:27 localhost nova_compute[279673]: dbus Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 
28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: relaxed Nov 28 04:45:27 localhost nova_compute[279673]: vapic Nov 28 04:45:27 localhost nova_compute[279673]: spinlocks Nov 28 04:45:27 localhost nova_compute[279673]: vpindex Nov 28 04:45:27 localhost nova_compute[279673]: runtime Nov 28 04:45:27 localhost nova_compute[279673]: synic Nov 28 04:45:27 localhost nova_compute[279673]: stimer Nov 28 04:45:27 localhost nova_compute[279673]: reset Nov 28 04:45:27 localhost nova_compute[279673]: vendor_id Nov 28 04:45:27 localhost nova_compute[279673]: frequencies Nov 28 04:45:27 localhost nova_compute[279673]: reenlightenment Nov 28 04:45:27 localhost nova_compute[279673]: tlbflush Nov 28 04:45:27 localhost nova_compute[279673]: ipi Nov 28 04:45:27 localhost nova_compute[279673]: avic Nov 28 04:45:27 localhost nova_compute[279673]: emsr_bitmap Nov 28 04:45:27 localhost nova_compute[279673]: xmm_input Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: 4095 Nov 28 04:45:27 localhost nova_compute[279673]: on Nov 28 04:45:27 localhost nova_compute[279673]: off Nov 28 04:45:27 localhost nova_compute[279673]: off Nov 28 04:45:27 localhost nova_compute[279673]: Linux KVM Hv Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 
Nov 28 04:45:27 localhost nova_compute[279673]:   (element name lost): tdx — end of stripped domainCapabilities dump, logged by _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.598 279685 DEBUG nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.603 279685 DEBUG nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 28 04:45:27 localhost nova_compute[279673]: [domainCapabilities dump for q35; XML markup again lost in capture — recoverable values:]
Nov 28 04:45:27 localhost nova_compute[279673]:   path: /usr/libexec/qemu-kvm; domain: kvm; machine: pc-q35-rhel9.8.0; arch: x86_64
Nov 28 04:45:27 localhost nova_compute[279673]:   os firmware: efi; loader values: /usr/share/edk2/ovmf/OVMF_CODE.secboot.fd, /usr/share/edk2/ovmf/OVMF_CODE.fd, /usr/share/edk2/ovmf/OVMF.amdsev.fd, /usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd
Nov 28 04:45:27 localhost nova_compute[279673]:   loader type: rom, pflash; readonly: yes, no; secure: yes, no
Nov 28 04:45:27 localhost nova_compute[279673]:   (element names lost): on, off; on, off
Nov 28 04:45:27 localhost nova_compute[279673]:   host-model CPU: EPYC-Rome, vendor AMD
Nov 28 04:45:27 localhost nova_compute[279673]:   CPU models (custom mode): 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1, Broadwell-v2, Broadwell-v3, Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1, Cascadelake-Server-v2, Cascadelake-Server-v3, Cascadelake-Server-v4, Cascadelake-Server-v5, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1 (list continues)
localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Cooperlake-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Denverton Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Denverton-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 
04:45:27 localhost nova_compute[279673]: Denverton-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Denverton-v3 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Dhyana Nov 28 04:45:27 localhost nova_compute[279673]: Dhyana-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Dhyana-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Genoa Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 
localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Genoa-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-IBPB Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Milan Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 
04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Milan-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Milan-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Rome Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Rome-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Rome-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 
localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Rome-v3 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Rome-v4 Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-v1 Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-v2 Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-v3 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-v4 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: GraniteRapids Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 
localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: GraniteRapids-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost 
nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: GraniteRapids-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost 
nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 
04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Haswell Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Haswell-IBRS Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Haswell-noTSX Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Haswell-noTSX-IBRS Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Haswell-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 
localhost nova_compute[279673]: Haswell-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Haswell-v3 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Haswell-v4 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Icelake-Server Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: 
Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Icelake-Server-noTSX Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Icelake-Server-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost 
nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Icelake-Server-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Icelake-Server-v3 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 
localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Icelake-Server-v4 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Icelake-Server-v5 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 
04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Icelake-Server-v6 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 
Nov 28 04:45:27 localhost nova_compute[279673]: [libvirt <domainCapabilities> XML dump: element markup was lost in log capture; recoverable values follow, grouping labels inferred from the libvirt domainCapabilities schema]
Nov 28 04:45:27 localhost nova_compute[279673]:   cpu models: Icelake-Server-v7, IvyBridge, IvyBridge-IBRS, IvyBridge-v1, IvyBridge-v2, KnightsMill, KnightsMill-v1, Nehalem, Nehalem-IBRS, Nehalem-v1, Nehalem-v2, Opteron_G1, Opteron_G1-v1, Opteron_G2, Opteron_G2-v1, Opteron_G3, Opteron_G3-v1, Opteron_G4, Opteron_G4-v1, Opteron_G5, Opteron_G5-v1, Penryn, Penryn-v1, SandyBridge, SandyBridge-IBRS, SandyBridge-v1, SandyBridge-v2, SapphireRapids, SapphireRapids-v1, SapphireRapids-v2, SapphireRapids-v3, SierraForest, SierraForest-v1, Skylake-Client, Skylake-Client-IBRS, Skylake-Client-noTSX-IBRS, Skylake-Client-v1, Skylake-Client-v2, Skylake-Client-v3, Skylake-Client-v4, Skylake-Server, Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4, Skylake-Server-v5, Snowridge, Snowridge-v1, Snowridge-v2, Snowridge-v3, Snowridge-v4, Westmere, Westmere-IBRS, Westmere-v1, Westmere-v2, athlon, athlon-v1, core2duo, core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270, n270-v1, pentium, pentium-v1, pentium2, pentium2-v1, pentium3, pentium3-v1, phenom, phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1
Nov 28 04:45:27 localhost nova_compute[279673]:   memoryBacking sourceType: file, anonymous, memfd
Nov 28 04:45:27 localhost nova_compute[279673]:   disk diskDevice: disk, cdrom, floppy, lun; bus: fdc, scsi, virtio, usb, sata; model: virtio, virtio-transitional, virtio-non-transitional
Nov 28 04:45:27 localhost nova_compute[279673]:   graphics type: vnc, egl-headless, dbus
Nov 28 04:45:27 localhost nova_compute[279673]:   hostdev mode: subsystem; startupPolicy: default, mandatory, requisite, optional; subsysType: usb, pci, scsi
Nov 28 04:45:27 localhost nova_compute[279673]:   rng model: virtio, virtio-transitional, virtio-non-transitional; backendModel: random, egd, builtin
Nov 28 04:45:27 localhost nova_compute[279673]:   filesystem driverType: path, handle, virtiofs
Nov 28 04:45:27 localhost nova_compute[279673]:   tpm model: tpm-tis, tpm-crb; backendModel: emulator, external; backendVersion: 2.0
Nov 28 04:45:27 localhost nova_compute[279673]:   redirdev bus: usb; channel type: pty, unix; crypto model: qemu; crypto backendModel: builtin
Nov 28 04:45:27 localhost nova_compute[279673]:   interface backendType: default, passt; panic model: isa, hyperv
Nov 28 04:45:27 localhost nova_compute[279673]:   console/serial type: null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus
Nov 28 04:45:27 localhost nova_compute[279673]:   hyperv features: relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input; additional values (labels unrecoverable): 4095, on, off, off, Linux KVM Hv
Nov 28 04:45:27 localhost nova_compute[279673]:   launchSecurity sectype: tdx
Nov 28 04:45:27 localhost nova_compute[279673]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.655 279685 DEBUG nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27
localhost nova_compute[279673]: /usr/libexec/qemu-kvm Nov 28 04:45:27 localhost nova_compute[279673]: kvm Nov 28 04:45:27 localhost nova_compute[279673]: pc-i440fx-rhel7.6.0 Nov 28 04:45:27 localhost nova_compute[279673]: x86_64 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: /usr/share/OVMF/OVMF_CODE.secboot.fd Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: rom Nov 28 04:45:27 localhost nova_compute[279673]: pflash Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: yes Nov 28 04:45:27 localhost nova_compute[279673]: no Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: no Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: on Nov 28 04:45:27 localhost nova_compute[279673]: off Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: on Nov 28 04:45:27 localhost nova_compute[279673]: off Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Rome Nov 28 04:45:27 localhost nova_compute[279673]: AMD Nov 
28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: 486 Nov 28 04:45:27 localhost nova_compute[279673]: 486-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Broadwell Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Broadwell-IBRS Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: 
Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Broadwell-noTSX Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Broadwell-noTSX-IBRS Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Broadwell-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Broadwell-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Broadwell-v3 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Broadwell-v4 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost 
nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Cascadelake-Server Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Cascadelake-Server-noTSX Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Cascadelake-Server-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost 
nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Cascadelake-Server-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Cascadelake-Server-v3 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Cascadelake-Server-v4 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost 
nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Cascadelake-Server-v5 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Conroe Nov 28 04:45:27 localhost nova_compute[279673]: Conroe-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Cooperlake Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 
04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Cooperlake-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Cooperlake-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 
04:45:27 localhost nova_compute[279673]: Denverton Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Denverton-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Denverton-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Denverton-v3 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Dhyana Nov 28 04:45:27 localhost nova_compute[279673]: Dhyana-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Dhyana-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Genoa Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost 
nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Genoa-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost 
nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-IBPB Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Milan Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Milan-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Milan-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: 
EPYC-Rome Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Rome-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Rome-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Rome-v3 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-Rome-v4 Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-v1 Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-v2 Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-v3 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: EPYC-v4 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: GraniteRapids Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 
localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: GraniteRapids-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost 
nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 
04:45:27 localhost nova_compute[279673]: GraniteRapids-v2 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 
Nov 28 04:45:27 localhost nova_compute[279673]: Haswell Haswell-IBRS Haswell-noTSX Haswell-noTSX-IBRS Haswell-v1 Haswell-v2 Haswell-v3 Haswell-v4 Icelake-Server Icelake-Server-noTSX Icelake-Server-v1 Icelake-Server-v2 Icelake-Server-v3 Icelake-Server-v4 Icelake-Server-v5 Icelake-Server-v6 Icelake-Server-v7 IvyBridge IvyBridge-IBRS IvyBridge-v1 IvyBridge-v2 KnightsMill KnightsMill-v1 Nehalem Nehalem-IBRS Nehalem-v1 Nehalem-v2 Opteron_G1 Opteron_G1-v1 Opteron_G2 Opteron_G2-v1 Opteron_G3 Opteron_G3-v1 Opteron_G4 Opteron_G4-v1 Opteron_G5 Opteron_G5-v1 Penryn Penryn-v1 SandyBridge SandyBridge-IBRS SandyBridge-v1 SandyBridge-v2 SapphireRapids SapphireRapids-v1 SapphireRapids-v2 SapphireRapids-v3 SierraForest SierraForest-v1 Skylake-Client Skylake-Client-IBRS Skylake-Client-noTSX-IBRS Skylake-Client-v1 Skylake-Client-v2 Skylake-Client-v3 Skylake-Client-v4 Skylake-Server Skylake-Server-IBRS Skylake-Server-noTSX-IBRS Skylake-Server-v1 Skylake-Server-v2 Skylake-Server-v3 Skylake-Server-v4 Skylake-Server-v5 Snowridge Snowridge-v1 Snowridge-v2 Snowridge-v3 Snowridge-v4 Westmere Westmere-IBRS Westmere-v1 Westmere-v2 athlon athlon-v1 core2duo core2duo-v1 coreduo coreduo-v1 kvm32 kvm32-v1 kvm64 kvm64-v1 n270 n270-v1 pentium pentium-v1 pentium2 pentium2-v1
pentium3 Nov 28 04:45:27 localhost nova_compute[279673]: pentium3-v1 Nov 28 04:45:27 localhost nova_compute[279673]: phenom Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: phenom-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: qemu32 Nov 28 04:45:27 localhost nova_compute[279673]: qemu32-v1 Nov 28 04:45:27 localhost nova_compute[279673]: qemu64 Nov 28 04:45:27 localhost nova_compute[279673]: qemu64-v1 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: file Nov 28 04:45:27 localhost nova_compute[279673]: anonymous Nov 28 04:45:27 localhost nova_compute[279673]: memfd Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: disk Nov 28 04:45:27 localhost nova_compute[279673]: cdrom Nov 28 04:45:27 localhost nova_compute[279673]: floppy Nov 28 04:45:27 localhost nova_compute[279673]: lun Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: ide Nov 28 04:45:27 localhost nova_compute[279673]: fdc Nov 28 04:45:27 localhost nova_compute[279673]: scsi Nov 28 04:45:27 localhost nova_compute[279673]: virtio Nov 28 04:45:27 localhost nova_compute[279673]: usb Nov 28 04:45:27 localhost 
nova_compute[279673]: sata Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: virtio Nov 28 04:45:27 localhost nova_compute[279673]: virtio-transitional Nov 28 04:45:27 localhost nova_compute[279673]: virtio-non-transitional Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: vnc Nov 28 04:45:27 localhost nova_compute[279673]: egl-headless Nov 28 04:45:27 localhost nova_compute[279673]: dbus Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: subsystem Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: default Nov 28 04:45:27 localhost nova_compute[279673]: mandatory Nov 28 04:45:27 localhost nova_compute[279673]: requisite Nov 28 04:45:27 localhost nova_compute[279673]: optional Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: usb Nov 28 04:45:27 localhost nova_compute[279673]: pci Nov 28 04:45:27 localhost nova_compute[279673]: scsi Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: virtio Nov 28 04:45:27 localhost nova_compute[279673]: virtio-transitional Nov 28 04:45:27 localhost 
nova_compute[279673]: virtio-non-transitional Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: random Nov 28 04:45:27 localhost nova_compute[279673]: egd Nov 28 04:45:27 localhost nova_compute[279673]: builtin Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: path Nov 28 04:45:27 localhost nova_compute[279673]: handle Nov 28 04:45:27 localhost nova_compute[279673]: virtiofs Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: tpm-tis Nov 28 04:45:27 localhost nova_compute[279673]: tpm-crb Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: emulator Nov 28 04:45:27 localhost nova_compute[279673]: external Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: 2.0 Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: usb Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: pty Nov 28 04:45:27 localhost nova_compute[279673]: unix Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost 
nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: qemu Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: builtin Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: default Nov 28 04:45:27 localhost nova_compute[279673]: passt Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: isa Nov 28 04:45:27 localhost nova_compute[279673]: hyperv Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: null Nov 28 04:45:27 localhost nova_compute[279673]: vc Nov 28 04:45:27 localhost nova_compute[279673]: pty Nov 28 04:45:27 localhost nova_compute[279673]: dev Nov 28 04:45:27 localhost nova_compute[279673]: file Nov 28 04:45:27 localhost nova_compute[279673]: pipe Nov 28 04:45:27 localhost nova_compute[279673]: stdio Nov 28 04:45:27 localhost nova_compute[279673]: udp Nov 28 04:45:27 localhost nova_compute[279673]: tcp Nov 28 04:45:27 localhost nova_compute[279673]: unix Nov 28 04:45:27 localhost nova_compute[279673]: qemu-vdagent Nov 28 04:45:27 localhost nova_compute[279673]: dbus Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 
28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: relaxed Nov 28 04:45:27 localhost nova_compute[279673]: vapic Nov 28 04:45:27 localhost nova_compute[279673]: spinlocks Nov 28 04:45:27 localhost nova_compute[279673]: vpindex Nov 28 04:45:27 localhost nova_compute[279673]: runtime Nov 28 04:45:27 localhost nova_compute[279673]: synic Nov 28 04:45:27 localhost nova_compute[279673]: stimer Nov 28 04:45:27 localhost nova_compute[279673]: reset Nov 28 04:45:27 localhost nova_compute[279673]: vendor_id Nov 28 04:45:27 localhost nova_compute[279673]: frequencies Nov 28 04:45:27 localhost nova_compute[279673]: reenlightenment Nov 28 04:45:27 localhost nova_compute[279673]: tlbflush Nov 28 04:45:27 localhost nova_compute[279673]: ipi Nov 28 04:45:27 localhost nova_compute[279673]: avic Nov 28 04:45:27 localhost nova_compute[279673]: emsr_bitmap Nov 28 04:45:27 localhost nova_compute[279673]: xmm_input Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: 4095 Nov 28 04:45:27 localhost nova_compute[279673]: on Nov 28 04:45:27 localhost nova_compute[279673]: off Nov 28 04:45:27 localhost nova_compute[279673]: off Nov 28 04:45:27 localhost nova_compute[279673]: Linux KVM Hv Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: tdx Nov 28 
04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: Nov 28 04:45:27 localhost nova_compute[279673]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.713 279685 DEBUG nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.714 279685 INFO nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Secure Boot support detected#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.716 279685 INFO nova.virt.libvirt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.717 279685 INFO nova.virt.libvirt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.732 279685 DEBUG nova.virt.libvirt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.761 279685 INFO nova.virt.node [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Determined node identity 35fead26-0bad-4950-b646-987079d58a17 from 
/var/lib/nova/compute_id#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.781 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Verified node 35fead26-0bad-4950-b646-987079d58a17 matches my host np0005538513.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.820 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.825 279685 DEBUG nova.virt.libvirt.vif [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T08:32:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=,hidden=False,host='np0005538513.localdomain',hostname='test',id=2,image_ref='391767f1-35f2-4b68-ae15-e0b29db66dcb',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-11-28T08:33:06Z,launched_on='np0005538513.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=,node='np0005538513.localdomain',numa_topology=None,old_flavor=,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='9dda653c53224db086060962b0702694',ramdisk_id='',reservation_id='r-a3c307c0',resources=,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown
_terminate=False,system_metadata=,tags=,task_state=None,terminated_at=None,trusted_certs=,updated_at=2025-11-28T08:33:07Z,user_data=None,user_id='4d9169247d4447d0a8dd4c33f8b23dee',uuid=c2f0c7d6-df5f-4541-8b2c-bc1eaf805812,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.825 279685 DEBUG nova.network.os_vif_util [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Converting VIF {"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.826 279685 DEBUG nova.network.os_vif_util [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.827 279685 DEBUG os_vif [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.902 
279685 DEBUG ovsdbapp.backend.ovs_idl [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.902 279685 DEBUG ovsdbapp.backend.ovs_idl [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.902 279685 DEBUG ovsdbapp.backend.ovs_idl [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.903 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.903 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.903 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.904 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:45:27 localhost 
nova_compute[279673]: 2025-11-28 09:45:27.905 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.908 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.924 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.924 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.924 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 28 04:45:27 localhost nova_compute[279673]: 2025-11-28 09:45:27.926 279685 INFO oslo.privsep.daemon [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpboyt0avz/privsep.sock']#033[00m Nov 28 04:45:28 localhost nova_compute[279673]: 2025-11-28 09:45:28.519 279685 INFO oslo.privsep.daemon [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Spawned new privsep daemon 
via rootwrap#033[00m Nov 28 04:45:28 localhost nova_compute[279673]: 2025-11-28 09:45:28.426 279933 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Nov 28 04:45:28 localhost nova_compute[279673]: 2025-11-28 09:45:28.431 279933 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Nov 28 04:45:28 localhost nova_compute[279673]: 2025-11-28 09:45:28.434 279933 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m Nov 28 04:45:28 localhost nova_compute[279673]: 2025-11-28 09:45:28.434 279933 INFO oslo.privsep.daemon [-] privsep daemon running as pid 279933#033[00m Nov 28 04:45:28 localhost python3.9[279932]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None 
health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Nov 28 04:45:28 localhost nova_compute[279673]: 2025-11-28 09:45:28.776 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:45:28 localhost nova_compute[279673]: 2025-11-28 09:45:28.777 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09612b07-51, may_exist=True) do_commit 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 04:45:28 localhost nova_compute[279673]: 2025-11-28 09:45:28.777 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap09612b07-51, col_values=(('external_ids', {'iface-id': '09612b07-5142-4b0f-9dab-74bf4403f69f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f4:fc:6c', 'vm-uuid': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 04:45:28 localhost nova_compute[279673]: 2025-11-28 09:45:28.777 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 28 04:45:28 localhost nova_compute[279673]: 2025-11-28 09:45:28.778 279685 INFO os_vif [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51')#033[00m Nov 28 04:45:28 localhost nova_compute[279673]: 2025-11-28 09:45:28.778 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 28 04:45:28 localhost nova_compute[279673]: 2025-11-28 09:45:28.782 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Current state is 1, state in DB is 1. 
_init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304#033[00m Nov 28 04:45:28 localhost nova_compute[279673]: 2025-11-28 09:45:28.782 279685 INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m Nov 28 04:45:28 localhost nova_compute[279673]: 2025-11-28 09:45:28.896 279685 DEBUG oslo_concurrency.lockutils [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:45:28 localhost nova_compute[279673]: 2025-11-28 09:45:28.896 279685 DEBUG oslo_concurrency.lockutils [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:45:28 localhost nova_compute[279673]: 2025-11-28 09:45:28.896 279685 DEBUG oslo_concurrency.lockutils [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:45:28 localhost nova_compute[279673]: 2025-11-28 09:45:28.897 279685 DEBUG nova.compute.resource_tracker [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 04:45:28 localhost nova_compute[279673]: 2025-11-28 09:45:28.897 279685 DEBUG oslo_concurrency.processutils [None 
req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:45:29 localhost systemd[1]: Started libpod-conmon-f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967.scope. Nov 28 04:45:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:45:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:45:29 localhost systemd[1]: tmp-crun.aqJ26p.mount: Deactivated successfully. Nov 28 04:45:29 localhost systemd[1]: Started libcrun container. Nov 28 04:45:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/91ffc49efb3c7a4f0573f98c172baf0327c9cb7db605d9d237beebdefd054f1a/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff) Nov 28 04:45:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/91ffc49efb3c7a4f0573f98c172baf0327c9cb7db605d9d237beebdefd054f1a/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff) Nov 28 04:45:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/91ffc49efb3c7a4f0573f98c172baf0327c9cb7db605d9d237beebdefd054f1a/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Nov 28 04:45:29 localhost podman[279962]: 2025-11-28 09:45:29.07530636 +0000 UTC m=+0.206430156 container init f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, container_name=nova_compute_init, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Nov 28 04:45:29 localhost podman[279962]: 2025-11-28 09:45:29.087671029 +0000 UTC m=+0.218794825 container start f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', 
'/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Nov 28 04:45:29 localhost python3.9[279932]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Applying nova statedir ownership Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436 Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/ Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436 Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0 Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/ Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436 Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0 Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/ Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 already 42436:42436 Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Setting selinux context of 
/var/lib/nova/instances/c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 to system_u:object_r:container_file_t:s0 Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/instances/c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/console.log Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/_base already 42436:42436 Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/_base to system_u:object_r:container_file_t:s0 Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/2b1009b2ab824d8ddaaa3afb1ca6ce3f88abf415 Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ephemeral_1_0706d66 Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/ Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/locks already 42436:42436 Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/locks to system_u:object_r:container_file_t:s0 Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-2b1009b2ab824d8ddaaa3afb1ca6ce3f88abf415 Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ephemeral_1_0706d66 Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute Nov 28 
04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436 Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0 Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/ Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436 Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0 Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/ Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436 Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0 Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/b234715fc878456b41e32c4fbc669b417044dbe6c6684bbc9059e5c93396ffea Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: 
/var/lib/nova/.cache/python-entrypoints/469bc4441baff9216df986857f9ff45dbf25965a8d2f755a6449ac2645cb7191 Nov 28 04:45:29 localhost nova_compute_init[280019]: INFO:nova_statedir:Nova statedir ownership complete Nov 28 04:45:29 localhost systemd[1]: libpod-f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967.scope: Deactivated successfully. Nov 28 04:45:29 localhost podman[279977]: 2025-11-28 09:45:29.17159748 +0000 UTC m=+0.138888807 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 28 04:45:29 localhost podman[280034]: 2025-11-28 09:45:29.223763719 +0000 UTC m=+0.060993410 container died 
f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 28 04:45:29 localhost podman[279988]: 2025-11-28 09:45:29.224697778 +0000 UTC m=+0.190091686 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 
'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:45:29 localhost podman[279977]: 2025-11-28 09:45:29.241451971 +0000 UTC m=+0.208743308 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 28 04:45:29 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. Nov 28 04:45:29 localhost podman[279988]: 2025-11-28 09:45:29.305109612 +0000 UTC m=+0.270503560 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:45:29 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:45:29 localhost podman[280034]: 2025-11-28 09:45:29.365858702 +0000 UTC m=+0.203088353 container cleanup f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, 
org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125) Nov 28 04:45:29 localhost systemd[1]: libpod-conmon-f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967.scope: Deactivated successfully. Nov 28 04:45:29 localhost nova_compute[279673]: 2025-11-28 09:45:29.371 279685 DEBUG oslo_concurrency.processutils [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:45:29 localhost nova_compute[279673]: 2025-11-28 09:45:29.440 279685 DEBUG nova.virt.libvirt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:45:29 localhost nova_compute[279673]: 2025-11-28 09:45:29.440 279685 DEBUG nova.virt.libvirt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:45:29 localhost nova_compute[279673]: 2025-11-28 09:45:29.664 279685 WARNING nova.virt.libvirt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 04:45:29 localhost nova_compute[279673]: 2025-11-28 09:45:29.667 279685 DEBUG nova.compute.resource_tracker [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=12110MB free_disk=41.837093353271484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", 
"product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 04:45:29 localhost nova_compute[279673]: 2025-11-28 09:45:29.667 279685 DEBUG oslo_concurrency.lockutils [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:45:29 localhost nova_compute[279673]: 2025-11-28 09:45:29.668 279685 DEBUG oslo_concurrency.lockutils [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:45:29 localhost nova_compute[279673]: 2025-11-28 09:45:29.840 279685 DEBUG nova.compute.resource_tracker [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 04:45:29 localhost nova_compute[279673]: 2025-11-28 09:45:29.841 279685 DEBUG nova.compute.resource_tracker [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 04:45:29 localhost nova_compute[279673]: 2025-11-28 09:45:29.841 279685 DEBUG nova.compute.resource_tracker [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 04:45:29 localhost nova_compute[279673]: 2025-11-28 09:45:29.901 279685 DEBUG nova.scheduler.client.report [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Refreshing inventories for resource provider 35fead26-0bad-4950-b646-987079d58a17 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Nov 28 04:45:29 localhost nova_compute[279673]: 2025-11-28 09:45:29.919 279685 DEBUG nova.scheduler.client.report [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Updating ProviderTree inventory for provider 35fead26-0bad-4950-b646-987079d58a17 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Nov 28 
04:45:29 localhost nova_compute[279673]: 2025-11-28 09:45:29.919 279685 DEBUG nova.compute.provider_tree [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Updating inventory in ProviderTree for provider 35fead26-0bad-4950-b646-987079d58a17 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 28 04:45:29 localhost nova_compute[279673]: 2025-11-28 09:45:29.932 279685 DEBUG nova.scheduler.client.report [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Refreshing aggregate associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Nov 28 04:45:29 localhost systemd[1]: var-lib-containers-storage-overlay-91ffc49efb3c7a4f0573f98c172baf0327c9cb7db605d9d237beebdefd054f1a-merged.mount: Deactivated successfully. Nov 28 04:45:29 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f45a1302bc34e54c92d7122c0aac4751e2c90e7e3b430604aa748a383ee0a967-userdata-shm.mount: Deactivated successfully. 
Nov 28 04:45:29 localhost nova_compute[279673]: 2025-11-28 09:45:29.971 279685 DEBUG nova.scheduler.client.report [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Refreshing trait associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, traits: COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_FMA3,HW_CPU_X86_BMI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_ABM,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AMD_SVM,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_CLMUL,HW_CPU_X86_BMI2,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Nov 28 04:45:30 localhost nova_compute[279673]: 2025-11-28 09:45:30.014 279685 DEBUG oslo_concurrency.processutils [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Running cmd (subprocess): ceph 
df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:45:30 localhost nova_compute[279673]: 2025-11-28 09:45:30.492 279685 DEBUG oslo_concurrency.processutils [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:45:30 localhost nova_compute[279673]: 2025-11-28 09:45:30.499 279685 DEBUG nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N Nov 28 04:45:30 localhost nova_compute[279673]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m Nov 28 04:45:30 localhost nova_compute[279673]: 2025-11-28 09:45:30.500 279685 INFO nova.virt.libvirt.host [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] kernel doesn't support AMD SEV#033[00m Nov 28 04:45:30 localhost nova_compute[279673]: 2025-11-28 09:45:30.502 279685 DEBUG nova.compute.provider_tree [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 04:45:30 localhost nova_compute[279673]: 2025-11-28 09:45:30.503 279685 DEBUG nova.virt.libvirt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Nov 28 04:45:30 localhost nova_compute[279673]: 2025-11-28 09:45:30.527 279685 DEBUG nova.scheduler.client.report [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Inventory has not changed for provider 
35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 04:45:30 localhost nova_compute[279673]: 2025-11-28 09:45:30.555 279685 DEBUG nova.compute.resource_tracker [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 04:45:30 localhost nova_compute[279673]: 2025-11-28 09:45:30.556 279685 DEBUG oslo_concurrency.lockutils [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.888s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:45:30 localhost nova_compute[279673]: 2025-11-28 09:45:30.556 279685 DEBUG nova.service [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m Nov 28 04:45:30 localhost systemd[1]: session-59.scope: Deactivated successfully. Nov 28 04:45:30 localhost systemd[1]: session-59.scope: Consumed 1min 29.582s CPU time. Nov 28 04:45:30 localhost systemd-logind[764]: Session 59 logged out. Waiting for processes to exit. Nov 28 04:45:30 localhost systemd-logind[764]: Removed session 59. 
Nov 28 04:45:30 localhost nova_compute[279673]: 2025-11-28 09:45:30.585 279685 DEBUG nova.service [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m Nov 28 04:45:30 localhost nova_compute[279673]: 2025-11-28 09:45:30.586 279685 DEBUG nova.servicegroup.drivers.db [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] DB_Driver: join new ServiceGroup member np0005538513.localdomain to the compute group, service = join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m Nov 28 04:45:31 localhost nova_compute[279673]: 2025-11-28 09:45:31.483 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:45:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 04:45:31 localhost systemd[1]: tmp-crun.ruTr97.mount: Deactivated successfully. 
Nov 28 04:45:31 localhost podman[280122]: 2025-11-28 09:45:31.863313794 +0000 UTC m=+0.097948032 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Nov 28 04:45:31 localhost podman[280122]: 2025-11-28 09:45:31.903780133 +0000 UTC m=+0.138414381 container exec_died 
1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true) Nov 28 04:45:31 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 04:45:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49116 DF PROTO=TCP SPT=47508 DPT=9102 SEQ=3958754893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2931840000000001030307) Nov 28 04:45:32 localhost nova_compute[279673]: 2025-11-28 09:45:32.942 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:45:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49117 DF PROTO=TCP SPT=47508 DPT=9102 SEQ=3958754893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2935830000000001030307) Nov 28 04:45:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2457 DF PROTO=TCP SPT=34262 DPT=9102 SEQ=3361221625 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2937830000000001030307) Nov 28 04:45:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49118 DF PROTO=TCP SPT=47508 DPT=9102 SEQ=3958754893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB293D830000000001030307) Nov 28 04:45:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7604 DF PROTO=TCP SPT=33082 DPT=9102 SEQ=3782079456 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2941820000000001030307) Nov 28 04:45:36 localhost nova_compute[279673]: 2025-11-28 09:45:36.486 279685 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:45:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 04:45:37 localhost podman[280141]: 2025-11-28 09:45:37.869775841 +0000 UTC m=+0.104046399 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 28 04:45:37 localhost podman[280141]: 2025-11-28 09:45:37.906543607 +0000 UTC m=+0.140814165 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 04:45:37 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 04:45:37 localhost nova_compute[279673]: 2025-11-28 09:45:37.979 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:45:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49119 DF PROTO=TCP SPT=47508 DPT=9102 SEQ=3958754893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB294D420000000001030307) Nov 28 04:45:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 04:45:39 localhost podman[280162]: 2025-11-28 09:45:39.841763952 +0000 UTC m=+0.077152865 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible) Nov 28 04:45:39 localhost podman[280162]: 2025-11-28 09:45:39.877500107 +0000 UTC m=+0.112888970 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', 
'/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:45:39 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. Nov 28 04:45:40 localhost podman[238687]: time="2025-11-28T09:45:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 04:45:40 localhost podman[238687]: @ - - [28/Nov/2025:09:45:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1" Nov 28 04:45:40 localhost podman[238687]: @ - - [28/Nov/2025:09:45:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17224 "" "Go-http-client/1.1" Nov 28 04:45:41 localhost nova_compute[279673]: 2025-11-28 09:45:41.521 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:45:43 localhost nova_compute[279673]: 2025-11-28 09:45:43.012 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:45:46 localhost nova_compute[279673]: 2025-11-28 09:45:46.523 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:45:47 localhost ovn_metadata_agent[158125]: 2025-11-28 09:45:47.611 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to 
row=SB_Global(external_ids={}, nb_cfg=6, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 04:45:47 localhost ovn_metadata_agent[158125]: 2025-11-28 09:45:47.612 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 28 04:45:47 localhost nova_compute[279673]: 2025-11-28 09:45:47.647 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:45:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49120 DF PROTO=TCP SPT=47508 DPT=9102 SEQ=3958754893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB296D820000000001030307) Nov 28 04:45:48 localhost nova_compute[279673]: 2025-11-28 09:45:48.014 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:45:48 localhost openstack_network_exporter[240658]: ERROR 09:45:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 04:45:48 localhost openstack_network_exporter[240658]: ERROR 09:45:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:45:48 localhost openstack_network_exporter[240658]: ERROR 09:45:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 
04:45:48 localhost openstack_network_exporter[240658]: ERROR 09:45:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 04:45:48 localhost openstack_network_exporter[240658]: Nov 28 04:45:48 localhost openstack_network_exporter[240658]: ERROR 09:45:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 04:45:48 localhost openstack_network_exporter[240658]: Nov 28 04:45:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 04:45:49 localhost podman[280181]: 2025-11-28 09:45:49.844451604 +0000 UTC m=+0.081804708 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, release=1755695350, vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_id=edpm) Nov 28 04:45:49 localhost podman[280181]: 2025-11-28 09:45:49.881059585 +0000 UTC m=+0.118412709 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base 
Image 9 Minimal, vcs-type=git, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, release=1755695350, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., config_id=edpm, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers) Nov 28 04:45:49 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. Nov 28 04:45:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:45:50.824 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:45:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:45:50.824 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:45:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:45:50.825 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:45:51 localhost nova_compute[279673]: 2025-11-28 09:45:51.525 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:45:52 localhost ovn_metadata_agent[158125]: 2025-11-28 09:45:52.614 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 04:45:53 localhost nova_compute[279673]: 2025-11-28 
09:45:53.017 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:45:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 04:45:55 localhost systemd[1]: tmp-crun.CqC34Q.mount: Deactivated successfully. Nov 28 04:45:55 localhost podman[280199]: 2025-11-28 09:45:55.856682457 +0000 UTC m=+0.090565306 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 28 04:45:55 localhost podman[280199]: 2025-11-28 09:45:55.864429054 +0000 UTC m=+0.098311853 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 04:45:55 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 04:45:56 localhost nova_compute[279673]: 2025-11-28 09:45:56.530 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:45:58 localhost nova_compute[279673]: 2025-11-28 09:45:58.053 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:45:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:45:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. 
Nov 28 04:45:59 localhost podman[280224]: 2025-11-28 09:45:59.848895758 +0000 UTC m=+0.080584541 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:45:59 localhost podman[280224]: 2025-11-28 09:45:59.857328506 +0000 UTC 
m=+0.089017299 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 28 04:45:59 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 04:45:59 localhost podman[280223]: 2025-11-28 09:45:59.904238074 +0000 UTC m=+0.140737104 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:45:59 localhost podman[280223]: 2025-11-28 09:45:59.945533878 +0000 UTC m=+0.182032908 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:45:59 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.671 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.671 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.672 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.713 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 73981952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.714 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:46:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69795790-826f-4bef-a74b-eb3f7175364c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73981952, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:46:00.672169', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0c237468-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.844066844, 'message_signature': '8b7ef05c8063d77fe61d50f6e9481e9000bd035efd5656c63336dbfb83da9490'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': 
'2025-11-28T09:46:00.672169', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0c238872-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.844066844, 'message_signature': '78d95ddc81d06d2747dc15175db1bbc24a6e6572e9ff8709400fbcbcc8a06e2c'}]}, 'timestamp': '2025-11-28 09:46:00.714879', '_unique_id': '0ce8be25a6fb40f099f3332d0a798b57'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.716 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.717 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.721 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f8daddac-88b0-4d90-8a22-bf55f6417c18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:46:00.717926', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '0c24a310-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.889866817, 'message_signature': '523bded1def110a37da60e221beb7ef334eab4dc793805692c44ea470b88f468'}]}, 'timestamp': '2025-11-28 09:46:00.722172', '_unique_id': '204814ff4323457ca860467356b9698f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:46:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.723 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.724 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.724 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 96 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '86b838e6-9c7d-44d2-a208-3495cb2c814e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 96, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:46:00.724349', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '0c250bca-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.889866817, 'message_signature': '0b097dad012fbeeb49350ed6ee606448c1c13c2e8f5eb0b18a606ca545049281'}]}, 'timestamp': '2025-11-28 09:46:00.724844', '_unique_id': '73d1b305818f4770930368a08e11cd33'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.725 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.726 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.727 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 13176 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '135cf464-05ba-401a-b896-e83f20a13754', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 13176, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:46:00.726977', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '0c2573ee-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.889866817, 'message_signature': '01bc09f7c39a44f7d3d5b7b05fbc69788a8962f7ad8590931e334806d236a0df'}]}, 'timestamp': '2025-11-28 09:46:00.727477', '_unique_id': '5998fc29cbe04b2a89e71a2743c15f4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.728 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.729 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.742 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.742 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '91e6f2da-48d7-465c-942b-556c4834ebd9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:46:00.729586', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0c27cf40-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.901483573, 'message_signature': '386a3c2fd828ea846f4ad3e78f703b73fffc8b31bdfce2e06436586655f36e42'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:46:00.729586', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0c27e516-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.901483573, 'message_signature': 'ead8d58a284c557a3d05df491689badecf643c8bbf2f22ca20258bfb3770a485'}]}, 'timestamp': '2025-11-28 09:46:00.743476', '_unique_id': '29b44f8bbcb64680987a3d56a4a1f0b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:46:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.744 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.745 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.745 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1313024378 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.746 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 168520223 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'cc69c382-3319-4ce6-9aa0-58968821ea84', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1313024378, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:46:00.745708', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0c284eb6-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.844066844, 'message_signature': '3258451ffdb1ccb82f61e4b50c6ec0b7658c7b9680113c8178273157093f70c5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 168520223, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:46:00.745708', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0c2860ae-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.844066844, 'message_signature': '2d14517fd5281626c14e578e009e6b2a458f93b86179f9ce674b8aa61f803360'}]}, 'timestamp': '2025-11-28 09:46:00.746616', '_unique_id': '431e20af5f0e492988b5d901faa69230'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:46:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:46:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.747 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.748 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.748 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5f7dff75-0abd-4047-8337-d18e48d3d201', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:46:00.748762', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '0c28c558-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.889866817, 'message_signature': 'd6b660a35bb0126b13d9b57b3e2833700d4cb51855de781a456ae36583eb8044'}]}, 'timestamp': '2025-11-28 09:46:00.749254', '_unique_id': '9d539671901a44ba90bfd0f7c8548da1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:46:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.750 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.751 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.751 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.751 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8c7c9666-8c20-44f1-a894-e68a81711013', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:46:00.751330', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0c292944-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.844066844, 'message_signature': 'ef9311d13208d97796d31b5547c6582d009cb9ef84cd04c3646af0abd8a316e8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:46:00.751330', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0c29395c-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.844066844, 'message_signature': '9c01dff6c6fc517c826e15727d864b6e62049c94194aee090e98ce30de956210'}]}, 'timestamp': '2025-11-28 09:46:00.752197', '_unique_id': '18da2b7e742e4610ae49c17ce8d0f546'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.753 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.754 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.754 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c78dc005-4d73-465b-b394-117f56e550a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:46:00.754342', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '0c299f1e-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.889866817, 'message_signature': '90ba47ec2fcb6bb9494767464a50dbf3eb8a2cbe18a14cb22b74c3e761dea17b'}]}, 'timestamp': '2025-11-28 09:46:00.754796', '_unique_id': '739e6e622fe84f8ca736e22a404b122c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.755 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.756 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.756 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.757 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 434 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a1bd1b9-4f7f-4b23-acbb-e63eb9bfe398', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 434, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:46:00.757051', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '0c2a09ae-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.889866817, 'message_signature': '085cf82ef6fddfb905e509bc13f0c60047366c3b89938156e9f72aa2742e05df'}]}, 'timestamp': '2025-11-28 09:46:00.757529', '_unique_id': '77e2696f88d74e49aaa5bfb27cc6322f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:46:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.758 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.759 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.759 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.760 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b7c66d8-cabd-4d9d-83ce-7c5ccc404daa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:46:00.759592', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0c2a6bec-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.901483573, 'message_signature': '321b2f59f821755c10aed941ca6d150215d1def685e8f3e9b00ddd0fb03707c5'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:46:00.759592', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0c2a7d62-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.901483573, 'message_signature': '56c337727df1cf2d47d45e12e61f2ee77524135afd3810e9753c9db6330adf8f'}]}, 'timestamp': '2025-11-28 09:46:00.760461', '_unique_id': '7873be90d23b40b2b8b64f064b903aaf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:46:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:46:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.761 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.762 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.762 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '138178a3-31e3-44f8-8bd3-0ada4a9f7d3f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:46:00.762603', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '0c2ae1e4-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.889866817, 'message_signature': '43ed3b5583aa0630e8e15ff57c1e60bc64e30d37eab8101126cb225968767ab6'}]}, 'timestamp': '2025-11-28 09:46:00.763104', '_unique_id': '01aab747f41d4fc3928edd4add67a1ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:46:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:46:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:46:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.764 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.765 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.765 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 10055 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1cfc7ae2-9b36-44fe-8586-ca26c39987c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10055, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:46:00.765297', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '0c2b4b0c-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.889866817, 'message_signature': 'e793db2d08de9654940dd4947743122ec9dc2f078207c8e20eae19054dae3f95'}]}, 'timestamp': '2025-11-28 09:46:00.765753', '_unique_id': 'ee33acd72e534df99b03bc81a52ae093'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:46:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:46:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.766 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.767 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.767 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 149 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bfbaeb64-9e2e-44ad-ad5a-af01ddb2ad42', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 149, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:46:00.767822', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '0c2baed0-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.889866817, 'message_signature': '3b63c6e0de31a770bd415a42c4f26756d21758aa801d4f95c91bdba5d8bf4c86'}]}, 'timestamp': '2025-11-28 09:46:00.768310', '_unique_id': '44c0e3ce270f4dbb82720bbc4a3ec6d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:46:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 
04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.769 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.770 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.770 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.770 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'dc9dced6-5aaa-44e0-9187-e911f856fd82', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:46:00.770410', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0c2c12e4-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.901483573, 'message_signature': '12faf3e2fefc2de52e586590577ae527febaae7df5b9a199bebd5ddb92f791cc'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:46:00.770410', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 
'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0c2c2414-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.901483573, 'message_signature': 'e151b9572d2b55bad3e0bcaf00ed0a0b0b56f5047e94f5281708e4de1b351a5e'}]}, 'timestamp': '2025-11-28 09:46:00.771283', '_unique_id': '50f8c5ef8beb4d0187772979fe9e1a9e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:46:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:46:00.772 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:46:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:46:00.772 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.772 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.773 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.795 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 51860000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '15209a9c-d1dd-475f-b66c-c99748c0cff1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 51860000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:46:00.773457', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '0c2ff616-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.96762273, 'message_signature': '0fc68ce59dd98b4d64c4e98689528022d6a466d5630b0ada185cd02cbfdff1e8'}]}, 'timestamp': '2025-11-28 09:46:00.796337', '_unique_id': '893573b226c44ee5877cfb6f66343a35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 
04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:46:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.797 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.798 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 28 04:46:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.798 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.799 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '352549ca-b39a-47a1-b588-077b2998cafe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:46:00.798691', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 
'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0c306358-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.844066844, 'message_signature': '203466ffe0361555e440b291c3eb28d23170ddf4e7c2fc7fe419273657149c0d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:46:00.798691', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0c30762c-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.844066844, 'message_signature': '63bff0a7175da3fd27af2c25e3cc9275eb9caf1dc5d95674e6d95881baf9c312'}]}, 'timestamp': '2025-11-28 09:46:00.799596', '_unique_id': '643908a4add3432aa4f948515d1457df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in 
establish_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:46:00.800 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.800 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.801 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.801 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 52.40234375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c11e75d3-0419-4a2c-bb6e-3251deb091d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.40234375, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:46:00.801750', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '0c30dacc-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.96762273, 'message_signature': '0206c2771afd58b28758b00ac77c2fc5e47beeac372a012b0fa44549025ea87e'}]}, 'timestamp': '2025-11-28 09:46:00.802213', '_unique_id': '35fef72b6feb4fe1ae76afea5fa17f11'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.803 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.804 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.804 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 566 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.805 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3152f87a-bba1-4041-adce-ff07282fbdb7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 566, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:46:00.804561', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0c31499e-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.844066844, 'message_signature': 'b0fe6014b30681eb1274db46ed7f63d01c94f0b6877166cea8645953e2d89715'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:46:00.804561', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0c315b46-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.844066844, 'message_signature': 'bb97819670396f47e9cf4256d43bcb6e56978762efaac2e0d10c5619d7b2df5c'}]}, 'timestamp': '2025-11-28 09:46:00.805461', '_unique_id': 'a454df7b4c884cde883753e710a32300'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.806 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.807 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.807 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 305908425 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.808 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 30452399 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4cab442-3178-471e-ab10-31d1d80bcc27', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 305908425, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:46:00.807770', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0c31c61c-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.844066844, 'message_signature': '4e476332997d5399ad1a893f11b455ae6553d57ce26c81ca916d882575d49f6b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 30452399, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:46:00.807770', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0c31d79c-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.844066844, 'message_signature': 'e0a7338b9a9a234e642b96d422c18d37a62ae589975bf5932bc858b69a590e34'}]}, 'timestamp': '2025-11-28 09:46:00.808642', '_unique_id': 'b77a29afbf7d437b83876e12d67f66bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.809 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.810 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.810 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.810 12 DEBUG ceilometer.compute.pollsters [-] 
c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 434 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ab9d566-b352-4a24-8c62-5cd93c69ad65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 434, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:46:00.810917', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '0c32425e-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 10994.889866817, 'message_signature': '570ce8a0f62c3048d0663cd2c42eeae1829ce851dacf64f3c758c2e244d2812e'}]}, 'timestamp': '2025-11-28 
09:46:00.811407', '_unique_id': 'e07ff4235af844439981cf48b30cd636'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in 
_establish_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR 
oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:46:00.813 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:46:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.813 12 ERROR oslo_messaging.notify.messaging Nov 28 04:46:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:46:00.814 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 04:46:01 localhost nova_compute[279673]: 2025-11-28 09:46:01.567 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52243 DF PROTO=TCP SPT=41230 DPT=9102 SEQ=2153373983 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB29A6B40000000001030307) Nov 28 04:46:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. 
Nov 28 04:46:02 localhost podman[280264]: 2025-11-28 09:46:02.85486874 +0000 UTC m=+0.082971553 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Nov 28 04:46:02 localhost podman[280264]: 2025-11-28 09:46:02.870569471 +0000 UTC m=+0.098672244 container exec_died 
1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS) Nov 28 04:46:02 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 04:46:03 localhost nova_compute[279673]: 2025-11-28 09:46:03.145 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52244 DF PROTO=TCP SPT=41230 DPT=9102 SEQ=2153373983 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB29AAC20000000001030307) Nov 28 04:46:03 localhost nova_compute[279673]: 2025-11-28 09:46:03.898 279685 DEBUG nova.compute.manager [None req-0ea6a6a1-b9ca-4efa-989e-4c0fd01ead00 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 28 04:46:03 localhost nova_compute[279673]: 2025-11-28 09:46:03.904 279685 INFO nova.compute.manager [None req-0ea6a6a1-b9ca-4efa-989e-4c0fd01ead00 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Retrieving diagnostics#033[00m Nov 28 04:46:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49121 DF PROTO=TCP SPT=47508 DPT=9102 SEQ=3958754893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB29AD830000000001030307) Nov 28 04:46:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52245 DF PROTO=TCP SPT=41230 DPT=9102 SEQ=2153373983 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB29B2C20000000001030307) Nov 28 04:46:06 localhost kernel: DROPPING: IN=br-ex OUT= 
MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2458 DF PROTO=TCP SPT=34262 DPT=9102 SEQ=3361221625 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB29B5820000000001030307) Nov 28 04:46:06 localhost nova_compute[279673]: 2025-11-28 09:46:06.569 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:08 localhost nova_compute[279673]: 2025-11-28 09:46:08.185 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 04:46:08 localhost podman[280284]: 2025-11-28 09:46:08.858650814 +0000 UTC m=+0.087572234 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 
'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 04:46:08 localhost podman[280284]: 2025-11-28 09:46:08.895963077 +0000 UTC m=+0.124884497 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 04:46:08 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: 
Deactivated successfully. Nov 28 04:46:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52246 DF PROTO=TCP SPT=41230 DPT=9102 SEQ=2153373983 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB29C2820000000001030307) Nov 28 04:46:09 localhost nova_compute[279673]: 2025-11-28 09:46:09.551 279685 DEBUG oslo_concurrency.lockutils [None req-1537ba4d-75cc-4878-bfd6-c825657acb5e 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Acquiring lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" by "nova.compute.manager.ComputeManager.stop_instance..do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:46:09 localhost nova_compute[279673]: 2025-11-28 09:46:09.552 279685 DEBUG oslo_concurrency.lockutils [None req-1537ba4d-75cc-4878-bfd6-c825657acb5e 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" acquired by "nova.compute.manager.ComputeManager.stop_instance..do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:46:09 localhost nova_compute[279673]: 2025-11-28 09:46:09.552 279685 DEBUG nova.compute.manager [None req-1537ba4d-75cc-4878-bfd6-c825657acb5e 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 28 04:46:09 localhost nova_compute[279673]: 2025-11-28 09:46:09.556 279685 DEBUG nova.compute.manager [None req-1537ba4d-75cc-4878-bfd6-c825657acb5e 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Stopping instance; current 
vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m Nov 28 04:46:09 localhost nova_compute[279673]: 2025-11-28 09:46:09.561 279685 DEBUG nova.objects.instance [None req-1537ba4d-75cc-4878-bfd6-c825657acb5e 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Lazy-loading 'flavor' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 04:46:09 localhost nova_compute[279673]: 2025-11-28 09:46:09.610 279685 DEBUG nova.virt.libvirt.driver [None req-1537ba4d-75cc-4878-bfd6-c825657acb5e 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m Nov 28 04:46:10 localhost podman[238687]: time="2025-11-28T09:46:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 04:46:10 localhost podman[238687]: @ - - [28/Nov/2025:09:46:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148612 "" "Go-http-client/1.1" Nov 28 04:46:10 localhost podman[238687]: @ - - [28/Nov/2025:09:46:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17225 "" "Go-http-client/1.1" Nov 28 04:46:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. 
Nov 28 04:46:10 localhost podman[280307]: 2025-11-28 09:46:10.845769479 +0000 UTC m=+0.084023096 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:46:10 localhost podman[280307]: 2025-11-28 09:46:10.881964358 +0000 UTC m=+0.120217985 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible) Nov 28 04:46:10 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 04:46:11 localhost nova_compute[279673]: 2025-11-28 09:46:11.602 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:12 localhost kernel: device tap09612b07-51 left promiscuous mode Nov 28 04:46:12 localhost NetworkManager[5967]: [1764323172.1172] device (tap09612b07-51): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed') Nov 28 04:46:12 localhost nova_compute[279673]: 2025-11-28 09:46:12.131 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:12 localhost ovn_controller[152322]: 2025-11-28T09:46:12Z|00049|binding|INFO|Releasing lport 09612b07-5142-4b0f-9dab-74bf4403f69f from this chassis (sb_readonly=0) Nov 28 04:46:12 localhost ovn_controller[152322]: 2025-11-28T09:46:12Z|00050|binding|INFO|Setting lport 09612b07-5142-4b0f-9dab-74bf4403f69f down in Southbound Nov 28 04:46:12 localhost ovn_controller[152322]: 2025-11-28T09:46:12Z|00051|binding|INFO|Removing iface tap09612b07-51 ovn-installed in OVS Nov 28 04:46:12 localhost nova_compute[279673]: 2025-11-28 09:46:12.134 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:12 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:12.146 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:fc:6c 192.168.0.142'], port_security=['fa:16:3e:f4:fc:6c 192.168.0.142'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.142/24', 'neutron:device_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005538513.localdomain', 'neutron:mtu': '', 'neutron:network_name': 'neutron-40d5da59-6201-424a-8380-80ecc3d67c7e', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '9dda653c53224db086060962b0702694', 'neutron:revision_number': '7', 'neutron:security_group_ids': '6d2c5a31-c9e5-413a-bccf-f97c7687bd94 b3c60f08-3369-426b-b744-9cef04caaa7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f3122580-f73f-40fa-a838-6bad2ff9da2f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=09612b07-5142-4b0f-9dab-74bf4403f69f) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 04:46:12 localhost nova_compute[279673]: 2025-11-28 09:46:12.148 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:12 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:12.148 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 09612b07-5142-4b0f-9dab-74bf4403f69f in datapath 40d5da59-6201-424a-8380-80ecc3d67c7e unbound from our chassis#033[00m Nov 28 04:46:12 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:12.150 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 40d5da59-6201-424a-8380-80ecc3d67c7e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 04:46:12 localhost ovn_controller[152322]: 2025-11-28T09:46:12Z|00052|ovn_bfd|INFO|Disabled BFD on interface ovn-07900d-0 Nov 28 04:46:12 localhost ovn_controller[152322]: 
2025-11-28T09:46:12Z|00053|ovn_bfd|INFO|Disabled BFD on interface ovn-c3237d-0 Nov 28 04:46:12 localhost ovn_controller[152322]: 2025-11-28T09:46:12Z|00054|ovn_bfd|INFO|Disabled BFD on interface ovn-11aa47-0 Nov 28 04:46:12 localhost ovn_controller[152322]: 2025-11-28T09:46:12Z|00055|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 04:46:12 localhost nova_compute[279673]: 2025-11-28 09:46:12.154 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:12 localhost systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully. Nov 28 04:46:12 localhost systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 3min 37.832s CPU time. Nov 28 04:46:12 localhost nova_compute[279673]: 2025-11-28 09:46:12.162 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:12 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:12.163 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[4820d0d9-7e2f-41c0-b3ba-1697627cf918]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:46:12 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:12.167 158130 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e namespace which is not needed anymore#033[00m Nov 28 04:46:12 localhost systemd-machined[83422]: Machine qemu-1-instance-00000002 terminated. 
Nov 28 04:46:12 localhost nova_compute[279673]: 2025-11-28 09:46:12.197 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:12 localhost ovn_controller[152322]: 2025-11-28T09:46:12Z|00056|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 04:46:12 localhost nova_compute[279673]: 2025-11-28 09:46:12.207 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:12 localhost systemd[1]: libpod-9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef.scope: Deactivated successfully. Nov 28 04:46:12 localhost podman[280352]: 2025-11-28 09:46:12.379196623 +0000 UTC m=+0.082845899 container died 9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12) Nov 28 04:46:12 localhost podman[280352]: 2025-11-28 09:46:12.567299387 +0000 UTC m=+0.270948603 container cleanup 9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Nov 28 04:46:12 localhost podman[280374]: 2025-11-28 
09:46:12.581625485 +0000 UTC m=+0.188969411 container cleanup 9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, io.openshift.expose-services=) Nov 28 04:46:12 localhost systemd[1]: libpod-conmon-9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef.scope: Deactivated successfully. 
Nov 28 04:46:12 localhost nova_compute[279673]: 2025-11-28 09:46:12.631 279685 INFO nova.virt.libvirt.driver [None req-1537ba4d-75cc-4878-bfd6-c825657acb5e 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Instance shutdown successfully after 3 seconds.#033[00m Nov 28 04:46:12 localhost nova_compute[279673]: 2025-11-28 09:46:12.641 279685 INFO nova.virt.libvirt.driver [-] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Instance destroyed successfully.#033[00m Nov 28 04:46:12 localhost nova_compute[279673]: 2025-11-28 09:46:12.641 279685 DEBUG nova.objects.instance [None req-1537ba4d-75cc-4878-bfd6-c825657acb5e 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Lazy-loading 'numa_topology' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 04:46:12 localhost nova_compute[279673]: 2025-11-28 09:46:12.672 279685 DEBUG nova.compute.manager [None req-1537ba4d-75cc-4878-bfd6-c825657acb5e 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 28 04:46:12 localhost podman[280393]: 2025-11-28 09:46:12.675165991 +0000 UTC m=+0.084346685 container remove 9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, release=1761123044, distribution-scope=public, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Nov 28 04:46:12 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:12.680 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[c14db70d-78af-48a1-84fc-62e817cae636]: (4, ('Fri Nov 28 09:46:12 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e (9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef)\n9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef\nFri Nov 28 09:46:12 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e (9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef)\n9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:46:12 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:12.682 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[21a9fbfe-2952-42ee-bb21-3c7e39aa4540]: (4, None) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:46:12 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:12.683 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap40d5da59-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 04:46:12 localhost kernel: device tap40d5da59-60 left promiscuous mode Nov 28 04:46:12 localhost nova_compute[279673]: 2025-11-28 09:46:12.731 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:12 localhost nova_compute[279673]: 2025-11-28 09:46:12.735 279685 DEBUG nova.compute.manager [req-4f08f0dc-612c-4385-be28-ef925e539829 req-2e976fac-48e1-4662-a489-6685eb8d4499 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Received event network-vif-unplugged-09612b07-5142-4b0f-9dab-74bf4403f69f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Nov 28 04:46:12 localhost nova_compute[279673]: 2025-11-28 09:46:12.736 279685 DEBUG oslo_concurrency.lockutils [req-4f08f0dc-612c-4385-be28-ef925e539829 req-2e976fac-48e1-4662-a489-6685eb8d4499 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:46:12 localhost nova_compute[279673]: 2025-11-28 09:46:12.736 279685 DEBUG oslo_concurrency.lockutils [req-4f08f0dc-612c-4385-be28-ef925e539829 req-2e976fac-48e1-4662-a489-6685eb8d4499 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock 
"c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:46:12 localhost nova_compute[279673]: 2025-11-28 09:46:12.737 279685 DEBUG oslo_concurrency.lockutils [req-4f08f0dc-612c-4385-be28-ef925e539829 req-2e976fac-48e1-4662-a489-6685eb8d4499 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:46:12 localhost nova_compute[279673]: 2025-11-28 09:46:12.737 279685 DEBUG nova.compute.manager [req-4f08f0dc-612c-4385-be28-ef925e539829 req-2e976fac-48e1-4662-a489-6685eb8d4499 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] No waiting events found dispatching network-vif-unplugged-09612b07-5142-4b0f-9dab-74bf4403f69f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Nov 28 04:46:12 localhost nova_compute[279673]: 2025-11-28 09:46:12.738 279685 WARNING nova.compute.manager [req-4f08f0dc-612c-4385-be28-ef925e539829 req-2e976fac-48e1-4662-a489-6685eb8d4499 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Received unexpected event network-vif-unplugged-09612b07-5142-4b0f-9dab-74bf4403f69f for instance with vm_state active and task_state powering-off.#033[00m Nov 28 04:46:12 localhost nova_compute[279673]: 2025-11-28 09:46:12.743 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:12 localhost ovn_metadata_agent[158125]: 
2025-11-28 09:46:12.749 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[15198271-8fdd-47bf-ab40-06127765ab2a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:46:12 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:12.766 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[bc040bb8-8af1-4cf2-88be-381b34075c0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:46:12 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:12.768 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[4cfc5912-5099-45ab-90ca-ae40740bb8b1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:46:12 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:12.784 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[730baa0a-cdf1-4d49-8382-e17d4c69f412]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 
'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662040, 'reachable_time': 20124, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 
'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 280412, 'error': None, 'target': 'ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:46:12 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:12.794 158264 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e deleted. 
remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m Nov 28 04:46:12 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:12.795 158264 DEBUG oslo.privsep.daemon [-] privsep: reply[b58c1bb3-d57d-4548-9296-dd6fddf99439]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:46:12 localhost nova_compute[279673]: 2025-11-28 09:46:12.804 279685 DEBUG oslo_concurrency.lockutils [None req-1537ba4d-75cc-4878-bfd6-c825657acb5e 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" "released" by "nova.compute.manager.ComputeManager.stop_instance..do_stop_instance" :: held 3.253s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:46:13 localhost nova_compute[279673]: 2025-11-28 09:46:13.187 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:13 localhost systemd[1]: var-lib-containers-storage-overlay-b264a93705d5a28ba8f902d268499c1bea32890d992fb54a7c6890490d1eeb3f-merged.mount: Deactivated successfully. Nov 28 04:46:13 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9429b40bec6298d8707c7d22e14abb78b10b40e119349c163383e8143bb88fef-userdata-shm.mount: Deactivated successfully. Nov 28 04:46:13 localhost systemd[1]: run-netns-ovnmeta\x2d40d5da59\x2d6201\x2d424a\x2d8380\x2d80ecc3d67c7e.mount: Deactivated successfully. 
Nov 28 04:46:14 localhost nova_compute[279673]: 2025-11-28 09:46:14.588 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:46:14 localhost nova_compute[279673]: 2025-11-28 09:46:14.617 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Triggering sync for uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Nov 28 04:46:14 localhost nova_compute[279673]: 2025-11-28 09:46:14.618 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:46:14 localhost nova_compute[279673]: 2025-11-28 09:46:14.619 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:46:14 localhost nova_compute[279673]: 2025-11-28 09:46:14.619 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:46:14 localhost nova_compute[279673]: 2025-11-28 09:46:14.652 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock 
"c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.033s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:46:14 localhost nova_compute[279673]: 2025-11-28 09:46:14.758 279685 DEBUG nova.compute.manager [req-d911f796-b120-46d0-9e51-57e02329e71d req-2c2b322c-3821-4542-a7ea-9674aa755611 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Received event network-vif-plugged-09612b07-5142-4b0f-9dab-74bf4403f69f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Nov 28 04:46:14 localhost nova_compute[279673]: 2025-11-28 09:46:14.758 279685 DEBUG oslo_concurrency.lockutils [req-d911f796-b120-46d0-9e51-57e02329e71d req-2c2b322c-3821-4542-a7ea-9674aa755611 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:46:14 localhost nova_compute[279673]: 2025-11-28 09:46:14.759 279685 DEBUG oslo_concurrency.lockutils [req-d911f796-b120-46d0-9e51-57e02329e71d req-2c2b322c-3821-4542-a7ea-9674aa755611 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:46:14 localhost nova_compute[279673]: 2025-11-28 09:46:14.759 279685 DEBUG oslo_concurrency.lockutils [req-d911f796-b120-46d0-9e51-57e02329e71d req-2c2b322c-3821-4542-a7ea-9674aa755611 0d543a6dcb564de5b39062ca08440499 
e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:46:14 localhost nova_compute[279673]: 2025-11-28 09:46:14.760 279685 DEBUG nova.compute.manager [req-d911f796-b120-46d0-9e51-57e02329e71d req-2c2b322c-3821-4542-a7ea-9674aa755611 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] No waiting events found dispatching network-vif-plugged-09612b07-5142-4b0f-9dab-74bf4403f69f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Nov 28 04:46:14 localhost nova_compute[279673]: 2025-11-28 09:46:14.760 279685 WARNING nova.compute.manager [req-d911f796-b120-46d0-9e51-57e02329e71d req-2c2b322c-3821-4542-a7ea-9674aa755611 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Received unexpected event network-vif-plugged-09612b07-5142-4b0f-9dab-74bf4403f69f for instance with vm_state stopped and task_state None.#033[00m Nov 28 04:46:16 localhost nova_compute[279673]: 2025-11-28 09:46:16.009 279685 DEBUG nova.compute.manager [None req-1366b4cf-2d04-4af5-9fd8-70514cd5e7d7 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 28 04:46:16 localhost nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server [None req-1366b4cf-2d04-4af5-9fd8-70514cd5e7d7 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Exception during message handling: nova.exception.InstanceInvalidState: Instance 
c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 in power state shutdown. Cannot get_diagnostics while the instance is in this state. Nov 28 04:46:16 localhost nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server Traceback (most recent call last): Nov 28 04:46:16 localhost nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming Nov 28 04:46:16 localhost nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) Nov 28 04:46:16 localhost nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch Nov 28 04:46:16 localhost nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) Nov 28 04:46:16 localhost nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch Nov 28 04:46:16 localhost nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) Nov 28 04:46:16 localhost nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped Nov 28 04:46:16 localhost nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server _emit_versioned_exception_notification( Nov 28 04:46:16 localhost nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Nov 28 04:46:16 localhost nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR 
oslo_messaging.rpc.server self.force_reraise() Nov 28 04:46:16 localhost nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Nov 28 04:46:16 localhost nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server raise self.value Nov 28 04:46:16 localhost nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped Nov 28 04:46:16 localhost nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) Nov 28 04:46:16 localhost nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 214, in decorated_function Nov 28 04:46:16 localhost nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server compute_utils.add_instance_fault_from_exc(context, Nov 28 04:46:16 localhost nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Nov 28 04:46:16 localhost nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server self.force_reraise() Nov 28 04:46:16 localhost nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Nov 28 04:46:16 localhost nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server raise self.value Nov 28 04:46:16 localhost nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 203, in decorated_function Nov 28 04:46:16 localhost 
nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) Nov 28 04:46:16 localhost nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 6739, in get_instance_diagnostics Nov 28 04:46:16 localhost nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server raise exception.InstanceInvalidState( Nov 28 04:46:16 localhost nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server nova.exception.InstanceInvalidState: Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 in power state shutdown. Cannot get_diagnostics while the instance is in this state. Nov 28 04:46:16 localhost nova_compute[279673]: 2025-11-28 09:46:16.034 279685 ERROR oslo_messaging.rpc.server #033[00m Nov 28 04:46:16 localhost nova_compute[279673]: 2025-11-28 09:46:16.606 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52247 DF PROTO=TCP SPT=41230 DPT=9102 SEQ=2153373983 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB29E3820000000001030307) Nov 28 04:46:18 localhost openstack_network_exporter[240658]: ERROR 09:46:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 04:46:18 localhost openstack_network_exporter[240658]: ERROR 09:46:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:46:18 localhost openstack_network_exporter[240658]: ERROR 09:46:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:46:18 localhost 
openstack_network_exporter[240658]: ERROR 09:46:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 04:46:18 localhost openstack_network_exporter[240658]: Nov 28 04:46:18 localhost openstack_network_exporter[240658]: ERROR 09:46:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 04:46:18 localhost openstack_network_exporter[240658]: Nov 28 04:46:18 localhost nova_compute[279673]: 2025-11-28 09:46:18.189 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 04:46:20 localhost podman[280414]: 2025-11-28 09:46:20.850282336 +0000 UTC m=+0.086423149 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': 
'/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, release=1755695350, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public) Nov 28 04:46:20 localhost podman[280414]: 2025-11-28 09:46:20.864379618 +0000 UTC m=+0.100520431 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., config_id=edpm, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=ubi9-minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6) Nov 28 04:46:20 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 04:46:21 localhost nova_compute[279673]: 2025-11-28 09:46:21.644 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:23 localhost nova_compute[279673]: 2025-11-28 09:46:23.191 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:26 localhost nova_compute[279673]: 2025-11-28 09:46:26.645 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 04:46:26 localhost podman[280440]: 2025-11-28 09:46:26.852283856 +0000 UTC m=+0.090404781 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 28 04:46:26 localhost nova_compute[279673]: 2025-11-28 09:46:26.854 279685 DEBUG oslo_service.periodic_task [None 
req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:46:26 localhost nova_compute[279673]: 2025-11-28 09:46:26.855 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:46:26 localhost nova_compute[279673]: 2025-11-28 09:46:26.855 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 28 04:46:26 localhost nova_compute[279673]: 2025-11-28 09:46:26.856 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 28 04:46:26 localhost podman[280440]: 2025-11-28 09:46:26.86644893 +0000 UTC m=+0.104569895 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': 
'/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 28 04:46:26 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 04:46:27 localhost nova_compute[279673]: 2025-11-28 09:46:27.362 279685 DEBUG nova.virt.driver [-] Emitting event Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 28 04:46:27 localhost nova_compute[279673]: 2025-11-28 09:46:27.363 279685 INFO nova.compute.manager [-] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] VM Stopped (Lifecycle Event)#033[00m Nov 28 04:46:27 localhost nova_compute[279673]: 2025-11-28 09:46:27.388 279685 DEBUG nova.compute.manager [None req-c13de2fd-2c13-401e-8178-2bdaf5d565fa - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 28 04:46:27 localhost nova_compute[279673]: 2025-11-28 09:46:27.393 279685 DEBUG nova.compute.manager [None req-c13de2fd-2c13-401e-8178-2bdaf5d565fa - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: stopped, current task_state: None, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Nov 28 04:46:27 localhost nova_compute[279673]: 2025-11-28 09:46:27.724 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 04:46:27 localhost nova_compute[279673]: 2025-11-28 09:46:27.724 279685 DEBUG oslo_concurrency.lockutils 
[None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 04:46:27 localhost nova_compute[279673]: 2025-11-28 09:46:27.725 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 28 04:46:27 localhost nova_compute[279673]: 2025-11-28 09:46:27.725 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 04:46:28 localhost nova_compute[279673]: 2025-11-28 09:46:28.193 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:28 localhost nova_compute[279673]: 2025-11-28 09:46:28.244 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": 
"9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 04:46:28 localhost nova_compute[279673]: 2025-11-28 09:46:28.278 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 04:46:28 localhost nova_compute[279673]: 2025-11-28 09:46:28.278 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 28 04:46:28 localhost nova_compute[279673]: 2025-11-28 09:46:28.279 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:46:28 localhost nova_compute[279673]: 2025-11-28 09:46:28.280 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:46:28 localhost nova_compute[279673]: 2025-11-28 09:46:28.280 279685 DEBUG 
oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:46:28 localhost nova_compute[279673]: 2025-11-28 09:46:28.280 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:46:28 localhost nova_compute[279673]: 2025-11-28 09:46:28.281 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:46:28 localhost nova_compute[279673]: 2025-11-28 09:46:28.282 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:46:28 localhost nova_compute[279673]: 2025-11-28 09:46:28.282 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 28 04:46:28 localhost nova_compute[279673]: 2025-11-28 09:46:28.283 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:46:28 localhost nova_compute[279673]: 2025-11-28 09:46:28.301 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:46:28 localhost nova_compute[279673]: 2025-11-28 09:46:28.302 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:46:28 localhost nova_compute[279673]: 2025-11-28 09:46:28.302 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:46:28 localhost nova_compute[279673]: 2025-11-28 09:46:28.303 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 04:46:28 localhost nova_compute[279673]: 2025-11-28 
09:46:28.303 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:46:28 localhost nova_compute[279673]: 2025-11-28 09:46:28.811 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:46:28 localhost nova_compute[279673]: 2025-11-28 09:46:28.896 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:46:28 localhost nova_compute[279673]: 2025-11-28 09:46:28.897 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:46:29 localhost nova_compute[279673]: 2025-11-28 09:46:29.105 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 04:46:29 localhost nova_compute[279673]: 2025-11-28 09:46:29.108 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=12568MB free_disk=41.83693313598633GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": 
"7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 04:46:29 localhost nova_compute[279673]: 2025-11-28 09:46:29.108 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:46:29 localhost nova_compute[279673]: 2025-11-28 09:46:29.109 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:46:29 localhost nova_compute[279673]: 2025-11-28 09:46:29.202 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 04:46:29 localhost nova_compute[279673]: 2025-11-28 09:46:29.202 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 04:46:29 localhost nova_compute[279673]: 2025-11-28 09:46:29.203 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 04:46:29 localhost nova_compute[279673]: 2025-11-28 09:46:29.246 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:46:29 localhost nova_compute[279673]: 2025-11-28 09:46:29.763 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.517s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:46:29 localhost nova_compute[279673]: 2025-11-28 09:46:29.771 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 04:46:29 localhost nova_compute[279673]: 
2025-11-28 09:46:29.793 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 04:46:29 localhost nova_compute[279673]: 2025-11-28 09:46:29.818 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 04:46:29 localhost nova_compute[279673]: 2025-11-28 09:46:29.819 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.710s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:46:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:46:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. 
Nov 28 04:46:30 localhost podman[280586]: 2025-11-28 09:46:30.856244856 +0000 UTC m=+0.085370476 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 04:46:30 localhost systemd[1]: tmp-crun.yJrWyr.mount: Deactivated successfully. 
Nov 28 04:46:30 localhost podman[280587]: 2025-11-28 09:46:30.927230572 +0000 UTC m=+0.156671372 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Nov 28 04:46:30 localhost podman[280587]: 2025-11-28 09:46:30.962327527 +0000 UTC 
m=+0.191768317 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent) Nov 28 04:46:30 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 04:46:31 localhost podman[280586]: 2025-11-28 09:46:31.012639478 +0000 UTC m=+0.241765088 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 28 04:46:31 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 04:46:31 localhost nova_compute[279673]: 2025-11-28 09:46:31.648 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49395 DF PROTO=TCP SPT=45152 DPT=9102 SEQ=3869660939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2A1BE40000000001030307) Nov 28 04:46:33 localhost nova_compute[279673]: 2025-11-28 09:46:33.216 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49396 DF PROTO=TCP SPT=45152 DPT=9102 SEQ=3869660939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2A20020000000001030307) Nov 28 04:46:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. 
Nov 28 04:46:33 localhost podman[280630]: 2025-11-28 09:46:33.850912382 +0000 UTC m=+0.086450830 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Nov 28 04:46:33 localhost podman[280630]: 2025-11-28 09:46:33.888161894 +0000 UTC m=+0.123700312 container exec_died 
1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 28 04:46:33 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 04:46:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52248 DF PROTO=TCP SPT=41230 DPT=9102 SEQ=2153373983 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2A23820000000001030307) Nov 28 04:46:35 localhost nova_compute[279673]: 2025-11-28 09:46:35.165 279685 DEBUG nova.compute.manager [None req-071dfc59-4eda-4122-96c1-bdcf9d99ded5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 28 04:46:35 localhost nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server [None req-071dfc59-4eda-4122-96c1-bdcf9d99ded5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Exception during message handling: nova.exception.InstanceInvalidState: Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 in power state shutdown. Cannot get_diagnostics while the instance is in this state. 
Nov 28 04:46:35 localhost nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server Traceback (most recent call last): Nov 28 04:46:35 localhost nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming Nov 28 04:46:35 localhost nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) Nov 28 04:46:35 localhost nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch Nov 28 04:46:35 localhost nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) Nov 28 04:46:35 localhost nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch Nov 28 04:46:35 localhost nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) Nov 28 04:46:35 localhost nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped Nov 28 04:46:35 localhost nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server _emit_versioned_exception_notification( Nov 28 04:46:35 localhost nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Nov 28 04:46:35 localhost nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server self.force_reraise() Nov 28 04:46:35 localhost nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR 
oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Nov 28 04:46:35 localhost nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server raise self.value Nov 28 04:46:35 localhost nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped Nov 28 04:46:35 localhost nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) Nov 28 04:46:35 localhost nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 214, in decorated_function Nov 28 04:46:35 localhost nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server compute_utils.add_instance_fault_from_exc(context, Nov 28 04:46:35 localhost nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Nov 28 04:46:35 localhost nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server self.force_reraise() Nov 28 04:46:35 localhost nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Nov 28 04:46:35 localhost nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server raise self.value Nov 28 04:46:35 localhost nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 203, in decorated_function Nov 28 04:46:35 localhost nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) Nov 28 
04:46:35 localhost nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 6739, in get_instance_diagnostics Nov 28 04:46:35 localhost nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server raise exception.InstanceInvalidState( Nov 28 04:46:35 localhost nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server nova.exception.InstanceInvalidState: Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 in power state shutdown. Cannot get_diagnostics while the instance is in this state. Nov 28 04:46:35 localhost nova_compute[279673]: 2025-11-28 09:46:35.209 279685 ERROR oslo_messaging.rpc.server #033[00m Nov 28 04:46:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49397 DF PROTO=TCP SPT=45152 DPT=9102 SEQ=3869660939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2A28020000000001030307) Nov 28 04:46:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49122 DF PROTO=TCP SPT=47508 DPT=9102 SEQ=3958754893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2A2B820000000001030307) Nov 28 04:46:36 localhost nova_compute[279673]: 2025-11-28 09:46:36.650 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:38 localhost nova_compute[279673]: 2025-11-28 09:46:38.219 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49398 DF PROTO=TCP SPT=45152 DPT=9102 SEQ=3869660939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2A37C20000000001030307) Nov 28 04:46:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 04:46:39 localhost systemd[1]: tmp-crun.RsYKlQ.mount: Deactivated successfully. Nov 28 04:46:39 localhost podman[280649]: 2025-11-28 09:46:39.858802854 +0000 UTC m=+0.092142184 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 04:46:39 localhost podman[280649]: 
2025-11-28 09:46:39.870341927 +0000 UTC m=+0.103681247 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 04:46:39 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 04:46:40 localhost podman[238687]: time="2025-11-28T09:46:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 04:46:40 localhost podman[238687]: @ - - [28/Nov/2025:09:46:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146336 "" "Go-http-client/1.1" Nov 28 04:46:40 localhost podman[238687]: @ - - [28/Nov/2025:09:46:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16747 "" "Go-http-client/1.1" Nov 28 04:46:41 localhost nova_compute[279673]: 2025-11-28 09:46:41.680 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 04:46:41 localhost podman[280672]: 2025-11-28 09:46:41.840200353 +0000 UTC m=+0.079293320 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:46:41 localhost podman[280672]: 2025-11-28 09:46:41.853420569 +0000 UTC m=+0.092513536 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 04:46:41 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. Nov 28 04:46:41 localhost nova_compute[279673]: 2025-11-28 09:46:41.882 279685 DEBUG nova.objects.instance [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Lazy-loading 'flavor' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 04:46:41 localhost nova_compute[279673]: 2025-11-28 09:46:41.907 279685 DEBUG oslo_concurrency.lockutils [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 04:46:41 localhost nova_compute[279673]: 2025-11-28 09:46:41.908 279685 DEBUG oslo_concurrency.lockutils [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 04:46:41 localhost 
nova_compute[279673]: 2025-11-28 09:46:41.908 279685 DEBUG nova.network.neutron [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m Nov 28 04:46:41 localhost nova_compute[279673]: 2025-11-28 09:46:41.908 279685 DEBUG nova.objects.instance [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 04:46:42 localhost ovn_controller[152322]: 2025-11-28T09:46:42Z|00057|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.467 279685 DEBUG nova.network.neutron [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.492 279685 DEBUG oslo_concurrency.lockutils [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.531 279685 INFO nova.virt.libvirt.driver [-] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Instance destroyed successfully.#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.531 279685 DEBUG nova.objects.instance [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Lazy-loading 'numa_topology' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.548 279685 DEBUG nova.objects.instance [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Lazy-loading 'resources' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.562 279685 DEBUG 
nova.virt.libvirt.vif [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T08:32:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=Flavor(2),hidden=False,host='np0005538513.localdomain',hostname='test',id=2,image_ref='391767f1-35f2-4b68-ae15-e0b29db66dcb',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-11-28T08:33:06Z,launched_on='np0005538513.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=None,node='np0005538513.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=4,progress=0,project_id='9dda653c53224db086060962b0702694',ramdisk_id='',reservation_id='r-a3c307c0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='391767f1-35f2-4b68-ae15-e0b29db66dcb',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=,task_state='powering-on',terminated_at=None,trusted_certs=,update
d_at=2025-11-28T09:46:12Z,user_data=None,user_id='4d9169247d4447d0a8dd4c33f8b23dee',uuid=c2f0c7d6-df5f-4541-8b2c-bc1eaf805812,vcpu_model=,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.563 279685 DEBUG nova.network.os_vif_util [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Converting VIF {"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": 
{}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.564 279685 DEBUG nova.network.os_vif_util [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.565 279685 DEBUG os_vif [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51') unplug 
/usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.567 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.568 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09612b07-51, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.570 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.571 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.574 279685 INFO os_vif [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51')#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.577 279685 DEBUG nova.virt.libvirt.host [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m Nov 28 04:46:42 localhost 
nova_compute[279673]: 2025-11-28 09:46:42.578 279685 INFO nova.virt.libvirt.host [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] UEFI support detected#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.586 279685 DEBUG nova.virt.libvirt.driver [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Start _get_guest_xml network_info=[{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.eph0': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum=,container_format='bare',created_at=,direct_url=,disk_format='qcow2',id=391767f1-35f2-4b68-ae15-e0b29db66dcb,min_disk=1,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=,status=,tags=,updated_at=,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'device_type': 'disk', 'encrypted': False, 'image_id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}], 'ephemerals': [{'encryption_format': None, 'size': 1, 'device_name': '/dev/vdb', 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_secret_uuid': None, 'device_type': 'disk', 'encrypted': False}], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.591 279685 WARNING nova.virt.libvirt.driver [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.593 279685 DEBUG nova.virt.libvirt.host [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Searching host: 'np0005538513.localdomain' for CPU controller through CGroups V1... 
_has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.594 279685 DEBUG nova.virt.libvirt.host [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.595 279685 DEBUG nova.virt.libvirt.host [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Searching host: 'np0005538513.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.596 279685 DEBUG nova.virt.libvirt.host [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] CPU controller found on host. 
_has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.597 279685 DEBUG nova.virt.libvirt.driver [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.597 279685 DEBUG nova.virt.hardware [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T08:32:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=1,extra_specs={},flavorid='f3c44237-060e-4213-a926-aa7fdb4bf902',id=2,is_public=True,memory_mb=512,name='m1.small',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=,container_format='bare',created_at=,direct_url=,disk_format='qcow2',id=391767f1-35f2-4b68-ae15-e0b29db66dcb,min_disk=1,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=,status=,tags=,updated_at=,virtual_size=,visibility=), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.598 279685 DEBUG nova.virt.hardware [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.599 279685 DEBUG nova.virt.hardware [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 
4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.599 279685 DEBUG nova.virt.hardware [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.600 279685 DEBUG nova.virt.hardware [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.600 279685 DEBUG nova.virt.hardware [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.600 279685 DEBUG nova.virt.hardware [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.601 279685 DEBUG nova.virt.hardware [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 
9dda653c53224db086060962b0702694 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.601 279685 DEBUG nova.virt.hardware [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.601 279685 DEBUG nova.virt.hardware [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.602 279685 DEBUG nova.virt.hardware [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.602 279685 DEBUG nova.objects.instance [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Lazy-loading 'vcpu_model' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.629 279685 DEBUG nova.privsep.utils [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default 
default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m Nov 28 04:46:42 localhost nova_compute[279673]: 2025-11-28 09:46:42.630 279685 DEBUG oslo_concurrency.processutils [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.110 279685 DEBUG oslo_concurrency.processutils [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.113 279685 DEBUG oslo_concurrency.processutils [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.518 279685 DEBUG oslo_concurrency.processutils [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.406s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.521 279685 DEBUG nova.virt.libvirt.vif [None 
req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T08:32:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=Flavor(2),hidden=False,host='np0005538513.localdomain',hostname='test',id=2,image_ref='391767f1-35f2-4b68-ae15-e0b29db66dcb',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-11-28T08:33:06Z,launched_on='np0005538513.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=None,node='np0005538513.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=4,progress=0,project_id='9dda653c53224db086060962b0702694',ramdisk_id='',reservation_id='r-a3c307c0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='391767f1-35f2-4b68-ae15-e0b29db66dcb',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=,task_state='powering-on',terminated_at=None,trusted_certs=,updated_at=2025-11-28T09:46:12Z,us
er_data=None,user_id='4d9169247d4447d0a8dd4c33f8b23dee',uuid=c2f0c7d6-df5f-4541-8b2c-bc1eaf805812,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.522 279685 DEBUG nova.network.os_vif_util [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Converting VIF {"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.523 279685 DEBUG nova.network.os_vif_util [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.525 279685 DEBUG nova.objects.instance [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Lazy-loading 'pci_devices' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.546 279685 DEBUG nova.virt.libvirt.driver [None 
req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] End _get_guest_xml xml= Nov 28 04:46:43 localhost nova_compute[279673]: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 Nov 28 04:46:43 localhost nova_compute[279673]: instance-00000002 Nov 28 04:46:43 localhost nova_compute[279673]: 524288 Nov 28 04:46:43 localhost nova_compute[279673]: 1 Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: test Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:42 Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: 512 Nov 28 04:46:43 localhost nova_compute[279673]: 1 Nov 28 04:46:43 localhost nova_compute[279673]: 0 Nov 28 04:46:43 localhost nova_compute[279673]: 1 Nov 28 04:46:43 localhost nova_compute[279673]: 1 Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: admin Nov 28 04:46:43 localhost nova_compute[279673]: admin Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: RDO Nov 28 04:46:43 localhost nova_compute[279673]: OpenStack Compute Nov 28 04:46:43 localhost nova_compute[279673]: 27.5.2-0.20250829104910.6f8decf.el9 Nov 28 04:46:43 localhost 
nova_compute[279673]: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 Nov 28 04:46:43 localhost nova_compute[279673]: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 Nov 28 04:46:43 localhost nova_compute[279673]: Virtual Machine Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: hvm Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 
04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: /dev/urandom Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost 
nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: Nov 28 04:46:43 localhost nova_compute[279673]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.549 279685 DEBUG nova.virt.libvirt.driver [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.550 279685 DEBUG nova.virt.libvirt.driver [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.551 279685 DEBUG nova.virt.libvirt.vif [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T08:32:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=Flavor(2),hidden=False,host='np0005538513.localdomain',hostname='test',id=2,image_ref='391767f1-35f2-4b68-ae15-e0b29db66dcb',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-11-28T08:33:06Z,launched_on='np0005538513.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=None,node='np0005538513.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=,power_state=4,progress=0,project_id='9dda653c53224db086060962b0702694',ramdisk_id='',reservation_id='r-a3c307c0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='391767f1-35f2-4b68-ae15-e0b29db66dcb',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=,task_state='powering-on',terminated_at=None,trusted_certs=,updated_at=2025-11-28T09:46:12Z,user_data=None,user_id='4d9169247d4447d0a8dd4c33f8b23dee',uuid=c2f0c7d6-df5f-4541-8b2c-bc1eaf805812,vcpu_model=VirtCPUModel,vcpus=
1,vm_mode=None,vm_state='stopped') vif={"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.552 279685 DEBUG nova.network.os_vif_util [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Converting VIF {"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.553 279685 DEBUG nova.network.os_vif_util [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.553 279685 DEBUG os_vif [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.554 279685 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.555 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.556 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.560 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.560 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09612b07-51, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.561 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap09612b07-51, col_values=(('external_ids', {'iface-id': '09612b07-5142-4b0f-9dab-74bf4403f69f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f4:fc:6c', 'vm-uuid': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.563 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 
04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.567 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.571 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.573 279685 INFO os_vif [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:fc:6c,bridge_name='br-int',has_traffic_filtering=True,id=09612b07-5142-4b0f-9dab-74bf4403f69f,network=Network(40d5da59-6201-424a-8380-80ecc3d67c7e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09612b07-51')#033[00m Nov 28 04:46:43 localhost systemd[1]: Started libvirt secret daemon. Nov 28 04:46:43 localhost kernel: device tap09612b07-51 entered promiscuous mode Nov 28 04:46:43 localhost NetworkManager[5967]: [1764323203.6912] manager: (tap09612b07-51): new Tun device (/org/freedesktop/NetworkManager/Devices/15) Nov 28 04:46:43 localhost ovn_controller[152322]: 2025-11-28T09:46:43Z|00058|binding|INFO|Claiming lport 09612b07-5142-4b0f-9dab-74bf4403f69f for this chassis. 
Nov 28 04:46:43 localhost ovn_controller[152322]: 2025-11-28T09:46:43Z|00059|binding|INFO|09612b07-5142-4b0f-9dab-74bf4403f69f: Claiming fa:16:3e:f4:fc:6c 192.168.0.142 Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.691 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.698 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:43 localhost systemd-udevd[280765]: Network interface NamePolicy= disabled on kernel command line. Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.701 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:43 localhost NetworkManager[5967]: [1764323203.7160] device (tap09612b07-51): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Nov 28 04:46:43 localhost NetworkManager[5967]: [1764323203.7172] device (tap09612b07-51): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external') Nov 28 04:46:43 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:43.717 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:fc:6c 192.168.0.142'], port_security=['fa:16:3e:f4:fc:6c 192.168.0.142'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.142/24', 'neutron:device_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'neutron:device_owner': 'compute:nova', 
'neutron:mtu': '', 'neutron:network_name': 'neutron-40d5da59-6201-424a-8380-80ecc3d67c7e', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '9dda653c53224db086060962b0702694', 'neutron:revision_number': '8', 'neutron:security_group_ids': '6d2c5a31-c9e5-413a-bccf-f97c7687bd94 b3c60f08-3369-426b-b744-9cef04caaa7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f3122580-f73f-40fa-a838-6bad2ff9da2f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=09612b07-5142-4b0f-9dab-74bf4403f69f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 04:46:43 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:43.720 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 09612b07-5142-4b0f-9dab-74bf4403f69f in datapath 40d5da59-6201-424a-8380-80ecc3d67c7e bound to our chassis#033[00m Nov 28 04:46:43 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:43.722 158130 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 40d5da59-6201-424a-8380-80ecc3d67c7e#033[00m Nov 28 04:46:43 localhost ovn_controller[152322]: 2025-11-28T09:46:43Z|00060|ovn_bfd|INFO|Enabled BFD on interface ovn-07900d-0 Nov 28 04:46:43 localhost ovn_controller[152322]: 2025-11-28T09:46:43Z|00061|ovn_bfd|INFO|Enabled BFD on interface ovn-c3237d-0 Nov 28 04:46:43 localhost ovn_controller[152322]: 2025-11-28T09:46:43Z|00062|ovn_bfd|INFO|Enabled BFD on interface ovn-11aa47-0 Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.731 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:43 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:43.733 158233 DEBUG 
oslo.privsep.daemon [-] privsep: reply[ef84b72a-f9ed-4115-b95c-d071518ce55b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:46:43 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:43.733 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap40d5da59-61 in ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m Nov 28 04:46:43 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:43.735 158233 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap40d5da59-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m Nov 28 04:46:43 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:43.736 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[53e62a2d-e84b-44e6-bed7-dcefa7cbafb5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:46:43 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:43.737 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[c5d0c1a9-b0f6-447d-a13b-2ce0dff37065]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.750 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:43 localhost systemd-machined[83422]: New machine qemu-2-instance-00000002. 
Nov 28 04:46:43 localhost ovn_controller[152322]: 2025-11-28T09:46:43Z|00063|binding|INFO|Setting lport 09612b07-5142-4b0f-9dab-74bf4403f69f up in Southbound Nov 28 04:46:43 localhost ovn_controller[152322]: 2025-11-28T09:46:43Z|00064|binding|INFO|Setting lport 09612b07-5142-4b0f-9dab-74bf4403f69f ovn-installed in OVS Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.762 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:43 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:43.762 158264 DEBUG oslo.privsep.daemon [-] privsep: reply[047b5321-ea3d-4ef2-9fb0-2dad45cd3636]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:46:43 localhost systemd[1]: Started Virtual Machine qemu-2-instance-00000002. Nov 28 04:46:43 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:43.777 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[6b9de73e-6104-463b-8dff-ae0bda8fcf0c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.809 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:43 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:43.817 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[bd9b13d9-5198-4ab7-8f36-f83fe2ae8cf7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:46:43 localhost systemd-udevd[280767]: Network interface NamePolicy= disabled on kernel command line. 
Nov 28 04:46:43 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:43.824 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[bccbb675-c9f7-4aa2-862b-73318ccb8894]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:46:43 localhost NetworkManager[5967]: [1764323203.8261] manager: (tap40d5da59-60): new Veth device (/org/freedesktop/NetworkManager/Devices/16) Nov 28 04:46:43 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:43.859 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[01b8fbfb-6a0d-48dd-bce5-3f959c135eb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:46:43 localhost nova_compute[279673]: 2025-11-28 09:46:43.861 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:43 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:43.862 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[6277a64c-c063-4230-b8ff-384d81f75d60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:46:43 localhost NetworkManager[5967]: [1764323203.8848] device (tap40d5da59-60): carrier: link connected Nov 28 04:46:43 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap40d5da59-61: link becomes ready Nov 28 04:46:43 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap40d5da59-60: link becomes ready Nov 28 04:46:43 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:43.889 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[e2e09e5d-3270-4ff7-89ca-eb00f68816b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:46:43 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:43.903 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[d2dc5514-10e3-4a8d-97d2-f63b3bbed7a3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 
'tap40d5da59-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:28:4d:05'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 
0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1103798, 'reachable_time': 31313, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 
'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 280804, 'error': None, 'target': 'ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:46:43 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:43.914 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[369b69d3-031b-4f39-b076-a34e68ed6559]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe28:4d05'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1103798, 'tstamp': 1103798}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 280813, 'error': None, 'target': 'ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:46:43 localhost snmpd[66832]: IfIndex of an interface changed. Such interfaces will appear multiple times in IF-MIB. 
Nov 28 04:46:43 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:43.933 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[b71b4f94-f1ab-4f3e-99e0-91629c354f09]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap40d5da59-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:28:4d:05'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': 
[['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1103798, 'reachable_time': 31313, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 
0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 280822, 'error': None, 'target': 'ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:46:43 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:43.958 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[73c3e1d8-331b-4b95-b611-ab1814750803]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:44.009 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[4e1026cb-9a3e-45a4-861b-c5d44395f604]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:44.011 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap40d5da59-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:44.011 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:44.011 158130 
DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap40d5da59-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 04:46:44 localhost nova_compute[279673]: 2025-11-28 09:46:44.064 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:44 localhost kernel: device tap40d5da59-60 entered promiscuous mode Nov 28 04:46:44 localhost nova_compute[279673]: 2025-11-28 09:46:44.065 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:44.070 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap40d5da59-60, col_values=(('external_ids', {'iface-id': '3ff57c88-06c6-4894-984a-80ce116d1456'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 04:46:44 localhost nova_compute[279673]: 2025-11-28 09:46:44.071 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:44 localhost ovn_controller[152322]: 2025-11-28T09:46:44Z|00065|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 04:46:44 localhost nova_compute[279673]: 2025-11-28 09:46:44.072 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:44.074 158130 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/40d5da59-6201-424a-8380-80ecc3d67c7e.pid.haproxy; Error: [Errno 
2] No such file or directory: '/var/lib/neutron/external/pids/40d5da59-6201-424a-8380-80ecc3d67c7e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:44.076 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[eb616e3d-b1fb-4d16-8059-556f4421a8ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:46:44 localhost nova_compute[279673]: 2025-11-28 09:46:44.077 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:44.078 158130 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: global Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: log /dev/log local0 debug Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: log-tag haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: user root Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: group root Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: maxconn 1024 Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: pidfile /var/lib/neutron/external/pids/40d5da59-6201-424a-8380-80ecc3d67c7e.pid.haproxy Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: daemon Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: defaults Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: log global Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: mode http Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: option httplog Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: option dontlognull Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: option http-server-close Nov 28 04:46:44 localhost 
ovn_metadata_agent[158125]: option forwardfor Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: retries 3 Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: timeout http-request 30s Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: timeout connect 30s Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: timeout client 32s Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: timeout server 32s Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: timeout http-keep-alive 30s Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: listen listener Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: bind 169.254.169.254:80 Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: server metadata /var/lib/neutron/metadata_proxy Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: http-request add-header X-OVN-Network-ID 40d5da59-6201-424a-8380-80ecc3d67c7e Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m Nov 28 04:46:44 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:44.079 158130 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e', 'env', 'PROCESS_TAG=haproxy-40d5da59-6201-424a-8380-80ecc3d67c7e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/40d5da59-6201-424a-8380-80ecc3d67c7e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m Nov 28 04:46:44 localhost nova_compute[279673]: 2025-11-28 09:46:44.145 279685 DEBUG nova.virt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Emitting event Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 28 04:46:44 localhost nova_compute[279673]: 2025-11-28 09:46:44.146 279685 
INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] VM Resumed (Lifecycle Event)#033[00m Nov 28 04:46:44 localhost nova_compute[279673]: 2025-11-28 09:46:44.150 279685 DEBUG nova.compute.manager [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Instance event wait completed in 0 seconds for wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m Nov 28 04:46:44 localhost nova_compute[279673]: 2025-11-28 09:46:44.165 279685 INFO nova.virt.libvirt.driver [-] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Instance rebooted successfully.#033[00m Nov 28 04:46:44 localhost nova_compute[279673]: 2025-11-28 09:46:44.165 279685 DEBUG nova.compute.manager [None req-59b8762a-7364-4d2a-b46e-f7f5268249b5 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 28 04:46:44 localhost nova_compute[279673]: 2025-11-28 09:46:44.170 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 28 04:46:44 localhost nova_compute[279673]: 2025-11-28 09:46:44.173 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Nov 28 
04:46:44 localhost nova_compute[279673]: 2025-11-28 09:46:44.201 279685 INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] During sync_power_state the instance has a pending task (powering-on). Skip.#033[00m Nov 28 04:46:44 localhost nova_compute[279673]: 2025-11-28 09:46:44.201 279685 DEBUG nova.virt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Emitting event Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 28 04:46:44 localhost nova_compute[279673]: 2025-11-28 09:46:44.201 279685 INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] VM Started (Lifecycle Event)#033[00m Nov 28 04:46:44 localhost nova_compute[279673]: 2025-11-28 09:46:44.218 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 28 04:46:44 localhost nova_compute[279673]: 2025-11-28 09:46:44.221 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Nov 28 04:46:44 localhost podman[280882]: Nov 28 04:46:44 localhost podman[280882]: 2025-11-28 09:46:44.494893253 +0000 UTC m=+0.078929519 container create 5c16471a00387105cdcd384a69ea9a94b342e8ed65f03b71a8324776ec10e264 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Nov 28 04:46:44 localhost podman[280882]: 2025-11-28 09:46:44.446603983 +0000 UTC m=+0.030640259 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Nov 28 04:46:44 localhost systemd[1]: Started libpod-conmon-5c16471a00387105cdcd384a69ea9a94b342e8ed65f03b71a8324776ec10e264.scope. Nov 28 04:46:44 localhost systemd[1]: Started libcrun container. Nov 28 04:46:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9eda2c7888720a1ed258ad38c3fe5daf04876572a87ed50f81d219a10f47fcb7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 04:46:44 localhost podman[280882]: 2025-11-28 09:46:44.611720183 +0000 UTC m=+0.195756439 container init 5c16471a00387105cdcd384a69ea9a94b342e8ed65f03b71a8324776ec10e264 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true) Nov 28 04:46:44 localhost podman[280882]: 2025-11-28 09:46:44.624647908 +0000 UTC m=+0.208684174 container start 5c16471a00387105cdcd384a69ea9a94b342e8ed65f03b71a8324776ec10e264 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e, 
org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:46:44 localhost neutron-haproxy-ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e[280896]: [NOTICE] (280900) : New worker (280902) forked Nov 28 04:46:44 localhost neutron-haproxy-ovnmeta-40d5da59-6201-424a-8380-80ecc3d67c7e[280896]: [NOTICE] (280900) : Loading success. Nov 28 04:46:44 localhost ovn_controller[152322]: 2025-11-28T09:46:44Z|00066|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 04:46:44 localhost nova_compute[279673]: 2025-11-28 09:46:44.642 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:44 localhost nova_compute[279673]: 2025-11-28 09:46:44.801 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:44 localhost ovn_controller[152322]: 2025-11-28T09:46:44Z|00067|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 04:46:44 localhost ovn_controller[152322]: 2025-11-28T09:46:44Z|00068|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 04:46:44 localhost nova_compute[279673]: 2025-11-28 09:46:44.809 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:45 localhost nova_compute[279673]: 2025-11-28 09:46:45.591 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:45 
localhost ovn_controller[152322]: 2025-11-28T09:46:45Z|00069|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 04:46:46 localhost nova_compute[279673]: 2025-11-28 09:46:46.100 279685 DEBUG nova.compute.manager [req-dbfec011-2b8c-49d1-ad31-0318bd596b52 req-10082b87-47c2-4107-8bed-92de6ec7c13c 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Received event network-vif-plugged-09612b07-5142-4b0f-9dab-74bf4403f69f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Nov 28 04:46:46 localhost nova_compute[279673]: 2025-11-28 09:46:46.101 279685 DEBUG oslo_concurrency.lockutils [req-dbfec011-2b8c-49d1-ad31-0318bd596b52 req-10082b87-47c2-4107-8bed-92de6ec7c13c 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:46:46 localhost nova_compute[279673]: 2025-11-28 09:46:46.101 279685 DEBUG oslo_concurrency.lockutils [req-dbfec011-2b8c-49d1-ad31-0318bd596b52 req-10082b87-47c2-4107-8bed-92de6ec7c13c 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:46:46 localhost nova_compute[279673]: 2025-11-28 09:46:46.102 279685 DEBUG oslo_concurrency.lockutils [req-dbfec011-2b8c-49d1-ad31-0318bd596b52 req-10082b87-47c2-4107-8bed-92de6ec7c13c 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-events" 
"released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:46:46 localhost nova_compute[279673]: 2025-11-28 09:46:46.102 279685 DEBUG nova.compute.manager [req-dbfec011-2b8c-49d1-ad31-0318bd596b52 req-10082b87-47c2-4107-8bed-92de6ec7c13c 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] No waiting events found dispatching network-vif-plugged-09612b07-5142-4b0f-9dab-74bf4403f69f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Nov 28 04:46:46 localhost nova_compute[279673]: 2025-11-28 09:46:46.103 279685 WARNING nova.compute.manager [req-dbfec011-2b8c-49d1-ad31-0318bd596b52 req-10082b87-47c2-4107-8bed-92de6ec7c13c 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Received unexpected event network-vif-plugged-09612b07-5142-4b0f-9dab-74bf4403f69f for instance with vm_state active and task_state None.#033[00m Nov 28 04:46:46 localhost nova_compute[279673]: 2025-11-28 09:46:46.685 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49399 DF PROTO=TCP SPT=45152 DPT=9102 SEQ=3869660939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2A57830000000001030307) Nov 28 04:46:48 localhost openstack_network_exporter[240658]: ERROR 09:46:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:46:48 localhost openstack_network_exporter[240658]: ERROR 09:46:48 appctl.go:144: Failed to get PID for 
ovn-northd: no control socket files found for ovn-northd Nov 28 04:46:48 localhost openstack_network_exporter[240658]: ERROR 09:46:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 04:46:48 localhost openstack_network_exporter[240658]: ERROR 09:46:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 04:46:48 localhost openstack_network_exporter[240658]: Nov 28 04:46:48 localhost openstack_network_exporter[240658]: ERROR 09:46:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 04:46:48 localhost openstack_network_exporter[240658]: Nov 28 04:46:48 localhost nova_compute[279673]: 2025-11-28 09:46:48.150 279685 DEBUG nova.compute.manager [req-c9aba51c-3a10-4aa1-864c-602dde55d488 req-0fad0dff-8a4a-4379-867e-e4466bba83bd 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Received event network-vif-plugged-09612b07-5142-4b0f-9dab-74bf4403f69f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Nov 28 04:46:48 localhost nova_compute[279673]: 2025-11-28 09:46:48.151 279685 DEBUG oslo_concurrency.lockutils [req-c9aba51c-3a10-4aa1-864c-602dde55d488 req-0fad0dff-8a4a-4379-867e-e4466bba83bd 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:46:48 localhost nova_compute[279673]: 2025-11-28 09:46:48.152 279685 DEBUG oslo_concurrency.lockutils [req-c9aba51c-3a10-4aa1-864c-602dde55d488 req-0fad0dff-8a4a-4379-867e-e4466bba83bd 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock 
"c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:46:48 localhost nova_compute[279673]: 2025-11-28 09:46:48.152 279685 DEBUG oslo_concurrency.lockutils [req-c9aba51c-3a10-4aa1-864c-602dde55d488 req-0fad0dff-8a4a-4379-867e-e4466bba83bd 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:46:48 localhost nova_compute[279673]: 2025-11-28 09:46:48.153 279685 DEBUG nova.compute.manager [req-c9aba51c-3a10-4aa1-864c-602dde55d488 req-0fad0dff-8a4a-4379-867e-e4466bba83bd 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] No waiting events found dispatching network-vif-plugged-09612b07-5142-4b0f-9dab-74bf4403f69f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Nov 28 04:46:48 localhost nova_compute[279673]: 2025-11-28 09:46:48.153 279685 WARNING nova.compute.manager [req-c9aba51c-3a10-4aa1-864c-602dde55d488 req-0fad0dff-8a4a-4379-867e-e4466bba83bd 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Received unexpected event network-vif-plugged-09612b07-5142-4b0f-9dab-74bf4403f69f for instance with vm_state active and task_state None.#033[00m Nov 28 04:46:48 localhost nova_compute[279673]: 2025-11-28 09:46:48.564 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:50 localhost ovn_metadata_agent[158125]: 2025-11-28 
09:46:50.825 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:46:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:50.826 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:46:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:46:50.827 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:46:51 localhost nova_compute[279673]: 2025-11-28 09:46:51.728 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. 
Nov 28 04:46:51 localhost podman[280911]: 2025-11-28 09:46:51.864862847 +0000 UTC m=+0.092807434 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-type=git, managed_by=edpm_ansible, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_id=edpm, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=, version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Nov 28 04:46:51 localhost podman[280911]: 2025-11-28 09:46:51.882355373 +0000 UTC m=+0.110299940 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, vcs-type=git, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6) Nov 28 04:46:51 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. Nov 28 04:46:53 localhost nova_compute[279673]: 2025-11-28 09:46:53.568 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:56 localhost nova_compute[279673]: 2025-11-28 09:46:56.732 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:46:57 localhost ovn_controller[152322]: 2025-11-28T09:46:57Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f4:fc:6c 192.168.0.142 Nov 28 04:46:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 04:46:57 localhost systemd[1]: tmp-crun.MqFHkw.mount: Deactivated successfully. 
Nov 28 04:46:57 localhost podman[280930]: 2025-11-28 09:46:57.848881157 +0000 UTC m=+0.084535281 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 04:46:57 localhost podman[280930]: 2025-11-28 09:46:57.862786083 +0000 UTC m=+0.098440207 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 04:46:57 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 04:46:58 localhost snmpd[66832]: empty variable list in _query Nov 28 04:46:58 localhost snmpd[66832]: empty variable list in _query Nov 28 04:46:58 localhost nova_compute[279673]: 2025-11-28 09:46:58.571 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:47:01 localhost nova_compute[279673]: 2025-11-28 09:47:01.776 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:47:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:47:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. 
Nov 28 04:47:01 localhost podman[280954]: 2025-11-28 09:47:01.896051291 +0000 UTC m=+0.089940937 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Nov 28 04:47:01 localhost podman[280954]: 2025-11-28 09:47:01.90480463 +0000 UTC 
m=+0.098694246 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 04:47:01 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 04:47:02 localhost systemd[1]: tmp-crun.GGpsHG.mount: Deactivated successfully. Nov 28 04:47:02 localhost podman[280953]: 2025-11-28 09:47:02.013807559 +0000 UTC m=+0.210609084 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125) Nov 28 04:47:02 localhost podman[280953]: 2025-11-28 09:47:02.078946846 +0000 UTC m=+0.275748351 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, 
org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Nov 28 04:47:02 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 04:47:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45814 DF PROTO=TCP SPT=38778 DPT=9102 SEQ=3981783863 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2A91150000000001030307) Nov 28 04:47:02 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:02.417 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 28 04:47:02 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:02.419 158228 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0#015 Nov 28 04:47:02 localhost ovn_metadata_agent[158125]: Accept: */*#015 Nov 28 04:47:02 localhost ovn_metadata_agent[158125]: Connection: close#015 Nov 28 04:47:02 localhost ovn_metadata_agent[158125]: Content-Type: text/plain#015 Nov 28 04:47:02 localhost ovn_metadata_agent[158125]: Host: 169.254.169.254#015 Nov 28 04:47:02 localhost ovn_metadata_agent[158125]: User-Agent: curl/7.84.0#015 Nov 28 04:47:02 localhost ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142#015 Nov 28 04:47:02 localhost ovn_metadata_agent[158125]: X-Ovn-Network-Id: 40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 28 04:47:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45815 DF PROTO=TCP SPT=38778 DPT=9102 SEQ=3981783863 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2A95020000000001030307) Nov 28 04:47:03 localhost nova_compute[279673]: 2025-11-28 09:47:03.610 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:47:03 localhost 
ovn_metadata_agent[158125]: 2025-11-28 09:47:03.770 158228 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:03.771 158228 INFO eventlet.wsgi.server [-] 192.168.0.142, "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200 len: 146 time: 1.3527038#033[00m Nov 28 04:47:03 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 192.168.0.142:45666 [28/Nov/2025:09:47:02.416] listener listener/metadata 0/0/0/1355/1355 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:03.790 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:03.791 158228 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: Accept: */*#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: Connection: close#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: Content-Type: text/plain#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: Host: 169.254.169.254#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: User-Agent: curl/7.84.0#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: X-Ovn-Network-Id: 40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 28 04:47:03 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 192.168.0.142:45676 [28/Nov/2025:09:47:03.789] listener listener/metadata 0/0/0/28/28 404 281 - - ---- 1/1/0/0/0 0/0 "GET 
/2009-04-04/meta-data/public-keys HTTP/1.1" Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:03.818 158228 INFO eventlet.wsgi.server [-] 192.168.0.142, "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 404 len: 297 time: 0.0270157#033[00m Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:03.835 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:03.836 158228 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: Accept: */*#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: Connection: close#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: Content-Type: text/plain#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: Host: 169.254.169.254#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: User-Agent: curl/7.84.0#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: X-Ovn-Network-Id: 40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:03.852 158228 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:03.853 158228 INFO eventlet.wsgi.server [-] 192.168.0.142, "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200 len: 146 time: 0.0169477#033[00m Nov 28 04:47:03 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 192.168.0.142:45690 [28/Nov/2025:09:47:03.835] listener listener/metadata 0/0/0/18/18 200 130 - - ---- 
1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:03.860 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:03.861 158228 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: Accept: */*#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: Connection: close#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: Content-Type: text/plain#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: Host: 169.254.169.254#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: User-Agent: curl/7.84.0#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: X-Ovn-Network-Id: 40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:03.875 158228 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 28 04:47:03 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 192.168.0.142:45694 [28/Nov/2025:09:47:03.860] listener listener/metadata 0/0/0/15/15 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:03.876 158228 INFO eventlet.wsgi.server [-] 192.168.0.142, "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200 len: 136 time: 0.0145340#033[00m Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:03.883 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted 
'' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:03.884 158228 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: Accept: */*#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: Connection: close#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: Content-Type: text/plain#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: Host: 169.254.169.254#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: User-Agent: curl/7.84.0#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: X-Ovn-Network-Id: 40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:03.897 158228 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 28 04:47:03 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 192.168.0.142:45710 [28/Nov/2025:09:47:03.883] listener listener/metadata 0/0/0/14/14 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1" Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:03.898 158228 INFO eventlet.wsgi.server [-] 192.168.0.142, "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200 len: 143 time: 0.0136721#033[00m Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:03.905 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:03.906 158228 DEBUG neutron.agent.ovn.metadata.server [-] 
Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: Accept: */*#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: Connection: close#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: Content-Type: text/plain#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: Host: 169.254.169.254#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: User-Agent: curl/7.84.0#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: X-Ovn-Network-Id: 40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:03.923 158228 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 28 04:47:03 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 192.168.0.142:45724 [28/Nov/2025:09:47:03.905] listener listener/metadata 0/0/0/18/18 200 133 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:03.923 158228 INFO eventlet.wsgi.server [-] 192.168.0.142, "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200 len: 149 time: 0.0174773#033[00m Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:03.931 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:03.932 158228 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: Accept: */*#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: Connection: close#015 Nov 28 
04:47:03 localhost ovn_metadata_agent[158125]: Content-Type: text/plain#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: Host: 169.254.169.254#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: User-Agent: curl/7.84.0#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: X-Ovn-Network-Id: 40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:03.946 158228 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 28 04:47:03 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 192.168.0.142:45730 [28/Nov/2025:09:47:03.931] listener listener/metadata 0/0/0/15/15 200 134 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:03.946 158228 INFO eventlet.wsgi.server [-] 192.168.0.142, "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200 len: 150 time: 0.0141671#033[00m Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:03.953 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:03.954 158228 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: Accept: */*#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: Connection: close#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: Content-Type: text/plain#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: Host: 169.254.169.254#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: 
User-Agent: curl/7.84.0#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: X-Ovn-Network-Id: 40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:03.974 158228 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 28 04:47:03 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 192.168.0.142:45732 [28/Nov/2025:09:47:03.953] listener listener/metadata 0/0/0/21/21 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1" Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:03.974 158228 INFO eventlet.wsgi.server [-] 192.168.0.142, "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200 len: 139 time: 0.0200076#033[00m Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:03.982 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:03.983 158228 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: Accept: */*#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: Connection: close#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: Content-Type: text/plain#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: Host: 169.254.169.254#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: User-Agent: curl/7.84.0#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142#015 Nov 28 04:47:03 localhost ovn_metadata_agent[158125]: X-Ovn-Network-Id: 
40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:04.003 158228 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 28 04:47:04 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 192.168.0.142:45748 [28/Nov/2025:09:47:03.981] listener listener/metadata 0/0/0/22/22 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:04.004 158228 INFO eventlet.wsgi.server [-] 192.168.0.142, "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200 len: 139 time: 0.0208871#033[00m Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:04.011 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:04.012 158228 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: Accept: */*#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: Connection: close#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: Content-Type: text/plain#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: Host: 169.254.169.254#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: User-Agent: curl/7.84.0#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: X-Ovn-Network-Id: 40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 28 04:47:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c 
MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49400 DF PROTO=TCP SPT=45152 DPT=9102 SEQ=3869660939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2A97830000000001030307) Nov 28 04:47:04 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 192.168.0.142:45758 [28/Nov/2025:09:47:04.011] listener listener/metadata 0/0/0/20/20 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1" Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:04.031 158228 INFO eventlet.wsgi.server [-] 192.168.0.142, "GET /2009-04-04/user-data HTTP/1.1" status: 404 len: 297 time: 0.0188842#033[00m Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:04.046 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:04.047 158228 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: Accept: */*#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: Connection: close#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: Content-Type: text/plain#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: Host: 169.254.169.254#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: User-Agent: curl/7.84.0#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: X-Ovn-Network-Id: 40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:04.060 158228 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 28 04:47:04 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 192.168.0.142:45774 [28/Nov/2025:09:47:04.045] listener listener/metadata 0/0/0/15/15 200 139 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:04.061 158228 INFO eventlet.wsgi.server [-] 192.168.0.142, "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200 len: 155 time: 0.0140557#033[00m Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:04.066 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:04.067 158228 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: Accept: */*#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: Connection: close#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: Content-Type: text/plain#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: Host: 169.254.169.254#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: User-Agent: curl/7.84.0#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: X-Ovn-Network-Id: 40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:04.082 158228 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 28 04:47:04 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 
192.168.0.142:45784 [28/Nov/2025:09:47:04.066] listener listener/metadata 0/0/0/16/16 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:04.082 158228 INFO eventlet.wsgi.server [-] 192.168.0.142, "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200 len: 138 time: 0.0148098#033[00m Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:04.088 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:04.089 158228 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.0#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: Accept: */*#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: Connection: close#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: Content-Type: text/plain#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: Host: 169.254.169.254#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: User-Agent: curl/7.84.0#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: X-Ovn-Network-Id: 40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:04.102 158228 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 28 04:47:04 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 192.168.0.142:45788 [28/Nov/2025:09:47:04.087] listener listener/metadata 0/0/0/15/15 200 127 - - ---- 1/1/0/0/0 0/0 "GET 
/2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1" Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:04.103 158228 INFO eventlet.wsgi.server [-] 192.168.0.142, "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1" status: 200 len: 143 time: 0.0142629#033[00m Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:04.109 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:04.109 158228 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: Accept: */*#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: Connection: close#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: Content-Type: text/plain#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: Host: 169.254.169.254#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: User-Agent: curl/7.84.0#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: X-Ovn-Network-Id: 40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:04.125 158228 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 28 04:47:04 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 192.168.0.142:45796 [28/Nov/2025:09:47:04.108] listener listener/metadata 0/0/0/17/17 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:04.126 158228 INFO 
eventlet.wsgi.server [-] 192.168.0.142, "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200 len: 143 time: 0.0163682#033[00m Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:04.133 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:04.133 158228 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: Accept: */*#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: Connection: close#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: Content-Type: text/plain#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: Host: 169.254.169.254#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: User-Agent: curl/7.84.0#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: X-Ovn-Network-Id: 40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:04.146 158228 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 28 04:47:04 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 192.168.0.142:45806 [28/Nov/2025:09:47:04.132] listener listener/metadata 0/0/0/15/15 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:04.147 158228 INFO eventlet.wsgi.server [-] 192.168.0.142, "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200 len: 139 time: 0.0137632#033[00m Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: 
2025-11-28 09:47:04.154 158228 DEBUG eventlet.wsgi.server [-] (158228) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:04.154 158228 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: Accept: */*#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: Connection: close#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: Content-Type: text/plain#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: Host: 169.254.169.254#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: User-Agent: curl/7.84.0#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: X-Forwarded-For: 192.168.0.142#015 Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: X-Ovn-Network-Id: 40d5da59-6201-424a-8380-80ecc3d67c7e __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:04.166 158228 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Nov 28 04:47:04 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:04.166 158228 INFO eventlet.wsgi.server [-] 192.168.0.142, "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200 len: 139 time: 0.0121791#033[00m Nov 28 04:47:04 localhost haproxy-metadata-proxy-40d5da59-6201-424a-8380-80ecc3d67c7e[280902]: 192.168.0.142:45812 [28/Nov/2025:09:47:04.153] listener listener/metadata 0/0/0/13/13 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" Nov 28 04:47:04 localhost nova_compute[279673]: 2025-11-28 09:47:04.494 279685 DEBUG nova.compute.manager [None req-85143bf4-3818-42c0-80fe-88de490e3309 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 
- - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 28 04:47:04 localhost nova_compute[279673]: 2025-11-28 09:47:04.500 279685 INFO nova.compute.manager [None req-85143bf4-3818-42c0-80fe-88de490e3309 4d9169247d4447d0a8dd4c33f8b23dee 9dda653c53224db086060962b0702694 - - default default] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Retrieving diagnostics#033[00m Nov 28 04:47:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 04:47:04 localhost podman[280999]: 2025-11-28 09:47:04.848727692 +0000 UTC m=+0.084535703 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute) Nov 28 04:47:04 localhost podman[280999]: 2025-11-28 09:47:04.864519298 +0000 UTC m=+0.100327309 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, 
container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 28 04:47:04 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. Nov 28 04:47:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45816 DF PROTO=TCP SPT=38778 DPT=9102 SEQ=3981783863 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2A9D020000000001030307) Nov 28 04:47:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52249 DF PROTO=TCP SPT=41230 DPT=9102 SEQ=2153373983 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2AA1820000000001030307) Nov 28 04:47:06 localhost nova_compute[279673]: 2025-11-28 09:47:06.780 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:47:08 localhost nova_compute[279673]: 2025-11-28 09:47:08.613 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:47:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45817 DF PROTO=TCP SPT=38778 DPT=9102 SEQ=3981783863 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2AACC20000000001030307) Nov 28 04:47:10 localhost podman[238687]: time="2025-11-28T09:47:10Z" 
level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 04:47:10 localhost podman[238687]: @ - - [28/Nov/2025:09:47:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147523 "" "Go-http-client/1.1" Nov 28 04:47:10 localhost podman[238687]: @ - - [28/Nov/2025:09:47:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17233 "" "Go-http-client/1.1" Nov 28 04:47:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 04:47:10 localhost systemd[1]: tmp-crun.WQyjkt.mount: Deactivated successfully. Nov 28 04:47:10 localhost podman[281021]: 2025-11-28 09:47:10.859626985 +0000 UTC m=+0.093104060 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 
'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 28 04:47:10 localhost podman[281021]: 2025-11-28 09:47:10.869612944 +0000 UTC m=+0.103089979 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 04:47:10 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 04:47:11 localhost nova_compute[279673]: 2025-11-28 09:47:11.811 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:47:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 04:47:12 localhost podman[281044]: 2025-11-28 09:47:12.849612036 +0000 UTC m=+0.083471891 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', 
'/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3) Nov 28 04:47:12 localhost podman[281044]: 2025-11-28 09:47:12.865598038 +0000 UTC m=+0.099457933 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, 
org.label-schema.build-date=20251125, tcib_managed=true) Nov 28 04:47:12 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. Nov 28 04:47:13 localhost nova_compute[279673]: 2025-11-28 09:47:13.653 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:47:13 localhost ovn_controller[152322]: 2025-11-28T09:47:13Z|00070|memory_trim|INFO|Detected inactivity (last active 30012 ms ago): trimming memory Nov 28 04:47:16 localhost nova_compute[279673]: 2025-11-28 09:47:16.815 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:47:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45818 DF PROTO=TCP SPT=38778 DPT=9102 SEQ=3981783863 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2ACD820000000001030307) Nov 28 04:47:18 localhost openstack_network_exporter[240658]: ERROR 09:47:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:47:18 localhost openstack_network_exporter[240658]: ERROR 09:47:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:47:18 localhost openstack_network_exporter[240658]: ERROR 09:47:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 04:47:18 localhost openstack_network_exporter[240658]: ERROR 09:47:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 04:47:18 localhost openstack_network_exporter[240658]: Nov 28 04:47:18 localhost openstack_network_exporter[240658]: ERROR 09:47:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify 
an existing datapath Nov 28 04:47:18 localhost openstack_network_exporter[240658]: Nov 28 04:47:18 localhost nova_compute[279673]: 2025-11-28 09:47:18.655 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:47:21 localhost nova_compute[279673]: 2025-11-28 09:47:21.842 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:47:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 04:47:22 localhost podman[281063]: 2025-11-28 09:47:22.834202914 +0000 UTC m=+0.074650587 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=edpm, architecture=x86_64, version=9.6, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) 
Nov 28 04:47:22 localhost podman[281063]: 2025-11-28 09:47:22.847340053 +0000 UTC m=+0.087787706 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_id=edpm, managed_by=edpm_ansible, io.openshift.expose-services=, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6) Nov 28 04:47:22 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. Nov 28 04:47:23 localhost nova_compute[279673]: 2025-11-28 09:47:23.680 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:47:26 localhost nova_compute[279673]: 2025-11-28 09:47:26.843 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:47:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. 
Nov 28 04:47:28 localhost podman[281100]: 2025-11-28 09:47:28.539121979 +0000 UTC m=+0.080847376 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 04:47:28 localhost podman[281100]: 2025-11-28 09:47:28.547827901 +0000 UTC m=+0.089553228 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 28 04:47:28 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 04:47:28 localhost nova_compute[279673]: 2025-11-28 09:47:28.683 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:47:29 localhost nova_compute[279673]: 2025-11-28 09:47:29.730 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:47:29 localhost nova_compute[279673]: 2025-11-28 09:47:29.732 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:47:29 localhost nova_compute[279673]: 2025-11-28 09:47:29.751 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:47:29 localhost nova_compute[279673]: 2025-11-28 09:47:29.751 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 28 04:47:29 localhost nova_compute[279673]: 2025-11-28 09:47:29.752 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of 
instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 28 04:47:30 localhost nova_compute[279673]: 2025-11-28 09:47:30.742 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 04:47:30 localhost nova_compute[279673]: 2025-11-28 09:47:30.742 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 04:47:30 localhost nova_compute[279673]: 2025-11-28 09:47:30.743 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 28 04:47:30 localhost nova_compute[279673]: 2025-11-28 09:47:30.743 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 04:47:31 localhost nova_compute[279673]: 2025-11-28 09:47:31.880 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:47:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29905 DF PROTO=TCP SPT=40908 DPT=9102 SEQ=3436196770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2B06440000000001030307) Nov 28 
04:47:32 localhost nova_compute[279673]: 2025-11-28 09:47:32.685 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 04:47:32 localhost nova_compute[279673]: 2025-11-28 09:47:32.714 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 04:47:32 localhost nova_compute[279673]: 2025-11-28 09:47:32.714 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated 
the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 28 04:47:32 localhost nova_compute[279673]: 2025-11-28 09:47:32.715 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:47:32 localhost nova_compute[279673]: 2025-11-28 09:47:32.716 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:47:32 localhost nova_compute[279673]: 2025-11-28 09:47:32.716 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:47:32 localhost nova_compute[279673]: 2025-11-28 09:47:32.716 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:47:32 localhost nova_compute[279673]: 2025-11-28 09:47:32.717 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:47:32 localhost nova_compute[279673]: 2025-11-28 09:47:32.717 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes 
run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:47:32 localhost nova_compute[279673]: 2025-11-28 09:47:32.717 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 28 04:47:32 localhost nova_compute[279673]: 2025-11-28 09:47:32.718 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:47:32 localhost nova_compute[279673]: 2025-11-28 09:47:32.744 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:47:32 localhost nova_compute[279673]: 2025-11-28 09:47:32.745 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:47:32 localhost nova_compute[279673]: 2025-11-28 09:47:32.745 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:47:32 localhost nova_compute[279673]: 2025-11-28 09:47:32.746 279685 DEBUG nova.compute.resource_tracker [None 
req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 04:47:32 localhost nova_compute[279673]: 2025-11-28 09:47:32.746 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:47:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:47:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:47:32 localhost systemd[1]: tmp-crun.aNAU91.mount: Deactivated successfully. Nov 28 04:47:32 localhost podman[281175]: 2025-11-28 09:47:32.87423015 +0000 UTC m=+0.100353660 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:47:32 localhost podman[281175]: 2025-11-28 09:47:32.878703519 +0000 UTC m=+0.104827049 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 
'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:47:32 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:47:32 localhost podman[281174]: 2025-11-28 09:47:32.962255482 +0000 UTC m=+0.191742329 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 28 04:47:33 localhost podman[281174]: 2025-11-28 09:47:33.030897815 +0000 UTC m=+0.260384702 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller) Nov 28 04:47:33 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: 
Deactivated successfully. Nov 28 04:47:33 localhost nova_compute[279673]: 2025-11-28 09:47:33.178 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:47:33 localhost nova_compute[279673]: 2025-11-28 09:47:33.264 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:47:33 localhost nova_compute[279673]: 2025-11-28 09:47:33.265 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:47:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29906 DF PROTO=TCP SPT=40908 DPT=9102 SEQ=3436196770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2B0A420000000001030307) Nov 28 04:47:33 localhost nova_compute[279673]: 2025-11-28 09:47:33.483 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 04:47:33 localhost nova_compute[279673]: 2025-11-28 09:47:33.486 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=12296MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": 
"7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 04:47:33 localhost nova_compute[279673]: 2025-11-28 09:47:33.486 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:47:33 localhost nova_compute[279673]: 2025-11-28 09:47:33.487 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:47:33 localhost nova_compute[279673]: 2025-11-28 09:47:33.576 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 04:47:33 localhost nova_compute[279673]: 2025-11-28 09:47:33.577 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 04:47:33 localhost nova_compute[279673]: 2025-11-28 09:47:33.578 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 04:47:33 localhost nova_compute[279673]: 2025-11-28 09:47:33.645 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:47:33 localhost nova_compute[279673]: 2025-11-28 09:47:33.687 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:47:34 localhost nova_compute[279673]: 2025-11-28 09:47:34.097 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:47:34 localhost nova_compute[279673]: 2025-11-28 09:47:34.104 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in 
ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 04:47:34 localhost nova_compute[279673]: 2025-11-28 09:47:34.132 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 04:47:34 localhost nova_compute[279673]: 2025-11-28 09:47:34.160 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 04:47:34 localhost nova_compute[279673]: 2025-11-28 09:47:34.160 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.674s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:47:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45819 DF PROTO=TCP SPT=38778 DPT=9102 SEQ=3981783863 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2B0D820000000001030307) Nov 28 04:47:35 localhost 
systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 04:47:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29907 DF PROTO=TCP SPT=40908 DPT=9102 SEQ=3436196770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2B12420000000001030307) Nov 28 04:47:35 localhost systemd[1]: tmp-crun.emdmCK.mount: Deactivated successfully. Nov 28 04:47:35 localhost podman[281277]: 2025-11-28 09:47:35.476068132 +0000 UTC m=+0.079936039 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute) Nov 28 04:47:35 localhost podman[281277]: 2025-11-28 09:47:35.48947504 +0000 UTC m=+0.093342937 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute) Nov 28 04:47:35 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. Nov 28 04:47:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49401 DF PROTO=TCP SPT=45152 DPT=9102 SEQ=3869660939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2B15820000000001030307) Nov 28 04:47:36 localhost nova_compute[279673]: 2025-11-28 09:47:36.883 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:47:38 localhost nova_compute[279673]: 2025-11-28 09:47:38.692 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:47:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29908 DF PROTO=TCP SPT=40908 DPT=9102 SEQ=3436196770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2B22020000000001030307) Nov 28 04:47:40 localhost podman[238687]: time="2025-11-28T09:47:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 04:47:40 localhost podman[238687]: @ - - [28/Nov/2025:09:47:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147523 "" "Go-http-client/1.1" Nov 28 04:47:40 localhost podman[238687]: @ - - [28/Nov/2025:09:47:40 +0000] "GET 
/v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17234 "" "Go-http-client/1.1" Nov 28 04:47:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 04:47:41 localhost podman[281296]: 2025-11-28 09:47:41.840398294 +0000 UTC m=+0.079385994 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 04:47:41 localhost podman[281296]: 2025-11-28 09:47:41.851413532 +0000 UTC m=+0.090401272 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 04:47:41 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 04:47:41 localhost nova_compute[279673]: 2025-11-28 09:47:41.933 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:47:43 localhost nova_compute[279673]: 2025-11-28 09:47:43.738 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:47:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 04:47:43 localhost podman[281319]: 2025-11-28 09:47:43.850772664 +0000 UTC m=+0.081123215 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 28 04:47:43 localhost podman[281319]: 2025-11-28 09:47:43.869406401 +0000 UTC m=+0.099756962 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', 
'/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, managed_by=edpm_ansible) Nov 28 04:47:43 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. Nov 28 04:47:46 localhost nova_compute[279673]: 2025-11-28 09:47:46.935 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:47:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29909 DF PROTO=TCP SPT=40908 DPT=9102 SEQ=3436196770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2B41830000000001030307) Nov 28 04:47:48 localhost openstack_network_exporter[240658]: ERROR 09:47:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 04:47:48 localhost openstack_network_exporter[240658]: ERROR 09:47:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:47:48 localhost openstack_network_exporter[240658]: ERROR 09:47:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:47:48 localhost openstack_network_exporter[240658]: ERROR 09:47:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 04:47:48 localhost openstack_network_exporter[240658]: Nov 28 04:47:48 localhost openstack_network_exporter[240658]: ERROR 09:47:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 04:47:48 localhost openstack_network_exporter[240658]: Nov 28 04:47:48 localhost nova_compute[279673]: 2025-11-28 09:47:48.740 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:47:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:50.826 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:47:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:50.827 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:47:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:47:50.829 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:47:51 localhost nova_compute[279673]: 2025-11-28 09:47:51.969 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:47:53 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:b0:25:93 MACPROTO=0800 SRC=3.138.197.221 DST=38.102.83.64 LEN=52 TOS=0x00 PREC=0x00 TTL=50 ID=9338 PROTO=TCP SPT=36814 DPT=9090 SEQ=2185110305 ACK=0 WINDOW=65535 RES=0x00 SYN URGP=0 OPT (020405B40103030801010402) Nov 28 04:47:53 localhost nova_compute[279673]: 2025-11-28 09:47:53.777 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:47:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. 
Nov 28 04:47:53 localhost systemd[1]: tmp-crun.cDRt2Q.mount: Deactivated successfully. Nov 28 04:47:53 localhost podman[281337]: 2025-11-28 09:47:53.896347538 +0000 UTC m=+0.095749477 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=edpm, distribution-scope=public, io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, vendor=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, 
name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Nov 28 04:47:53 localhost podman[281337]: 2025-11-28 09:47:53.940535284 +0000 UTC m=+0.139937253 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, release=1755695350, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Nov 28 04:47:53 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 04:47:56 localhost nova_compute[279673]: 2025-11-28 09:47:56.972 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:47:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 04:47:58 localhost nova_compute[279673]: 2025-11-28 09:47:58.780 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:47:58 localhost podman[281357]: 2025-11-28 09:47:58.849089257 +0000 UTC m=+0.083809592 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 04:47:58 localhost podman[281357]: 2025-11-28 09:47:58.86238097 +0000 UTC m=+0.097101295 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, 
config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 28 04:47:58 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.672 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.673 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.681 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c46340b1-66c6-4a9f-b3e2-d75718e49e6c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:48:00.673830', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '53a5095a-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.845723655, 'message_signature': 'e3fe45d8d1cd6bd6b6896740afa1c966716cf722a68096da1490d6257540418f'}]}, 'timestamp': '2025-11-28 09:48:00.681922', '_unique_id': '1eb1eb9761944874a6de602d4b289c49'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) 
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.683 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.684 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.697 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.698 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '6b1bbfd3-7753-4004-b171-eeabdd5c8d78', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:48:00.685002', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '53a79b98-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.856949559, 'message_signature': '10ca8ae81e4a0e0370cf5fd3a25092ecd228aeb70a92f07150af9b72ac381309'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:48:00.685002', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 
'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '53a7af5c-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.856949559, 'message_signature': '39a2ce96941ccfac835de018f1dcba103d2bbd35433a905780a51ae49e88ce4b'}]}, 'timestamp': '2025-11-28 09:48:00.699291', '_unique_id': '88413bcb86254ef4b85e28c21f3c7ae9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:48:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:48:00.700 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:48:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:48:00.700 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.700 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.701 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.729 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.730 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:48:00.731 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a319abdc-3948-4921-8098-9b13dcb8c562', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:48:00.702151', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '53ac67cc-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.874079043, 'message_signature': 'c7cb8e1b2dbd8b58b217ac158f8275815fbd1d495fe0599de03b5f391e940bbb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:48:00.702151', 
'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '53ac7cee-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.874079043, 'message_signature': '46b67abe0aa8ca2f3e503ce1a8f22c4a122761c21656b93fa0901df9bed04b6a'}]}, 'timestamp': '2025-11-28 09:48:00.730653', '_unique_id': 'e03c3d302aca469e88577d1a52168530'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:48:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 
134, in _send_notification Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.731 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.732 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.733 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.733 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 1124743927 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.733 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 22218411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a002305-d03e-4055-a28f-7c41f816158e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1124743927, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:48:00.733229', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '53acf2e6-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.874079043, 'message_signature': 
'32ed49363f41f127bbf58402cb77762fdc0c874e9dee7f5e50cf7f662edd5865'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22218411, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:48:00.733229', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '53ad031c-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.874079043, 'message_signature': 'e7ae714adce65c6e9b5b83992903d403a1ebd4c7638152c81754e31ebe4af635'}]}, 'timestamp': '2025-11-28 09:48:00.734113', '_unique_id': '8a3299ae5429448caeedc5b8466e1e8c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:48:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:48:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.735 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.736 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.736 12 DEBUG 
ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.736 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '033d9787-2d94-49c1-80a8-0245372a512a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:48:00.736321', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 
'53ad6b2c-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.856949559, 'message_signature': 'fd4365552d9007c23d9b6f7ecdaeee9d4d8a85b9b43eae33fa66cd0df2d5562a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:48:00.736321', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '53ad7b26-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.856949559, 'message_signature': '9107e4312eeaec2080beb3a38f9ce2d14d1db72fe15c5db2830fc31b5b0ba715'}]}, 'timestamp': '2025-11-28 09:48:00.737223', '_unique_id': '58f0706cabd24e10802faa5c8f3f7c69'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 
04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:48:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.738 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.739 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 28 04:48:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.757 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 51.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f32ade6f-5ec4-4c84-90f5-d669f05d7e97', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.671875, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:48:00.739458', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '53b0b5ca-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.929620678, 'message_signature': '4b3230e3fc98f681a68f5848ed2ee4e1e52a888c84794d6d1fdcdea84176e8c6'}]}, 'timestamp': '2025-11-28 09:48:00.758375', '_unique_id': '714598eab8224e6cbbfff2c9cd4ab4b5'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR 
oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging with 
self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR 
oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:48:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.759 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.760 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.760 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.760 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '76b732d2-6381-4561-9d33-1395c28f872c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:48:00.760667', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '53b12244-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.845723655, 'message_signature': 'c89854f75f4cce161bb602af1d6735a285e41fdbb6a874e5ed80349d40ac4a14'}]}, 'timestamp': '2025-11-28 09:48:00.761156', '_unique_id': '1e71bd99007a4a2bb88615da43ea666c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:48:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:48:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.762 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.763 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.763 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e81a661-3dd2-4b74-9980-1d4782c470cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:48:00.763278', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '53b18838-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.845723655, 'message_signature': 'eefe2a72069128421b01fc7687d9aaee7872a73e314d3764864a5fccbba82dce'}]}, 'timestamp': '2025-11-28 09:48:00.763733', '_unique_id': '76e328ef31524e0abaad51d7fc4cb2e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:48:00.764 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:48:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:48:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.764 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.765 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.765 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '493c50bb-4033-4984-bfa6-7ffffb959577', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:48:00.765803', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '53b1ebd4-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.845723655, 'message_signature': '5df403eece4e73874ee1db7d1512e399e67b3ecf5f83557497cc27910e09941f'}]}, 'timestamp': '2025-11-28 09:48:00.766315', '_unique_id': 'bc4e3db95292429eba99b07ced437574'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:48:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:48:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.767 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.768 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.768 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3dfc5792-340d-45cc-ad87-5fd1779569a4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:48:00.768376', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '53b24f48-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.845723655, 'message_signature': '208021f6b01492c07f3c8322068ccab47ac5a48c7ed934d640dbd24360aad5b6'}]}, 'timestamp': '2025-11-28 09:48:00.768830', '_unique_id': 'd40d4730529a4694988e97ded02ccef0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:48:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 
04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.769 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.770 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.771 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'cc154370-47a5-4900-b4d8-00f7f6415266', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:48:00.771060', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '53b2b87a-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.845723655, 'message_signature': '4e84d4665bcbb0dbe14fd1e2e4183d920428fbbdfe584bd7ad041afa0172d075'}]}, 'timestamp': '2025-11-28 09:48:00.771524', '_unique_id': '5b3c8b7901954e54b2ecff31831fdc78'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:48:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.772 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.773 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.773 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.774 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'edd16667-d656-4547-98dd-729056766d2f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:48:00.773590', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '53b31acc-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.874079043, 'message_signature': '2868c6b47e56d50aecbef58d1dd45668f99f1a1b5310784e50010d03aee7ab42'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:48:00.773590', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '53b32d46-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.874079043, 'message_signature': 'abbfa07bfe38421041e6603ed60b7d3d6e5cb36596bb70970ecc374b06864390'}]}, 'timestamp': '2025-11-28 09:48:00.774491', '_unique_id': '9b2b15d96efc4fa8980a41894809b395'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.775 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '43301249-061b-4a0f-9508-a3007e47bbf2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:48:00.775957', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '53b375e4-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.845723655, 'message_signature': '2dfe1d438d6341eb5464ea3c8def6e88f96e3cd6e6ed0fad50e480c8ea1c3c92'}]}, 'timestamp': '2025-11-28 09:48:00.776322', '_unique_id': 'c968f0b6b7cc433bb56c04b6fdf90a01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.776 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.777 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.777 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '744ef230-6a73-4cbe-858d-7c6c308280e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:48:00.777607', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '53b3b4aa-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.845723655, 'message_signature': 'c23620c8250cc41caa170358c1321b85585e981ab4d79394f1fa4f780ce05524'}]}, 'timestamp': '2025-11-28 09:48:00.777890', '_unique_id': '78f655d2c1d04042bd800e61dfbef61b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:48:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.778 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.779 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.779 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.779 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95da7934-ef29-41b2-9a7c-338924af02e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:48:00.779225', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '53b3f3b6-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.856949559, 'message_signature': '78718bcd396310cada4eaad879db1b31e6ac8d401755b84df6003ef8faf285fe'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:48:00.779225', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '53b3fd52-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.856949559, 'message_signature': '8dd2114965aa3492344dd7b2a68f35268bb2178f92b3b89365cbce42a7508a1c'}]}, 'timestamp': '2025-11-28 09:48:00.779730', '_unique_id': '0ff0ed2284304d03adcfaea77dda0711'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:48:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:48:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.780 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.781 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.781 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1c12295c-4bc0-4de3-a610-454ec0ecba46', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:48:00.781045', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '53b43b46-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.874079043, 'message_signature': 'ddaf2344f2c5c65c4d7d62fcea9ff14bf0e85dfeae5cb2ddffce4970fd1da30b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:48:00.781045', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '53b444c4-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.874079043, 'message_signature': '2c01ea7ecbad6f646fee96e1e941f44f0bd28fa7ae3246c00d1abfba4147fefc'}]}, 'timestamp': '2025-11-28 09:48:00.781558', '_unique_id': '80547a6af64c4ab4891edafb9174360f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:48:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.782 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae5a8963-698b-49a6-b1e3-375c42d89d3c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:48:00.782903', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '53b48a4c-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.845723655, 'message_signature': '8b09f3cff4e7c9ff396390e6981debd09d5719f564957d2a7a6db371cf6392d4'}]}, 'timestamp': '2025-11-28 09:48:00.783359', '_unique_id': '1c1ffebebb794788aee34c8710702da5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.783 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.784 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.784 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1439344424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.784 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 63315481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '02884105-04b9-43ac-b6d8-7bb7eef3c4f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1439344424, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:48:00.784613', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '53b4c624-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.874079043, 'message_signature': '174a28293dc57b809237c7cf689c6d82aedca0eb68b0af80ab228c8ceb783a00'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63315481, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:48:00.784613', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '53b4cfa2-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.874079043, 'message_signature': 'd19e163bf7e00326197c663c92588a3dc23e1345d0c5499e061a1578ad6ffacc'}]}, 'timestamp': '2025-11-28 09:48:00.785138', '_unique_id': '2492bc05bcc74ac08026cb8481100be3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.785 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.786 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.786 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84d7028f-7a1c-45db-8d86-afa3b4282194', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:48:00.786649', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '53b51606-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.845723655, 'message_signature': '187cfc2cf54dabf82b68bcb88f607f182f826a1dda51c67b095f3d71451d6e83'}]}, 'timestamp': '2025-11-28 09:48:00.786937', '_unique_id': 'c53c71e7b2fc4eeb9e56c7a6d5b40a3a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:48:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.787 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.788 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.788 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.788 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1a6a9eec-2499-4bba-b145-e2a3c223d31f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:48:00.788311', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '53b556d4-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.874079043, 'message_signature': '2793aa680aa5374685b0b08c166f553f17a51058ffd930a3f5c2a59c3de852c5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:48:00.788311', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '53b56110-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.874079043, 'message_signature': '5c45b08baab989fd5606d796d7b763a7e6cb3afcc6a280daa745d7290cbb5089'}]}, 'timestamp': '2025-11-28 09:48:00.788839', '_unique_id': '2d30c103c3854b72853a6e63db158b6f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:48:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:48:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.789 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.790 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.790 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 11560000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ea488f5e-e8da-4bf2-8dfc-1491679807c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11560000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:48:00.790161', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '53b5a008-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11114.929620678, 'message_signature': 'e8b79152bddd8945ee2c34d6678805052db283344d688bc4186323b0fc23e0f9'}]}, 'timestamp': '2025-11-28 09:48:00.790478', '_unique_id': '6de4b0032969467b99166c2659751deb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 
04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:48:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:48:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:48:00.791 12 ERROR oslo_messaging.notify.messaging Nov 28 04:48:02 localhost nova_compute[279673]: 2025-11-28 09:48:02.006 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 
28 04:48:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48263 DF PROTO=TCP SPT=36398 DPT=9102 SEQ=2606285366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2B7B750000000001030307) Nov 28 04:48:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48264 DF PROTO=TCP SPT=36398 DPT=9102 SEQ=2606285366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2B7F830000000001030307) Nov 28 04:48:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:48:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:48:03 localhost nova_compute[279673]: 2025-11-28 09:48:03.829 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:48:03 localhost systemd[1]: tmp-crun.vevkIi.mount: Deactivated successfully. 
Nov 28 04:48:03 localhost podman[281381]: 2025-11-28 09:48:03.90065629 +0000 UTC m=+0.133876338 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 28 04:48:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29910 DF PROTO=TCP SPT=40908 DPT=9102 SEQ=3436196770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2B81830000000001030307) Nov 28 04:48:03 localhost podman[281382]: 2025-11-28 09:48:03.949302455 +0000 UTC m=+0.177826247 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 
(image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 28 04:48:03 localhost podman[281381]: 2025-11-28 09:48:03.975126342 +0000 UTC m=+0.208346380 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 28 04:48:03 localhost podman[281382]: 2025-11-28 09:48:03.983321318 +0000 UTC m=+0.211845090 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 
'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:48:03 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. Nov 28 04:48:04 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:48:05 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48265 DF PROTO=TCP SPT=36398 DPT=9102 SEQ=2606285366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2B87820000000001030307) Nov 28 04:48:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 04:48:05 localhost systemd[1]: tmp-crun.q0rNuV.mount: Deactivated successfully. 
Nov 28 04:48:05 localhost podman[281425]: 2025-11-28 09:48:05.856241738 +0000 UTC m=+0.086268263 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute) Nov 28 04:48:05 localhost podman[281425]: 2025-11-28 09:48:05.897700956 +0000 UTC m=+0.127727511 container exec_died 
1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:48:05 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 04:48:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45820 DF PROTO=TCP SPT=38778 DPT=9102 SEQ=3981783863 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2B8B820000000001030307) Nov 28 04:48:07 localhost nova_compute[279673]: 2025-11-28 09:48:07.009 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:48:08 localhost nova_compute[279673]: 2025-11-28 09:48:08.834 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:48:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48266 DF PROTO=TCP SPT=36398 DPT=9102 SEQ=2606285366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2B97420000000001030307) Nov 28 04:48:10 localhost podman[238687]: time="2025-11-28T09:48:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 04:48:10 localhost podman[238687]: @ - - [28/Nov/2025:09:48:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147523 "" "Go-http-client/1.1" Nov 28 04:48:10 localhost podman[238687]: @ - - [28/Nov/2025:09:48:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17228 "" "Go-http-client/1.1" Nov 28 04:48:12 localhost nova_compute[279673]: 2025-11-28 09:48:12.050 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:48:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 04:48:12 localhost podman[281443]: 2025-11-28 09:48:12.852282628 +0000 UTC m=+0.086540960 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 04:48:12 localhost podman[281443]: 2025-11-28 09:48:12.861330771 +0000 UTC m=+0.095589063 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 04:48:12 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. Nov 28 04:48:13 localhost nova_compute[279673]: 2025-11-28 09:48:13.876 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:48:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 04:48:14 localhost systemd[1]: tmp-crun.M2i5Ls.mount: Deactivated successfully. 
Nov 28 04:48:14 localhost podman[281466]: 2025-11-28 09:48:14.852582118 +0000 UTC m=+0.079415764 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:48:14 localhost sshd[281482]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:48:14 localhost podman[281466]: 2025-11-28 09:48:14.867312954 
+0000 UTC m=+0.094146640 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd) Nov 28 04:48:14 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. Nov 28 04:48:14 localhost systemd-logind[764]: New session 61 of user zuul. 
Nov 28 04:48:15 localhost systemd[1]: Started Session 61 of User zuul. Nov 28 04:48:15 localhost python3[281507]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 04:48:15 localhost subscription-manager[281508]: Unregistered machine with identity: f7b9b60d-6b81-4721-85a2-48be6d80ec8a Nov 28 04:48:17 localhost nova_compute[279673]: 2025-11-28 09:48:17.052 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:48:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48267 DF PROTO=TCP SPT=36398 DPT=9102 SEQ=2606285366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2BB7820000000001030307) Nov 28 04:48:18 localhost openstack_network_exporter[240658]: ERROR 09:48:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:48:18 localhost openstack_network_exporter[240658]: ERROR 09:48:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:48:18 localhost openstack_network_exporter[240658]: ERROR 09:48:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 04:48:18 localhost openstack_network_exporter[240658]: ERROR 09:48:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 04:48:18 localhost openstack_network_exporter[240658]: Nov 28 04:48:18 localhost openstack_network_exporter[240658]: ERROR 09:48:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 
28 04:48:18 localhost openstack_network_exporter[240658]: Nov 28 04:48:18 localhost nova_compute[279673]: 2025-11-28 09:48:18.879 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:48:22 localhost nova_compute[279673]: 2025-11-28 09:48:22.096 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:48:23 localhost nova_compute[279673]: 2025-11-28 09:48:23.914 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:48:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 04:48:24 localhost systemd[1]: tmp-crun.oa5hI7.mount: Deactivated successfully. Nov 28 04:48:24 localhost podman[281510]: 2025-11-28 09:48:24.864655815 +0000 UTC m=+0.094786408 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, name=ubi9-minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, version=9.6, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Nov 28 04:48:24 localhost podman[281510]: 2025-11-28 09:48:24.876800867 +0000 UTC m=+0.106931480 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 
(image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container) Nov 28 04:48:24 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. Nov 28 04:48:27 localhost nova_compute[279673]: 2025-11-28 09:48:27.132 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:48:28 localhost nova_compute[279673]: 2025-11-28 09:48:28.917 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:48:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 04:48:29 localhost systemd[1]: tmp-crun.uFp5Ha.mount: Deactivated successfully. 
Nov 28 04:48:29 localhost podman[281531]: 2025-11-28 09:48:29.881251148 +0000 UTC m=+0.121040517 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 28 04:48:29 localhost podman[281531]: 2025-11-28 09:48:29.890455744 +0000 UTC m=+0.130245113 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 28 04:48:29 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 04:48:32 localhost nova_compute[279673]: 2025-11-28 09:48:32.173 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:48:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18443 DF PROTO=TCP SPT=56892 DPT=9102 SEQ=4071554048 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2BF0A40000000001030307) Nov 28 04:48:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18444 DF PROTO=TCP SPT=56892 DPT=9102 SEQ=4071554048 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2BF4C20000000001030307) Nov 28 04:48:33 localhost nova_compute[279673]: 2025-11-28 09:48:33.945 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:48:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48268 DF PROTO=TCP SPT=36398 DPT=9102 SEQ=2606285366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2BF7830000000001030307) Nov 28 04:48:34 localhost nova_compute[279673]: 2025-11-28 09:48:34.162 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:48:34 localhost nova_compute[279673]: 2025-11-28 09:48:34.163 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:48:34 localhost nova_compute[279673]: 2025-11-28 09:48:34.163 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 28 04:48:34 localhost nova_compute[279673]: 2025-11-28 09:48:34.164 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 28 04:48:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:48:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. 
Nov 28 04:48:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 28 04:48:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7200.1 total, 600.0 interval#012Cumulative writes: 4971 writes, 22K keys, 4971 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4971 writes, 644 syncs, 7.72 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 72 writes, 178 keys, 72 commit groups, 1.0 writes per commit group, ingest: 0.29 MB, 0.00 MB/s#012Interval WAL: 72 writes, 36 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 28 04:48:34 localhost systemd-journald[47227]: Field hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation. Nov 28 04:48:34 localhost systemd-journald[47227]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating. Nov 28 04:48:34 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 28 04:48:34 localhost rsyslogd[759]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 28 04:48:34 localhost nova_compute[279673]: 2025-11-28 09:48:34.863 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 04:48:34 localhost nova_compute[279673]: 2025-11-28 09:48:34.864 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 04:48:34 localhost nova_compute[279673]: 2025-11-28 09:48:34.864 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 28 04:48:34 localhost nova_compute[279673]: 2025-11-28 09:48:34.865 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 04:48:34 localhost podman[281643]: 2025-11-28 09:48:34.885257296 +0000 UTC m=+0.367564197 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, 
config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent) Nov 28 04:48:34 localhost podman[281642]: 2025-11-28 09:48:34.914199683 +0000 UTC m=+0.397791511 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, 
tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible) Nov 28 04:48:34 localhost podman[281642]: 2025-11-28 09:48:34.941922774 +0000 UTC m=+0.425514582 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125) Nov 28 04:48:34 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. Nov 28 04:48:34 localhost podman[281643]: 2025-11-28 09:48:34.972343792 +0000 UTC m=+0.454650683 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent) Nov 28 04:48:34 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:48:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18445 DF PROTO=TCP SPT=56892 DPT=9102 SEQ=4071554048 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2BFCC20000000001030307) Nov 28 04:48:36 localhost nova_compute[279673]: 2025-11-28 09:48:36.012 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": 
"09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 04:48:36 localhost nova_compute[279673]: 2025-11-28 09:48:36.039 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 04:48:36 localhost nova_compute[279673]: 2025-11-28 09:48:36.039 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 28 04:48:36 localhost nova_compute[279673]: 2025-11-28 09:48:36.040 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:48:36 localhost nova_compute[279673]: 2025-11-28 09:48:36.041 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:48:36 localhost nova_compute[279673]: 2025-11-28 09:48:36.041 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:48:36 localhost 
nova_compute[279673]: 2025-11-28 09:48:36.041 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:48:36 localhost nova_compute[279673]: 2025-11-28 09:48:36.042 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:48:36 localhost nova_compute[279673]: 2025-11-28 09:48:36.042 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:48:36 localhost nova_compute[279673]: 2025-11-28 09:48:36.043 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 28 04:48:36 localhost nova_compute[279673]: 2025-11-28 09:48:36.043 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:48:36 localhost nova_compute[279673]: 2025-11-28 09:48:36.067 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:48:36 localhost nova_compute[279673]: 2025-11-28 09:48:36.068 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:48:36 localhost nova_compute[279673]: 2025-11-28 09:48:36.068 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:48:36 localhost nova_compute[279673]: 2025-11-28 09:48:36.069 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 04:48:36 localhost nova_compute[279673]: 2025-11-28 
09:48:36.069 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:48:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29911 DF PROTO=TCP SPT=40908 DPT=9102 SEQ=3436196770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2BFF830000000001030307) Nov 28 04:48:36 localhost nova_compute[279673]: 2025-11-28 09:48:36.538 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:48:36 localhost nova_compute[279673]: 2025-11-28 09:48:36.643 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:48:36 localhost nova_compute[279673]: 2025-11-28 09:48:36.644 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:48:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. 
Nov 28 04:48:36 localhost podman[281745]: 2025-11-28 09:48:36.860347966 +0000 UTC m=+0.093058138 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125) Nov 28 04:48:36 localhost podman[281745]: 2025-11-28 09:48:36.876410991 +0000 UTC m=+0.109121203 container exec_died 
1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3) Nov 28 04:48:36 localhost nova_compute[279673]: 2025-11-28 09:48:36.878 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 04:48:36 localhost nova_compute[279673]: 2025-11-28 09:48:36.880 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=12288MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": 
"7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 04:48:36 localhost nova_compute[279673]: 2025-11-28 09:48:36.880 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:48:36 localhost nova_compute[279673]: 2025-11-28 09:48:36.880 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:48:36 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. Nov 28 04:48:36 localhost nova_compute[279673]: 2025-11-28 09:48:36.996 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 04:48:36 localhost nova_compute[279673]: 2025-11-28 09:48:36.997 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 04:48:36 localhost nova_compute[279673]: 2025-11-28 09:48:36.997 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 04:48:37 localhost nova_compute[279673]: 2025-11-28 09:48:37.039 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:48:37 localhost nova_compute[279673]: 2025-11-28 09:48:37.176 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:48:37 localhost nova_compute[279673]: 2025-11-28 09:48:37.495 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:48:37 localhost nova_compute[279673]: 2025-11-28 09:48:37.502 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in 
ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 04:48:37 localhost nova_compute[279673]: 2025-11-28 09:48:37.521 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 04:48:37 localhost nova_compute[279673]: 2025-11-28 09:48:37.524 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 04:48:37 localhost nova_compute[279673]: 2025-11-28 09:48:37.524 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.644s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:48:38 localhost nova_compute[279673]: 2025-11-28 09:48:38.949 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:48:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 
LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18446 DF PROTO=TCP SPT=56892 DPT=9102 SEQ=4071554048 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2C0C830000000001030307) Nov 28 04:48:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 28 04:48:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7200.3 total, 600.0 interval#012Cumulative writes: 5682 writes, 25K keys, 5682 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5682 writes, 779 syncs, 7.29 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 66 writes, 243 keys, 66 commit groups, 1.0 writes per commit group, ingest: 0.31 MB, 0.00 MB/s#012Interval WAL: 66 writes, 21 syncs, 3.14 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 28 04:48:40 localhost podman[238687]: time="2025-11-28T09:48:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 04:48:40 localhost podman[238687]: @ - - [28/Nov/2025:09:48:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147523 "" "Go-http-client/1.1" Nov 28 04:48:40 localhost podman[238687]: @ - - [28/Nov/2025:09:48:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17239 "" "Go-http-client/1.1" Nov 28 04:48:42 localhost nova_compute[279673]: 2025-11-28 09:48:42.214 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:48:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 04:48:43 localhost systemd[1]: virtsecretd.service: Deactivated successfully. 
Nov 28 04:48:43 localhost systemd[1]: tmp-crun.9VrhWg.mount: Deactivated successfully. Nov 28 04:48:43 localhost podman[281804]: 2025-11-28 09:48:43.848130506 +0000 UTC m=+0.081047583 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 04:48:43 localhost podman[281804]: 2025-11-28 09:48:43.859516724 +0000 UTC m=+0.092433771 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 04:48:43 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. Nov 28 04:48:43 localhost nova_compute[279673]: 2025-11-28 09:48:43.992 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:48:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 04:48:45 localhost systemd[1]: tmp-crun.5I4Z2r.mount: Deactivated successfully. 
Nov 28 04:48:45 localhost podman[281827]: 2025-11-28 09:48:45.863814579 +0000 UTC m=+0.095374266 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd) Nov 28 04:48:45 localhost podman[281827]: 2025-11-28 09:48:45.875477817 +0000 UTC m=+0.107037504 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125) Nov 28 04:48:45 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 04:48:47 localhost nova_compute[279673]: 2025-11-28 09:48:47.213 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:48:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:11:5f:6c MACDST=fa:16:3e:11:ad:79 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18447 DF PROTO=TCP SPT=56892 DPT=9102 SEQ=4071554048 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080ACB2C2D820000000001030307) Nov 28 04:48:48 localhost openstack_network_exporter[240658]: ERROR 09:48:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 04:48:48 localhost openstack_network_exporter[240658]: ERROR 09:48:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:48:48 localhost openstack_network_exporter[240658]: ERROR 09:48:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:48:48 localhost openstack_network_exporter[240658]: ERROR 09:48:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 04:48:48 localhost openstack_network_exporter[240658]: Nov 28 04:48:48 localhost openstack_network_exporter[240658]: ERROR 09:48:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 04:48:48 localhost openstack_network_exporter[240658]: Nov 28 04:48:48 localhost nova_compute[279673]: 2025-11-28 09:48:48.993 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:48:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:48:50.827 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:48:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:48:50.827 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:48:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:48:50.829 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:48:51 localhost sshd[281900]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:48:52 localhost systemd[1]: Created slice User Slice of UID 1003. Nov 28 04:48:52 localhost systemd[1]: Starting User Runtime Directory /run/user/1003... Nov 28 04:48:52 localhost systemd-logind[764]: New session 62 of user tripleo-admin. Nov 28 04:48:52 localhost systemd[1]: Finished User Runtime Directory /run/user/1003. Nov 28 04:48:52 localhost systemd[1]: Starting User Manager for UID 1003... Nov 28 04:48:52 localhost nova_compute[279673]: 2025-11-28 09:48:52.246 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:48:52 localhost systemd[281904]: Queued start job for default target Main User Target. Nov 28 04:48:52 localhost systemd[281904]: Created slice User Application Slice. Nov 28 04:48:52 localhost systemd[281904]: Started Mark boot as successful after the user session has run 2 minutes. Nov 28 04:48:52 localhost systemd[281904]: Started Daily Cleanup of User's Temporary Directories. Nov 28 04:48:52 localhost systemd[281904]: Reached target Paths. Nov 28 04:48:52 localhost systemd[281904]: Reached target Timers. 
Nov 28 04:48:52 localhost systemd[281904]: Starting D-Bus User Message Bus Socket... Nov 28 04:48:52 localhost systemd[281904]: Starting Create User's Volatile Files and Directories... Nov 28 04:48:52 localhost systemd[281904]: Listening on D-Bus User Message Bus Socket. Nov 28 04:48:52 localhost systemd[281904]: Reached target Sockets. Nov 28 04:48:52 localhost systemd[281904]: Finished Create User's Volatile Files and Directories. Nov 28 04:48:52 localhost systemd[281904]: Reached target Basic System. Nov 28 04:48:52 localhost systemd[281904]: Reached target Main User Target. Nov 28 04:48:52 localhost systemd[281904]: Startup finished in 152ms. Nov 28 04:48:52 localhost systemd[1]: Started User Manager for UID 1003. Nov 28 04:48:52 localhost systemd[1]: Started Session 62 of User tripleo-admin. Nov 28 04:48:53 localhost python3[282046]: ansible-ansible.builtin.blockinfile Invoked with marker_begin=BEGIN ceph firewall rules marker_end=END ceph firewall rules path=/etc/nftables/edpm-rules.nft mode=0644 block=# 100 ceph_alertmanager (9093)#012add rule inet filter EDPM_INPUT tcp dport { 9093 } ct state new counter accept comment "100 ceph_alertmanager"#012# 100 ceph_dashboard (8443)#012add rule inet filter EDPM_INPUT tcp dport { 8443 } ct state new counter accept comment "100 ceph_dashboard"#012# 100 ceph_grafana (3100)#012add rule inet filter EDPM_INPUT tcp dport { 3100 } ct state new counter accept comment "100 ceph_grafana"#012# 100 ceph_prometheus (9092)#012add rule inet filter EDPM_INPUT tcp dport { 9092 } ct state new counter accept comment "100 ceph_prometheus"#012# 100 ceph_rgw (8080)#012add rule inet filter EDPM_INPUT tcp dport { 8080 } ct state new counter accept comment "100 ceph_rgw"#012# 110 ceph_mon (6789, 3300, 9100)#012add rule inet filter EDPM_INPUT tcp dport { 6789,3300,9100 } ct state new counter accept comment "110 ceph_mon"#012# 112 ceph_mds (6800-7300, 9100)#012add rule inet filter EDPM_INPUT tcp dport { 6800-7300,9100 } ct state new counter 
accept comment "112 ceph_mds"#012# 113 ceph_mgr (6800-7300, 8444)#012add rule inet filter EDPM_INPUT tcp dport { 6800-7300,8444 } ct state new counter accept comment "113 ceph_mgr"#012# 120 ceph_nfs (2049, 12049)#012add rule inet filter EDPM_INPUT tcp dport { 2049,12049 } ct state new counter accept comment "120 ceph_nfs"#012# 123 ceph_dashboard (9090, 9094, 9283)#012add rule inet filter EDPM_INPUT tcp dport { 9090,9094,9283 } ct state new counter accept comment "123 ceph_dashboard"#012 insertbefore=^# Lock down INPUT chains state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False unsafe_writes=False insertafter=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Nov 28 04:48:54 localhost nova_compute[279673]: 2025-11-28 09:48:54.034 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:48:54 localhost python3[282190]: ansible-ansible.builtin.systemd Invoked with name=nftables state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Nov 28 04:48:54 localhost systemd[1]: Stopping Netfilter Tables... Nov 28 04:48:54 localhost systemd[1]: nftables.service: Deactivated successfully. Nov 28 04:48:54 localhost systemd[1]: Stopped Netfilter Tables. Nov 28 04:48:54 localhost systemd[1]: Starting Netfilter Tables... Nov 28 04:48:54 localhost systemd[1]: Finished Netfilter Tables. Nov 28 04:48:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 04:48:55 localhost systemd[1]: tmp-crun.ATpAdQ.mount: Deactivated successfully. 
Nov 28 04:48:55 localhost podman[282215]: 2025-11-28 09:48:55.86318787 +0000 UTC m=+0.093581194 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., vcs-type=git, name=ubi9-minimal, config_id=edpm, container_name=openstack_network_exporter, release=1755695350, distribution-scope=public, com.redhat.component=ubi9-minimal-container, version=9.6, architecture=x86_64, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible) Nov 28 04:48:55 localhost podman[282215]: 2025-11-28 09:48:55.881483258 +0000 UTC m=+0.111876552 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, 
config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git) Nov 28 04:48:55 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. Nov 28 04:48:57 localhost nova_compute[279673]: 2025-11-28 09:48:57.249 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:48:59 localhost nova_compute[279673]: 2025-11-28 09:48:59.037 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:49:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 04:49:00 localhost systemd[1]: tmp-crun.ncCjfe.mount: Deactivated successfully. 
Nov 28 04:49:00 localhost podman[282236]: 2025-11-28 09:49:00.856045936 +0000 UTC m=+0.090817553 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 28 04:49:00 localhost podman[282236]: 2025-11-28 09:49:00.867415865 +0000 UTC m=+0.102187452 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 28 04:49:00 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 04:49:02 localhost nova_compute[279673]: 2025-11-28 09:49:02.269 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:49:04 localhost nova_compute[279673]: 2025-11-28 09:49:04.065 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:49:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:49:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:49:05 localhost podman[282296]: 2025-11-28 09:49:05.479091693 +0000 UTC m=+0.084871842 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 28 04:49:05 localhost podman[282297]: 2025-11-28 09:49:05.560103503 +0000 UTC m=+0.160652231 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 04:49:05 localhost podman[282296]: 2025-11-28 09:49:05.565234061 +0000 UTC m=+0.171014220 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:49:05 localhost systemd[1]: 
9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. Nov 28 04:49:05 localhost podman[282297]: 2025-11-28 09:49:05.594429925 +0000 UTC m=+0.194978643 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent) Nov 28 
04:49:05 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:49:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 04:49:07 localhost podman[282375]: 2025-11-28 09:49:07.076100102 +0000 UTC m=+0.082563805 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 28 04:49:07 localhost podman[282375]: 2025-11-28 09:49:07.111891556 +0000 UTC m=+0.118355249 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, config_id=edpm, org.label-schema.license=GPLv2) Nov 28 04:49:07 localhost 
systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. Nov 28 04:49:07 localhost nova_compute[279673]: 2025-11-28 09:49:07.272 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:49:09 localhost nova_compute[279673]: 2025-11-28 09:49:09.069 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:49:10 localhost podman[238687]: time="2025-11-28T09:49:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 04:49:10 localhost podman[238687]: @ - - [28/Nov/2025:09:49:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147523 "" "Go-http-client/1.1" Nov 28 04:49:10 localhost podman[238687]: @ - - [28/Nov/2025:09:49:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17238 "" "Go-http-client/1.1" Nov 28 04:49:12 localhost nova_compute[279673]: 2025-11-28 09:49:12.307 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:49:14 localhost nova_compute[279673]: 2025-11-28 09:49:14.107 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:49:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 04:49:14 localhost systemd[1]: tmp-crun.nETkih.mount: Deactivated successfully. 
Nov 28 04:49:14 localhost podman[282432]: 2025-11-28 09:49:14.864720406 +0000 UTC m=+0.096746698 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 28 04:49:14 localhost podman[282432]: 2025-11-28 09:49:14.876518462 +0000 UTC m=+0.108544754 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 
'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 04:49:14 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 04:49:15 localhost podman[282538]: Nov 28 04:49:15 localhost podman[282538]: 2025-11-28 09:49:15.967562318 +0000 UTC m=+0.082701122 container create 4fc2c0a43759f645e048f3cbce449eb503b7b622a0ea8c12a69b5b0816c0868b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_jones, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_CLEAN=True, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , release=553, RELEASE=main, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12) Nov 28 04:49:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 04:49:15 localhost systemd[1]: session-61.scope: Deactivated successfully. Nov 28 04:49:15 localhost systemd-logind[764]: Session 61 logged out. Waiting for processes to exit. Nov 28 04:49:16 localhost systemd-logind[764]: Removed session 61. Nov 28 04:49:16 localhost systemd[1]: Started libpod-conmon-4fc2c0a43759f645e048f3cbce449eb503b7b622a0ea8c12a69b5b0816c0868b.scope. Nov 28 04:49:16 localhost systemd[1]: Started libcrun container. 
Nov 28 04:49:16 localhost podman[282538]: 2025-11-28 09:49:15.92917838 +0000 UTC m=+0.044317214 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:49:16 localhost podman[282538]: 2025-11-28 09:49:16.044003856 +0000 UTC m=+0.159142630 container init 4fc2c0a43759f645e048f3cbce449eb503b7b622a0ea8c12a69b5b0816c0868b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_jones, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, distribution-scope=public, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, RELEASE=main, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_BRANCH=main, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, ceph=True, com.redhat.component=rhceph-container) Nov 28 04:49:16 localhost podman[282538]: 2025-11-28 09:49:16.062233921 +0000 UTC m=+0.177372705 container start 4fc2c0a43759f645e048f3cbce449eb503b7b622a0ea8c12a69b5b0816c0868b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_jones, architecture=x86_64, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., release=553, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, version=7, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.openshift.expose-services=, name=rhceph, GIT_CLEAN=True, RELEASE=main) Nov 28 04:49:16 localhost podman[282538]: 2025-11-28 09:49:16.062521039 +0000 UTC m=+0.177659813 container attach 4fc2c0a43759f645e048f3cbce449eb503b7b622a0ea8c12a69b5b0816c0868b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_jones, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, release=553, GIT_BRANCH=main, io.openshift.expose-services=, com.redhat.component=rhceph-container, RELEASE=main, build-date=2025-09-24T08:57:55, architecture=x86_64, ceph=True, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , distribution-scope=public) 
Nov 28 04:49:16 localhost funny_jones[282559]: 167 167 Nov 28 04:49:16 localhost systemd[1]: libpod-4fc2c0a43759f645e048f3cbce449eb503b7b622a0ea8c12a69b5b0816c0868b.scope: Deactivated successfully. Nov 28 04:49:16 localhost podman[282552]: 2025-11-28 09:49:16.101093984 +0000 UTC m=+0.099180232 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', 
'/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true) Nov 28 04:49:16 localhost podman[282552]: 2025-11-28 09:49:16.115630655 +0000 UTC m=+0.113716903 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0) Nov 28 04:49:16 localhost systemd[1]: 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. Nov 28 04:49:16 localhost podman[282538]: 2025-11-28 09:49:16.167694008 +0000 UTC m=+0.282832792 container died 4fc2c0a43759f645e048f3cbce449eb503b7b622a0ea8c12a69b5b0816c0868b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_jones, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, vcs-type=git, com.redhat.component=rhceph-container, RELEASE=main, io.openshift.tags=rhceph ceph, version=7, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=) Nov 28 04:49:16 localhost podman[282570]: 2025-11-28 09:49:16.265177267 +0000 UTC m=+0.187051215 container remove 4fc2c0a43759f645e048f3cbce449eb503b7b622a0ea8c12a69b5b0816c0868b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_jones, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, GIT_CLEAN=True, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.openshift.expose-services=, ceph=True, version=7, RELEASE=main, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7) Nov 28 04:49:16 localhost systemd[1]: libpod-conmon-4fc2c0a43759f645e048f3cbce449eb503b7b622a0ea8c12a69b5b0816c0868b.scope: Deactivated successfully. Nov 28 04:49:16 localhost systemd[1]: Reloading. Nov 28 04:49:16 localhost systemd-rc-local-generator[282617]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:49:16 localhost systemd-sysv-generator[282621]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:49:16 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:49:16 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:49:16 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:49:16 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:49:16 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Nov 28 04:49:16 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:49:16 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:49:16 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:49:16 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:49:16 localhost systemd[1]: tmp-crun.gUVNom.mount: Deactivated successfully. Nov 28 04:49:16 localhost systemd[1]: var-lib-containers-storage-overlay-b7efc582d40325b91c09e0286f90f723318a0cef27d1a8e7dfa2889477511945-merged.mount: Deactivated successfully. Nov 28 04:49:16 localhost systemd[1]: Reloading. Nov 28 04:49:16 localhost systemd-rc-local-generator[282665]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:49:16 localhost systemd-sysv-generator[282668]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 28 04:49:16 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:49:16 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:49:16 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:49:16 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:49:16 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:49:16 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:49:16 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:49:16 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:49:16 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:49:17 localhost systemd[1]: Starting Ceph mds.mds.np0005538513.yljthc for 2c5417c9-00eb-57d5-a565-ddecbc7995c1... 
Nov 28 04:49:17 localhost nova_compute[279673]: 2025-11-28 09:49:17.311 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:49:17 localhost podman[282726]: Nov 28 04:49:17 localhost podman[282726]: 2025-11-28 09:49:17.43713096 +0000 UTC m=+0.054872891 container create be1102a3fe9451839ed0049a206966b47cd61ced9a9c1adaed5bffaddb4d9ab6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mds-mds-np0005538513-yljthc, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, ceph=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, release=553) Nov 28 04:49:17 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85960ae17dc871edebaadbc6101fc16170cda42cdb233babbaaa0c0d94601f12/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 28 04:49:17 localhost kernel: xfs filesystem being remounted at 
/var/lib/containers/storage/overlay/85960ae17dc871edebaadbc6101fc16170cda42cdb233babbaaa0c0d94601f12/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Nov 28 04:49:17 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85960ae17dc871edebaadbc6101fc16170cda42cdb233babbaaa0c0d94601f12/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 28 04:49:17 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85960ae17dc871edebaadbc6101fc16170cda42cdb233babbaaa0c0d94601f12/merged/var/lib/ceph/mds/ceph-mds.np0005538513.yljthc supports timestamps until 2038 (0x7fffffff) Nov 28 04:49:17 localhost podman[282726]: 2025-11-28 09:49:17.49106302 +0000 UTC m=+0.108804951 container init be1102a3fe9451839ed0049a206966b47cd61ced9a9c1adaed5bffaddb4d9ab6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mds-mds-np0005538513-yljthc, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, com.redhat.component=rhceph-container, release=553, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.expose-services=, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , RELEASE=main, ceph=True, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, version=7, name=rhceph, distribution-scope=public) 
Nov 28 04:49:17 localhost podman[282726]: 2025-11-28 09:49:17.502517175 +0000 UTC m=+0.120259126 container start be1102a3fe9451839ed0049a206966b47cd61ced9a9c1adaed5bffaddb4d9ab6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mds-mds-np0005538513-yljthc, build-date=2025-09-24T08:57:55, version=7, com.redhat.component=rhceph-container, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, release=553, maintainer=Guillaume Abrioux , io.openshift.expose-services=, RELEASE=main, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 04:49:17 localhost bash[282726]: be1102a3fe9451839ed0049a206966b47cd61ced9a9c1adaed5bffaddb4d9ab6
Nov 28 04:49:17 localhost podman[282726]: 2025-11-28 09:49:17.413495608 +0000 UTC m=+0.031237519 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 04:49:17 localhost systemd[1]: Started Ceph mds.mds.np0005538513.yljthc for 2c5417c9-00eb-57d5-a565-ddecbc7995c1.
Nov 28 04:49:17 localhost ceph-mds[282744]: set uid:gid to 167:167 (ceph:ceph)
Nov 28 04:49:17 localhost ceph-mds[282744]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mds, pid 2
Nov 28 04:49:17 localhost ceph-mds[282744]: main not setting numa affinity
Nov 28 04:49:17 localhost ceph-mds[282744]: pidfile_write: ignore empty --pid-file
Nov 28 04:49:17 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mds-mds-np0005538513-yljthc[282740]: starting mds.mds.np0005538513.yljthc at
Nov 28 04:49:17 localhost ceph-mds[282744]: mds.mds.np0005538513.yljthc Updating MDS map to version 9 from mon.0
Nov 28 04:49:18 localhost openstack_network_exporter[240658]: ERROR 09:49:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 04:49:18 localhost openstack_network_exporter[240658]: ERROR 09:49:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 04:49:18 localhost openstack_network_exporter[240658]: ERROR 09:49:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 04:49:18 localhost systemd[1]: tmp-crun.uIcWZV.mount: Deactivated successfully.
Nov 28 04:49:18 localhost openstack_network_exporter[240658]: ERROR 09:49:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 04:49:18 localhost openstack_network_exporter[240658]:
Nov 28 04:49:18 localhost openstack_network_exporter[240658]: ERROR 09:49:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 04:49:18 localhost openstack_network_exporter[240658]:
Nov 28 04:49:18 localhost ceph-mds[282744]: mds.mds.np0005538513.yljthc Updating MDS map to version 10 from mon.0
Nov 28 04:49:18 localhost ceph-mds[282744]: mds.mds.np0005538513.yljthc Monitors have assigned me to become a standby.
Nov 28 04:49:19 localhost nova_compute[279673]: 2025-11-28 09:49:19.109 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:49:19 localhost systemd[1]: tmp-crun.Kk87MB.mount: Deactivated successfully.
Nov 28 04:49:19 localhost podman[282891]: 2025-11-28 09:49:19.144345993 +0000 UTC m=+0.110983098 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, io.openshift.expose-services=, name=rhceph, vendor=Red Hat, Inc., release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, architecture=x86_64, ceph=True, version=7, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, vcs-type=git, maintainer=Guillaume Abrioux )
Nov 28 04:49:19 localhost podman[282891]: 2025-11-28 09:49:19.249122819 +0000 UTC m=+0.215759974 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.openshift.expose-services=, name=rhceph, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, release=553, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container)
Nov 28 04:49:22 localhost nova_compute[279673]: 2025-11-28 09:49:22.361 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:49:24 localhost nova_compute[279673]: 2025-11-28 09:49:24.152 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:49:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 04:49:26 localhost systemd[1]: tmp-crun.c4Cjx6.mount: Deactivated successfully.
Nov 28 04:49:26 localhost podman[283012]: 2025-11-28 09:49:26.85744918 +0000 UTC m=+0.091944529 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, config_id=edpm, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 04:49:26 localhost podman[283012]: 2025-11-28 09:49:26.870726721 +0000 UTC m=+0.105222090 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, version=9.6, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vendor=Red Hat, Inc.)
Nov 28 04:49:26 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 04:49:27 localhost nova_compute[279673]: 2025-11-28 09:49:27.365 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:49:29 localhost nova_compute[279673]: 2025-11-28 09:49:29.155 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:49:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 04:49:31 localhost podman[283032]: 2025-11-28 09:49:31.842518431 +0000 UTC m=+0.077902934 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 04:49:31 localhost podman[283032]: 2025-11-28 09:49:31.855460862 +0000 UTC m=+0.090845265 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 04:49:31 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 04:49:32 localhost nova_compute[279673]: 2025-11-28 09:49:32.128 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 04:49:32 localhost nova_compute[279673]: 2025-11-28 09:49:32.129 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 04:49:32 localhost nova_compute[279673]: 2025-11-28 09:49:32.154 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 04:49:32 localhost nova_compute[279673]: 2025-11-28 09:49:32.155 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 04:49:32 localhost nova_compute[279673]: 2025-11-28 09:49:32.155 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 04:49:32 localhost nova_compute[279673]: 2025-11-28 09:49:32.155 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 04:49:32 localhost nova_compute[279673]: 2025-11-28 09:49:32.155 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 04:49:32 localhost nova_compute[279673]: 2025-11-28 09:49:32.156 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 04:49:32 localhost nova_compute[279673]: 2025-11-28 09:49:32.156 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 04:49:32 localhost nova_compute[279673]: 2025-11-28 09:49:32.157 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 04:49:32 localhost nova_compute[279673]: 2025-11-28 09:49:32.178 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 04:49:32 localhost nova_compute[279673]: 2025-11-28 09:49:32.179 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 04:49:32 localhost nova_compute[279673]: 2025-11-28 09:49:32.179 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 04:49:32 localhost nova_compute[279673]: 2025-11-28 09:49:32.179 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 04:49:32 localhost nova_compute[279673]: 2025-11-28 09:49:32.180 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 04:49:32 localhost nova_compute[279673]: 2025-11-28 09:49:32.394 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:49:32 localhost nova_compute[279673]: 2025-11-28 09:49:32.658 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 04:49:32 localhost nova_compute[279673]: 2025-11-28 09:49:32.732 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Nov 28 04:49:32 localhost nova_compute[279673]: 2025-11-28 09:49:32.732 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Nov 28 04:49:32 localhost nova_compute[279673]: 2025-11-28 09:49:32.948 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 04:49:32 localhost nova_compute[279673]: 2025-11-28 09:49:32.949 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=12263MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 04:49:32 localhost nova_compute[279673]: 2025-11-28 09:49:32.950 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 04:49:32 localhost nova_compute[279673]: 2025-11-28 09:49:32.950 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 04:49:33 localhost nova_compute[279673]: 2025-11-28 09:49:33.052 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 28 04:49:33 localhost nova_compute[279673]: 2025-11-28 09:49:33.053 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 04:49:33 localhost nova_compute[279673]: 2025-11-28 09:49:33.054 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 04:49:33 localhost nova_compute[279673]: 2025-11-28 09:49:33.101 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 04:49:33 localhost nova_compute[279673]: 2025-11-28 09:49:33.568 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 04:49:33 localhost nova_compute[279673]: 2025-11-28 09:49:33.577 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 04:49:33 localhost nova_compute[279673]: 2025-11-28 09:49:33.605 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 04:49:33 localhost nova_compute[279673]: 2025-11-28 09:49:33.609 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 04:49:33 localhost nova_compute[279673]: 2025-11-28 09:49:33.610 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.660s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 04:49:34 localhost nova_compute[279673]: 2025-11-28 09:49:34.193 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:49:34 localhost nova_compute[279673]: 2025-11-28 09:49:34.227 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 04:49:34 localhost nova_compute[279673]: 2025-11-28 09:49:34.228 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 04:49:34 localhost nova_compute[279673]: 2025-11-28 09:49:34.228 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 04:49:34 localhost nova_compute[279673]: 2025-11-28 09:49:34.779 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 04:49:34 localhost nova_compute[279673]: 2025-11-28 09:49:34.780 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 04:49:34 localhost nova_compute[279673]: 2025-11-28 09:49:34.780 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 28 04:49:34 localhost nova_compute[279673]: 2025-11-28 09:49:34.780 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 04:49:35 localhost nova_compute[279673]: 2025-11-28 09:49:35.300 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 04:49:35 localhost nova_compute[279673]: 2025-11-28 09:49:35.322 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 04:49:35 localhost nova_compute[279673]: 2025-11-28 09:49:35.323 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 28 04:49:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 04:49:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 04:49:35 localhost systemd[1]: tmp-crun.je0iug.mount: Deactivated successfully.
Nov 28 04:49:35 localhost podman[283100]: 2025-11-28 09:49:35.875100557 +0000 UTC m=+0.105220851 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 04:49:35 localhost podman[283101]: 2025-11-28 09:49:35.971623307 +0000 UTC m=+0.195423195 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image,
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent) Nov 28 04:49:35 localhost podman[283100]: 2025-11-28 09:49:35.97690298 +0000 UTC m=+0.207023304 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:49:35 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 04:49:36 localhost podman[283101]: 2025-11-28 09:49:36.027667533 +0000 UTC m=+0.251467421 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible) Nov 28 04:49:36 localhost systemd[1]: 
ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:49:36 localhost systemd[1]: tmp-crun.zK7VQt.mount: Deactivated successfully. Nov 28 04:49:37 localhost nova_compute[279673]: 2025-11-28 09:49:37.397 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:49:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 04:49:37 localhost podman[283142]: 2025-11-28 09:49:37.855240485 +0000 UTC m=+0.089090350 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:49:37 localhost podman[283142]: 2025-11-28 09:49:37.864976987 +0000 UTC m=+0.098826842 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Nov 28 04:49:37 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. Nov 28 04:49:39 localhost nova_compute[279673]: 2025-11-28 09:49:39.198 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:49:40 localhost podman[238687]: time="2025-11-28T09:49:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 04:49:40 localhost podman[238687]: @ - - [28/Nov/2025:09:49:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149601 "" "Go-http-client/1.1" Nov 28 04:49:40 localhost podman[238687]: @ - - [28/Nov/2025:09:49:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17719 "" "Go-http-client/1.1" Nov 28 04:49:42 localhost nova_compute[279673]: 2025-11-28 09:49:42.430 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:49:44 localhost nova_compute[279673]: 2025-11-28 09:49:44.210 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:49:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. 
Nov 28 04:49:45 localhost podman[283180]: 2025-11-28 09:49:45.325299382 +0000 UTC m=+0.089126251 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 28 04:49:45 localhost podman[283180]: 2025-11-28 09:49:45.339846243 +0000 UTC m=+0.103673082 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 04:49:45 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. Nov 28 04:49:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 04:49:46 localhost systemd[1]: tmp-crun.mtRmt6.mount: Deactivated successfully. 
Nov 28 04:49:46 localhost podman[283269]: 2025-11-28 09:49:46.379375844 +0000 UTC m=+0.095108157 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:49:46 localhost podman[283269]: 2025-11-28 09:49:46.415116131 +0000 UTC m=+0.130848454 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, container_name=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true) Nov 28 04:49:46 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 04:49:47 localhost podman[283346]: Nov 28 04:49:47 localhost podman[283346]: 2025-11-28 09:49:47.047533002 +0000 UTC m=+0.071123285 container create 1ba0fe0e121ea5a6f50685bf2815314092a3b73ca4a38e66b8b05f4a193abf55 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_shannon, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, version=7, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, distribution-scope=public, GIT_BRANCH=main, vcs-type=git, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 28 04:49:47 localhost systemd[1]: Started libpod-conmon-1ba0fe0e121ea5a6f50685bf2815314092a3b73ca4a38e66b8b05f4a193abf55.scope. Nov 28 04:49:47 localhost systemd[1]: Started libcrun container. 
Nov 28 04:49:47 localhost podman[283346]: 2025-11-28 09:49:47.11460842 +0000 UTC m=+0.138198713 container init 1ba0fe0e121ea5a6f50685bf2815314092a3b73ca4a38e66b8b05f4a193abf55 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_shannon, maintainer=Guillaume Abrioux , name=rhceph, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, GIT_BRANCH=main, vcs-type=git, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, architecture=x86_64, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main) Nov 28 04:49:47 localhost podman[283346]: 2025-11-28 09:49:47.019830344 +0000 UTC m=+0.043420687 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:49:47 localhost podman[283346]: 2025-11-28 09:49:47.125205098 +0000 UTC m=+0.148795391 container start 1ba0fe0e121ea5a6f50685bf2815314092a3b73ca4a38e66b8b05f4a193abf55 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_shannon, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, 
GIT_CLEAN=True, ceph=True, name=rhceph, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.openshift.expose-services=, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, distribution-scope=public, RELEASE=main, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 28 04:49:47 localhost podman[283346]: 2025-11-28 09:49:47.12722367 +0000 UTC m=+0.150813963 container attach 1ba0fe0e121ea5a6f50685bf2815314092a3b73ca4a38e66b8b05f4a193abf55 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_shannon, vcs-type=git, GIT_CLEAN=True, GIT_BRANCH=main, release=553, ceph=True, name=rhceph, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 28 04:49:47 localhost 
brave_shannon[283361]: 167 167 Nov 28 04:49:47 localhost systemd[1]: libpod-1ba0fe0e121ea5a6f50685bf2815314092a3b73ca4a38e66b8b05f4a193abf55.scope: Deactivated successfully. Nov 28 04:49:47 localhost podman[283346]: 2025-11-28 09:49:47.129075888 +0000 UTC m=+0.152666201 container died 1ba0fe0e121ea5a6f50685bf2815314092a3b73ca4a38e66b8b05f4a193abf55 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_shannon, version=7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, architecture=x86_64, GIT_BRANCH=main, distribution-scope=public, vcs-type=git, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 28 04:49:47 localhost podman[283366]: 2025-11-28 09:49:47.232409049 +0000 UTC m=+0.089153073 container remove 1ba0fe0e121ea5a6f50685bf2815314092a3b73ca4a38e66b8b05f4a193abf55 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_shannon, release=553, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully 
featured and supported base image., build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, distribution-scope=public, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, architecture=x86_64, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, GIT_CLEAN=True, description=Red Hat Ceph Storage 7) Nov 28 04:49:47 localhost systemd[1]: libpod-conmon-1ba0fe0e121ea5a6f50685bf2815314092a3b73ca4a38e66b8b05f4a193abf55.scope: Deactivated successfully. Nov 28 04:49:47 localhost systemd[1]: var-lib-containers-storage-overlay-ec195e5df26607ad6d5618c97609d1a4a8af520c30e3b0fdd1937c921be2a8a4-merged.mount: Deactivated successfully. 
Nov 28 04:49:47 localhost nova_compute[279673]: 2025-11-28 09:49:47.434 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:49:47 localhost podman[283388]: Nov 28 04:49:47 localhost podman[283388]: 2025-11-28 09:49:47.453502387 +0000 UTC m=+0.064699785 container create a371f70d09e1e7561dc50984dc9e0e6dda0cb4117f7127452f8d73ab0687cbb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_poitras, version=7, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, vcs-type=git, RELEASE=main, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, name=rhceph, io.buildah.version=1.33.12, GIT_CLEAN=True) Nov 28 04:49:47 localhost systemd[1]: Started libpod-conmon-a371f70d09e1e7561dc50984dc9e0e6dda0cb4117f7127452f8d73ab0687cbb8.scope. Nov 28 04:49:47 localhost systemd[1]: tmp-crun.HFbxsi.mount: Deactivated successfully. Nov 28 04:49:47 localhost systemd[1]: Started libcrun container. 
Nov 28 04:49:47 localhost podman[283388]: 2025-11-28 09:49:47.422730534 +0000 UTC m=+0.033927962 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 04:49:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/08a6670e3410e869bdac9d9f3cae0d281c091df3c53deae62c2a5de8444cec37/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 28 04:49:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/08a6670e3410e869bdac9d9f3cae0d281c091df3c53deae62c2a5de8444cec37/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 28 04:49:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/08a6670e3410e869bdac9d9f3cae0d281c091df3c53deae62c2a5de8444cec37/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 28 04:49:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/08a6670e3410e869bdac9d9f3cae0d281c091df3c53deae62c2a5de8444cec37/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 28 04:49:47 localhost podman[283388]: 2025-11-28 09:49:47.532263587 +0000 UTC m=+0.143460985 container init a371f70d09e1e7561dc50984dc9e0e6dda0cb4117f7127452f8d73ab0687cbb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_poitras, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, RELEASE=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, version=7, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 04:49:47 localhost podman[283388]: 2025-11-28 09:49:47.544157215 +0000 UTC m=+0.155354613 container start a371f70d09e1e7561dc50984dc9e0e6dda0cb4117f7127452f8d73ab0687cbb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_poitras, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, release=553, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, version=7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_BRANCH=main, distribution-scope=public, ceph=True, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., GIT_CLEAN=True, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 04:49:47 localhost podman[283388]: 2025-11-28 09:49:47.544446754 +0000 UTC m=+0.155644222 container attach a371f70d09e1e7561dc50984dc9e0e6dda0cb4117f7127452f8d73ab0687cbb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_poitras, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, GIT_CLEAN=True, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, maintainer=Guillaume Abrioux , vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, release=553, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=)
Nov 28 04:49:48 localhost openstack_network_exporter[240658]: ERROR 09:49:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 04:49:48 localhost openstack_network_exporter[240658]: ERROR 09:49:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 04:49:48 localhost openstack_network_exporter[240658]: ERROR 09:49:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 04:49:48 localhost openstack_network_exporter[240658]: ERROR 09:49:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 04:49:48 localhost openstack_network_exporter[240658]:
Nov 28 04:49:48 localhost openstack_network_exporter[240658]: ERROR 09:49:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 04:49:48 localhost openstack_network_exporter[240658]:
Nov 28 04:49:48 localhost tender_poitras[283403]: [
Nov 28 04:49:48 localhost tender_poitras[283403]: {
Nov 28 04:49:48 localhost tender_poitras[283403]: "available": false,
Nov 28 04:49:48 localhost tender_poitras[283403]: "ceph_device": false,
Nov 28 04:49:48 localhost tender_poitras[283403]: "device_id": "QEMU_DVD-ROM_QM00001",
Nov 28 04:49:48 localhost tender_poitras[283403]: "lsm_data": {},
Nov 28 04:49:48 localhost tender_poitras[283403]: "lvs": [],
Nov 28 04:49:48 localhost tender_poitras[283403]: "path": "/dev/sr0",
Nov 28 04:49:48 localhost tender_poitras[283403]: "rejected_reasons": [
Nov 28 04:49:48 localhost tender_poitras[283403]: "Insufficient space (<5GB)",
Nov 28 04:49:48 localhost tender_poitras[283403]: "Has a FileSystem"
Nov 28 04:49:48 localhost tender_poitras[283403]: ],
Nov 28 04:49:48 localhost tender_poitras[283403]: "sys_api": {
Nov 28 04:49:48 localhost tender_poitras[283403]: "actuators": null,
Nov 28 04:49:48 localhost tender_poitras[283403]: "device_nodes": "sr0",
Nov 28 04:49:48 localhost tender_poitras[283403]: "human_readable_size": "482.00 KB",
Nov 28 04:49:48 localhost tender_poitras[283403]: "id_bus": "ata",
Nov 28 04:49:48 localhost tender_poitras[283403]: "model": "QEMU DVD-ROM",
Nov 28 04:49:48 localhost tender_poitras[283403]: "nr_requests": "2",
Nov 28 04:49:48 localhost tender_poitras[283403]: "partitions": {},
Nov 28 04:49:48 localhost tender_poitras[283403]: "path": "/dev/sr0",
Nov 28 04:49:48 localhost tender_poitras[283403]: "removable": "1",
Nov 28 04:49:48 localhost tender_poitras[283403]: "rev": "2.5+",
Nov 28 04:49:48 localhost tender_poitras[283403]: "ro": "0",
Nov 28 04:49:48 localhost tender_poitras[283403]: "rotational": "1",
Nov 28 04:49:48 localhost tender_poitras[283403]: "sas_address": "",
Nov 28 04:49:48 localhost tender_poitras[283403]: "sas_device_handle": "",
Nov 28 04:49:48 localhost tender_poitras[283403]: "scheduler_mode": "mq-deadline",
Nov 28 04:49:48 localhost tender_poitras[283403]: "sectors": 0,
Nov 28 04:49:48 localhost tender_poitras[283403]: "sectorsize": "2048",
Nov 28 04:49:48 localhost tender_poitras[283403]: "size": 493568.0,
Nov 28 04:49:48 localhost tender_poitras[283403]: "support_discard": "0",
Nov 28 04:49:48 localhost tender_poitras[283403]: "type": "disk",
Nov 28 04:49:48 localhost tender_poitras[283403]: "vendor": "QEMU"
Nov 28 04:49:48 localhost tender_poitras[283403]: }
Nov 28 04:49:48 localhost tender_poitras[283403]: }
Nov 28 04:49:48 localhost tender_poitras[283403]: ]
Nov 28 04:49:48 localhost systemd[1]: libpod-a371f70d09e1e7561dc50984dc9e0e6dda0cb4117f7127452f8d73ab0687cbb8.scope: Deactivated successfully.
Nov 28 04:49:48 localhost systemd[1]: libpod-a371f70d09e1e7561dc50984dc9e0e6dda0cb4117f7127452f8d73ab0687cbb8.scope: Consumed 1.000s CPU time.
Nov 28 04:49:48 localhost podman[285428]: 2025-11-28 09:49:48.627187444 +0000 UTC m=+0.052871239 container died a371f70d09e1e7561dc50984dc9e0e6dda0cb4117f7127452f8d73ab0687cbb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_poitras, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, io.openshift.expose-services=, maintainer=Guillaume Abrioux , vcs-type=git, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main)
Nov 28 04:49:48 localhost systemd[1]: var-lib-containers-storage-overlay-08a6670e3410e869bdac9d9f3cae0d281c091df3c53deae62c2a5de8444cec37-merged.mount: Deactivated successfully.
Nov 28 04:49:48 localhost podman[285428]: 2025-11-28 09:49:48.666648166 +0000 UTC m=+0.092331911 container remove a371f70d09e1e7561dc50984dc9e0e6dda0cb4117f7127452f8d73ab0687cbb8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_poitras, io.openshift.tags=rhceph ceph, release=553, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.12, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., ceph=True, GIT_BRANCH=main, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_CLEAN=True, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 04:49:48 localhost systemd[1]: libpod-conmon-a371f70d09e1e7561dc50984dc9e0e6dda0cb4117f7127452f8d73ab0687cbb8.scope: Deactivated successfully.
Nov 28 04:49:49 localhost nova_compute[279673]: 2025-11-28 09:49:49.213 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:49:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:49:50.827 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 04:49:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:49:50.828 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 04:49:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:49:50.830 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 04:49:52 localhost nova_compute[279673]: 2025-11-28 09:49:52.459 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:49:54 localhost nova_compute[279673]: 2025-11-28 09:49:54.239 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:49:54 localhost systemd[1]: session-62.scope: Deactivated successfully.
Nov 28 04:49:54 localhost systemd[1]: session-62.scope: Consumed 1.310s CPU time.
Nov 28 04:49:54 localhost systemd-logind[764]: Session 62 logged out. Waiting for processes to exit.
Nov 28 04:49:54 localhost systemd-logind[764]: Removed session 62.
Nov 28 04:49:57 localhost nova_compute[279673]: 2025-11-28 09:49:57.461 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:49:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 04:49:57 localhost systemd[1]: tmp-crun.790DBn.mount: Deactivated successfully.
Nov 28 04:49:57 localhost podman[285443]: 2025-11-28 09:49:57.890421696 +0000 UTC m=+0.126780468 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 04:49:57 localhost podman[285443]: 2025-11-28 09:49:57.905506743 +0000 UTC m=+0.141865485 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_id=edpm, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter)
Nov 28 04:49:57 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 04:49:59 localhost nova_compute[279673]: 2025-11-28 09:49:59.241 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.672 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.673 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.678 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd9da108e-d18b-426c-b3d0-ac1b048c79f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:50:00.673433', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '9b2b28b8-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.845327096, 'message_signature': '2f816792d133ab023d05fdc934c8941cce6aaf9687a792c9e4a3c674328a9a85'}]}, 'timestamp': '2025-11-28 09:50:00.679161', '_unique_id': 'a1435dd80b0b417992982e631dae8bf3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.680 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.682 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.682 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '032e9a4e-9ca1-4a8d-a3a4-c79c81e89424', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:50:00.682220', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '9b2bb652-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.845327096, 'message_signature': '776d34daa26623dac303c2b68bfc2d4149390c2a6ece8836f63eab1c04b20ba2'}]}, 'timestamp': '2025-11-28 09:50:00.682694', '_unique_id': 'a70ad2025bca47fcadf77eead58144fa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.683 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.684 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.684 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7bae2cfb-e46c-4931-bb04-e4468bd8b08c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:50:00.684855', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '9b2c1e08-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.845327096, 'message_signature': '6a98c8cd672c34c63e587a07ab47c4761032d7d642a33346825b2de0fe6a3d0c'}]}, 'timestamp': '2025-11-28 09:50:00.685420', '_unique_id': '96068efbf5a14f8581b33afc6f1b3029'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:50:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:50:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.686 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.687 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.701 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.702 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9adbb5b8-2a70-4838-97ee-f45bb9452bfb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:50:00.688188', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9b2ebc62-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.860118054, 'message_signature': 'cdb7cf2a20c852b624dce72308c18d2f1e9f3aa757a8f579d71870b80e4810a4'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:50:00.688188', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9b2ed102-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.860118054, 'message_signature': 'a5a95d08d950459e244d44f5bba03b161935c07e5949e555a22e98312bced0dc'}]}, 'timestamp': '2025-11-28 09:50:00.703050', '_unique_id': '3a224b2921834537a279d47de2695c39'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:50:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.704 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.705 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.705 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.706 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e4da6a30-056c-4912-9f8b-d05ddb669dab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:50:00.705748', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9b2f4cfe-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.860118054, 'message_signature': 'e4da6ea206b8048b4e06ea6b8420067d39d9e68650b5055c5be196a74e712c66'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:50:00.705748', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9b2f5e42-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.860118054, 'message_signature': 'ef952c436d165568875968fb0b2e49646bd75f1a64c765929e1069f8172dd833'}]}, 'timestamp': '2025-11-28 09:50:00.706619', '_unique_id': 'f319fa86d0d042b2b352d636cb32008e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.707 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.708 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.737 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.737 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2989b59a-fa23-4413-9c70-d1a56b7767ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:50:00.708758', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9b342486-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.88067376, 'message_signature': 'b197c5dfe69b6b38e95ef403506598b471fa4d1d50bca9a09e1e21e14a03261a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:50:00.708758', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9b3439b2-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.88067376, 'message_signature': 'f65dc7cf453d5fcc0ac189e2abe1222b58e63786dadcb35a893cb47ea3630a20'}]}, 'timestamp': '2025-11-28 09:50:00.738509', '_unique_id': '08e9e3256f024357bf45dfa92ae6741e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.739 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.740 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.741 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.741 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3cb9c7e2-7279-45e7-8d64-053a37f409ef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:50:00.741184', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9b34b608-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.88067376, 'message_signature': 'd1abed28a311a0e43c0416db225afa555af92289d634eeb865c70a7d90e146f6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:50:00.741184', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9b34c710-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.88067376, 'message_signature': '7a0f1c5f5d8501aab98fd5f864917a6309140ad9ff1b2ff66156b891597f45bb'}]}, 'timestamp': '2025-11-28 09:50:00.742129', '_unique_id': '0488478a8a0340d2bcff00431cd3e51d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:50:00
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:50:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:50:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.743 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.746 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.746 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f865c9d3-7e1c-42ba-8e9e-e1deb08b221d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:50:00.746299', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '9b357c46-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.845327096, 'message_signature': 'fdbae69f4be147b2e9eb07bcda84d2023041b5c47dcb379a69efb749be3ded1c'}]}, 'timestamp': '2025-11-28 09:50:00.746661', '_unique_id': '0d69877709f740a49c068889f61b6c2d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging 
Traceback (most recent call last): Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR 
oslo_messaging.notify.messaging Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:50:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.747 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e09e2ca8-7da6-4eef-be5c-75cbcbac34a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:50:00.748084', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '9b35bfb2-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.845327096, 'message_signature': 'f1c4add585ba9530ffce7a133a6b7b4bbc3454a4a9392b20e71eb89e54c3e441'}]}, 'timestamp': '2025-11-28 09:50:00.748375', '_unique_id': 'b20a20b37309401f970bc5abfacb0f86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:50:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:50:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:50:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.748 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.749 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.749 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 1124743927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.749 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 22218411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b3538317-2a8b-4dcc-9feb-94d130608a79', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1124743927, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:50:00.749663', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9b35fd06-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.88067376, 'message_signature': 'd9c575a505836dc89091b7d0adf5275527be6fdb9c0256baf848b22082281ce7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22218411, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:50:00.749663', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9b3607b0-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.88067376, 'message_signature': 'a1ea0fba68f9cdc02c7309a2aa6e7d852192814f8db514e24830b95cf7a7f8cb'}]}, 'timestamp': '2025-11-28 09:50:00.750200', '_unique_id': 'b43da796be9a47e9beb3c049d74bf66b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:50:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:50:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.750 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.751 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.751 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b96c30c1-428b-445a-9661-d164a94021fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:50:00.751498', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '9b3644d2-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.845327096, 'message_signature': '8e54378e8e63562454a989cb86435c81daafa8bc75c0a4875cf0047eff087af5'}]}, 'timestamp': '2025-11-28 09:50:00.751797', '_unique_id': 'f84b370dc03047ff8f596ac76a109480'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:50:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.753 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.768 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 12160000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '245e5133-7ee7-4da5-9aa3-ff7ae3e920ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12160000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:50:00.753164', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': 
'391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '9b38e0c0-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.94039008, 'message_signature': 'c032ba4c6c8189ceed2efb6dd64abde04021f37a4d85a2fb77755f276878460d'}]}, 'timestamp': '2025-11-28 09:50:00.768874', '_unique_id': '45aadbe7cf9b435e94f998cc7d2dae60'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in 
_connection_factory Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:50:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.769 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.770 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.770 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.770 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4d552264-0826-4061-bddc-29631105a1bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:50:00.770270', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '9b3921f2-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.845327096, 'message_signature': '4c9d94d925167ef5fef4fede70b3291799cde19725083f2712182f773edfbe7d'}]}, 'timestamp': '2025-11-28 09:50:00.770551', '_unique_id': '4a2ae6029e854b32aca7c6ad9c2bca61'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:50:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:50:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.771 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efa843ff-9a81-42a9-9c06-114e5602226f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:50:00.771841', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9b395f5a-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.860118054, 'message_signature': '901c81c2c526b54829317c51d4b867d2198f87adda922cd041864ecc97a67f38'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:50:00.771841', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9b3969c8-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.860118054, 'message_signature': '6b96afb9bfff66001850ed917513daf0f173b855164d796b8a1747e0ad1e5d5d'}]}, 'timestamp': '2025-11-28 09:50:00.772368', '_unique_id': '58de60eed40b41018bae2ae9938fa4ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:50:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.772 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.773 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.773 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.773 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dec807d3-c145-4fd5-91ec-402b01112d7c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:50:00.773652', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9b39a604-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.88067376, 'message_signature': '2e450f28515143a5465be1d9b0888d5409373831a1a25faa7e684d632cdab2bf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:50:00.773652', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9b39b072-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.88067376, 'message_signature': 'ce64b22e9ca99393aef34a23701f330b24d489e8ee55389fd3d35d75c882cc54'}]}, 'timestamp': '2025-11-28 09:50:00.774179', '_unique_id': '3258da311b654fefbc0e5bdb8c9e5e68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.774 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.775 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.775 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c17ae4d-6635-43f8-a3f7-3a3f8f820382', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:50:00.775495', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '9b39eede-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.845327096, 'message_signature': 'e915c487a0a8e0d0d11dd4ab8c6981b1d0745583739a7c893d0fcf0a45817cd4'}]}, 'timestamp': '2025-11-28 09:50:00.775794', '_unique_id': 'a02d24cd17a2452096915e2962ef87c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.776 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 51.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a8e01b17-0e00-4890-a0a7-29ddb7de4d22', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.671875, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:50:00.777084', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '9b3a2c14-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.94039008, 'message_signature': '18e2aef03d11fff9c43004724fc2a5b69d8baa675b4c1d702bc75ff8ff0680ef'}]}, 'timestamp': '2025-11-28 09:50:00.777351', '_unique_id': '2e9dcad99eca43e89c9247829f9f5240'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:50:00 localhost
ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:50:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.777 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.778 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.778 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.778 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '20cec515-6c19-4b6b-9a61-43851a8e28b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:50:00.778714', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '9b3a6bc0-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.845327096, 'message_signature': 'bb834b268a74c5a8740a07a6f9f5ed628a371168647a59c7dd105af7e59d96e3'}]}, 'timestamp': '2025-11-28 09:50:00.779070', '_unique_id': '80ade852dfd6434292da53e3ccf720e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:50:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:50:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.779 12 ERROR oslo_messaging.notify.messaging Nov 28 04:50:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.780 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.780 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c74bdf13-bc1a-4c4f-9242-7641649a4aa1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:50:00.780544', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '9b3ab350-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.845327096, 'message_signature': '6ec176d3e269ba29a96ce237fdb0e5b391be5de5ce8d109ecfd3d50add56a4ae'}]}, 'timestamp': '2025-11-28 09:50:00.780824', '_unique_id': '04255758d50045198cbec5fde94c0a38'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.781 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.782 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1439344424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.782 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 63315481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dff361a4-32db-45d9-87db-d7b86cdb633b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1439344424, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:50:00.782093', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9b3aefaa-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.88067376, 'message_signature': 'b7739264cd3cf6b862dd1130231a123df135170b651c0098e6c3879cb782cf55'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63315481, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:50:00.782093', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9b3af928-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.88067376, 'message_signature': '1ad92ac23166250bf4a60d078d0fffaa166b1cad6f1222d24223485e6778411c'}]}, 'timestamp': '2025-11-28 09:50:00.782593', '_unique_id': '2c276588e8a646c692216772d9c12deb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.783 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '630fef5c-3059-4f1a-a054-4e483fe587b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:50:00.783878', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9b3b356e-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.88067376, 'message_signature': '012c3a74059d0681cd020801c74b709d88edc5a830c64211b7ff7ead3b1fee22'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:50:00.783878', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9b3b3fc8-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11234.88067376, 'message_signature': '60d76ac905c28f19cb09df351f86ef67478bbffd43f1f89af654ca68c5468cdb'}]}, 'timestamp': '2025-11-28 09:50:00.784400', '_unique_id': '4d1a097a7f134730a5b998b4a9306d34'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:50:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:50:00.784 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:50:02 localhost nova_compute[279673]: 2025-11-28 09:50:02.482 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:50:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 04:50:02 localhost podman[285463]: 2025-11-28 09:50:02.848683055 +0000 UTC m=+0.082036102 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Nov 28 04:50:02 localhost podman[285463]: 2025-11-28 09:50:02.860647377 +0000 UTC m=+0.094000464 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Nov 28 04:50:02 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 04:50:04 localhost nova_compute[279673]: 2025-11-28 09:50:04.274 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:50:04 localhost systemd[1]: Stopping User Manager for UID 1003...
Nov 28 04:50:04 localhost systemd[281904]: Activating special unit Exit the Session...
Nov 28 04:50:04 localhost systemd[281904]: Stopped target Main User Target.
Nov 28 04:50:04 localhost systemd[281904]: Stopped target Basic System.
Nov 28 04:50:04 localhost systemd[281904]: Stopped target Paths.
Nov 28 04:50:04 localhost systemd[281904]: Stopped target Sockets.
Nov 28 04:50:04 localhost systemd[281904]: Stopped target Timers.
Nov 28 04:50:04 localhost systemd[281904]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 28 04:50:04 localhost systemd[281904]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 04:50:04 localhost systemd[281904]: Closed D-Bus User Message Bus Socket.
Nov 28 04:50:04 localhost systemd[281904]: Stopped Create User's Volatile Files and Directories.
Nov 28 04:50:04 localhost systemd[281904]: Removed slice User Application Slice.
Nov 28 04:50:04 localhost systemd[281904]: Reached target Shutdown.
Nov 28 04:50:04 localhost systemd[281904]: Finished Exit the Session.
Nov 28 04:50:04 localhost systemd[281904]: Reached target Exit the Session.
Nov 28 04:50:04 localhost systemd[1]: user@1003.service: Deactivated successfully.
Nov 28 04:50:04 localhost systemd[1]: Stopped User Manager for UID 1003.
Nov 28 04:50:04 localhost systemd[1]: Stopping User Runtime Directory /run/user/1003...
Nov 28 04:50:04 localhost systemd[1]: run-user-1003.mount: Deactivated successfully. Nov 28 04:50:04 localhost systemd[1]: user-runtime-dir@1003.service: Deactivated successfully. Nov 28 04:50:04 localhost systemd[1]: Stopped User Runtime Directory /run/user/1003. Nov 28 04:50:04 localhost systemd[1]: Removed slice User Slice of UID 1003. Nov 28 04:50:04 localhost systemd[1]: user-1003.slice: Consumed 1.772s CPU time. Nov 28 04:50:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:50:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:50:06 localhost podman[285486]: 2025-11-28 09:50:06.844094788 +0000 UTC m=+0.077848451 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 28 04:50:06 localhost podman[285487]: 2025-11-28 09:50:06.921231098 +0000 UTC m=+0.151841434 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:50:06 localhost podman[285487]: 2025-11-28 09:50:06.952552128 +0000 UTC m=+0.183162444 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, 
org.label-schema.schema-version=1.0) Nov 28 04:50:06 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:50:06 localhost podman[285486]: 2025-11-28 09:50:06.971847116 +0000 UTC m=+0.205600719 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 28 04:50:06 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 04:50:07 localhost nova_compute[279673]: 2025-11-28 09:50:07.485 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:50:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 04:50:08 localhost podman[285546]: 2025-11-28 09:50:08.462033167 +0000 UTC m=+0.100078531 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Nov 28 04:50:08 localhost podman[285546]: 2025-11-28 09:50:08.472343907 +0000 UTC m=+0.110389291 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute) Nov 28 04:50:08 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. Nov 28 04:50:09 localhost nova_compute[279673]: 2025-11-28 09:50:09.277 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:50:10 localhost podman[238687]: time="2025-11-28T09:50:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 04:50:10 localhost podman[238687]: @ - - [28/Nov/2025:09:50:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149601 "" "Go-http-client/1.1" Nov 28 04:50:10 localhost podman[238687]: @ - - [28/Nov/2025:09:50:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17727 "" "Go-http-client/1.1" Nov 28 04:50:12 localhost nova_compute[279673]: 2025-11-28 09:50:12.518 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:50:14 localhost nova_compute[279673]: 2025-11-28 09:50:14.319 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:50:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. 
Nov 28 04:50:15 localhost podman[285601]: 2025-11-28 09:50:15.858633758 +0000 UTC m=+0.087901314 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 04:50:15 localhost podman[285601]: 2025-11-28 09:50:15.89549027 +0000 UTC m=+0.124757826 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 04:50:15 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. Nov 28 04:50:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. 
Nov 28 04:50:16 localhost podman[285624]: 2025-11-28 09:50:16.850514793 +0000 UTC m=+0.093079774 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, org.label-schema.vendor=CentOS) Nov 28 04:50:16 localhost podman[285624]: 2025-11-28 09:50:16.894407683 +0000 UTC m=+0.136972694 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible) Nov 28 04:50:16 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 04:50:17 localhost nova_compute[279673]: 2025-11-28 09:50:17.522 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:50:18 localhost openstack_network_exporter[240658]: ERROR 09:50:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:50:18 localhost openstack_network_exporter[240658]: ERROR 09:50:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:50:18 localhost openstack_network_exporter[240658]: ERROR 09:50:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 04:50:18 localhost openstack_network_exporter[240658]: ERROR 09:50:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 04:50:18 localhost openstack_network_exporter[240658]: Nov 28 04:50:18 localhost openstack_network_exporter[240658]: ERROR 09:50:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 04:50:18 localhost openstack_network_exporter[240658]: Nov 28 04:50:19 localhost nova_compute[279673]: 2025-11-28 09:50:19.324 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:50:22 localhost nova_compute[279673]: 2025-11-28 09:50:22.563 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:50:24 localhost nova_compute[279673]: 2025-11-28 09:50:24.369 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:50:26 localhost nova_compute[279673]: 2025-11-28 09:50:26.772 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running 
periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:50:26 localhost nova_compute[279673]: 2025-11-28 09:50:26.772 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Nov 28 04:50:26 localhost nova_compute[279673]: 2025-11-28 09:50:26.790 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Nov 28 04:50:26 localhost nova_compute[279673]: 2025-11-28 09:50:26.791 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:50:26 localhost nova_compute[279673]: 2025-11-28 09:50:26.791 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Nov 28 04:50:26 localhost nova_compute[279673]: 2025-11-28 09:50:26.808 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:50:27 localhost nova_compute[279673]: 2025-11-28 09:50:27.567 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:50:27 localhost nova_compute[279673]: 2025-11-28 
09:50:27.816 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:50:27 localhost nova_compute[279673]: 2025-11-28 09:50:27.816 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:50:27 localhost nova_compute[279673]: 2025-11-28 09:50:27.817 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 28 04:50:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 04:50:28 localhost nova_compute[279673]: 2025-11-28 09:50:28.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:50:28 localhost podman[285643]: 2025-11-28 09:50:28.851610374 +0000 UTC m=+0.084448257 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., 
release=1755695350, config_id=edpm, distribution-scope=public, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc.) Nov 28 04:50:28 localhost podman[285643]: 2025-11-28 09:50:28.86696261 +0000 UTC m=+0.099800493 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, vcs-type=git, name=ubi9-minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_id=edpm, vendor=Red Hat, Inc.) Nov 28 04:50:28 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. Nov 28 04:50:29 localhost nova_compute[279673]: 2025-11-28 09:50:29.373 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:50:29 localhost nova_compute[279673]: 2025-11-28 09:50:29.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:50:29 localhost nova_compute[279673]: 2025-11-28 09:50:29.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:50:30 localhost nova_compute[279673]: 2025-11-28 09:50:30.772 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task 
ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:50:31 localhost nova_compute[279673]: 2025-11-28 09:50:31.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:50:31 localhost nova_compute[279673]: 2025-11-28 09:50:31.807 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:50:31 localhost nova_compute[279673]: 2025-11-28 09:50:31.807 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:50:31 localhost nova_compute[279673]: 2025-11-28 09:50:31.807 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:50:31 localhost nova_compute[279673]: 2025-11-28 09:50:31.808 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 04:50:31 localhost 
nova_compute[279673]: 2025-11-28 09:50:31.808 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:50:32 localhost nova_compute[279673]: 2025-11-28 09:50:32.303 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:50:32 localhost nova_compute[279673]: 2025-11-28 09:50:32.376 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:50:32 localhost nova_compute[279673]: 2025-11-28 09:50:32.376 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:50:32 localhost nova_compute[279673]: 2025-11-28 09:50:32.595 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:50:32 localhost nova_compute[279673]: 2025-11-28 09:50:32.608 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 04:50:32 localhost nova_compute[279673]: 2025-11-28 09:50:32.609 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=12259MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": 
"7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 04:50:32 localhost nova_compute[279673]: 2025-11-28 09:50:32.610 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:50:32 localhost nova_compute[279673]: 2025-11-28 09:50:32.610 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:50:32 localhost nova_compute[279673]: 2025-11-28 09:50:32.730 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 04:50:32 localhost nova_compute[279673]: 2025-11-28 09:50:32.731 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 04:50:32 localhost nova_compute[279673]: 2025-11-28 09:50:32.731 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 04:50:32 localhost nova_compute[279673]: 2025-11-28 09:50:32.780 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing inventories for resource provider 35fead26-0bad-4950-b646-987079d58a17 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Nov 28 04:50:32 localhost nova_compute[279673]: 2025-11-28 09:50:32.805 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Updating ProviderTree inventory for provider 35fead26-0bad-4950-b646-987079d58a17 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Nov 28 
04:50:32 localhost nova_compute[279673]: 2025-11-28 09:50:32.805 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Updating inventory in ProviderTree for provider 35fead26-0bad-4950-b646-987079d58a17 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 28 04:50:32 localhost nova_compute[279673]: 2025-11-28 09:50:32.869 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing aggregate associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Nov 28 04:50:32 localhost nova_compute[279673]: 2025-11-28 09:50:32.891 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing trait associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, traits: 
COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_FMA3,HW_CPU_X86_BMI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_ABM,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AMD_SVM,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_CLMUL,HW_CPU_X86_BMI2,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Nov 28 04:50:32 localhost nova_compute[279673]: 2025-11-28 09:50:32.925 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:50:33 localhost nova_compute[279673]: 2025-11-28 09:50:33.393 279685 DEBUG oslo_concurrency.processutils [None 
req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:50:33 localhost nova_compute[279673]: 2025-11-28 09:50:33.399 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 04:50:33 localhost nova_compute[279673]: 2025-11-28 09:50:33.418 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 04:50:33 localhost nova_compute[279673]: 2025-11-28 09:50:33.420 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 04:50:33 localhost nova_compute[279673]: 2025-11-28 09:50:33.421 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.811s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:50:33 localhost sshd[285707]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:50:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 04:50:33 localhost systemd[1]: tmp-crun.Uzc9sY.mount: Deactivated successfully. Nov 28 04:50:33 localhost podman[285709]: 2025-11-28 09:50:33.846905091 +0000 UTC m=+0.085275153 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 04:50:33 localhost podman[285709]: 2025-11-28 09:50:33.855321152 +0000 UTC m=+0.093691224 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': 
True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 04:50:33 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 04:50:34 localhost nova_compute[279673]: 2025-11-28 09:50:34.409 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:50:34 localhost nova_compute[279673]: 2025-11-28 09:50:34.421 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:50:34 localhost nova_compute[279673]: 2025-11-28 09:50:34.421 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 28 04:50:34 localhost nova_compute[279673]: 2025-11-28 09:50:34.421 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 28 04:50:35 localhost nova_compute[279673]: 2025-11-28 09:50:35.281 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock 
"refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 04:50:35 localhost nova_compute[279673]: 2025-11-28 09:50:35.281 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 04:50:35 localhost nova_compute[279673]: 2025-11-28 09:50:35.281 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 28 04:50:35 localhost nova_compute[279673]: 2025-11-28 09:50:35.281 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 04:50:36 localhost nova_compute[279673]: 2025-11-28 09:50:36.208 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": 
{"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 04:50:36 localhost nova_compute[279673]: 2025-11-28 09:50:36.227 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 04:50:36 localhost nova_compute[279673]: 2025-11-28 09:50:36.228 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 28 04:50:36 localhost nova_compute[279673]: 2025-11-28 09:50:36.228 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:50:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:50:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:50:37 localhost systemd[1]: tmp-crun.UbH7NU.mount: Deactivated successfully. 
Nov 28 04:50:37 localhost systemd[1]: tmp-crun.SB8fHW.mount: Deactivated successfully. Nov 28 04:50:37 localhost podman[285751]: 2025-11-28 09:50:37.208321236 +0000 UTC m=+0.132707762 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 28 04:50:37 localhost podman[285752]: 2025-11-28 09:50:37.178241275 +0000 UTC m=+0.102761755 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, 
org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 28 04:50:37 localhost podman[285752]: 2025-11-28 09:50:37.270498473 +0000 UTC m=+0.195018933 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 
'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125) Nov 28 04:50:37 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 04:50:37 localhost podman[285751]: 2025-11-28 09:50:37.312469483 +0000 UTC m=+0.236856059 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller) Nov 28 04:50:37 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. Nov 28 04:50:37 localhost nova_compute[279673]: 2025-11-28 09:50:37.599 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:50:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. 
Nov 28 04:50:38 localhost podman[285810]: 2025-11-28 09:50:38.852196457 +0000 UTC m=+0.082063123 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Nov 28 04:50:38 localhost podman[285810]: 2025-11-28 09:50:38.864377535 +0000 UTC m=+0.094244211 container exec_died 
1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3) Nov 28 04:50:38 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 04:50:39 localhost nova_compute[279673]: 2025-11-28 09:50:39.411 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:50:40 localhost podman[238687]: time="2025-11-28T09:50:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 04:50:40 localhost podman[238687]: @ - - [28/Nov/2025:09:50:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149601 "" "Go-http-client/1.1" Nov 28 04:50:40 localhost podman[238687]: @ - - [28/Nov/2025:09:50:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17724 "" "Go-http-client/1.1" Nov 28 04:50:41 localhost podman[285925]: Nov 28 04:50:41 localhost podman[285925]: 2025-11-28 09:50:41.655973839 +0000 UTC m=+0.075347826 container create 2ac47655fc969d5ce9a9e9bcafeb74ec590f05de71137ce1b5c4d076d0a0dec8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_knuth, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, vcs-type=git, maintainer=Guillaume Abrioux , distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, name=rhceph, io.openshift.expose-services=, ceph=True, release=553, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_CLEAN=True, version=7, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 28 04:50:41 localhost systemd[1]: Started libpod-conmon-2ac47655fc969d5ce9a9e9bcafeb74ec590f05de71137ce1b5c4d076d0a0dec8.scope. Nov 28 04:50:41 localhost podman[285925]: 2025-11-28 09:50:41.623326927 +0000 UTC m=+0.042700944 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:50:41 localhost systemd[1]: Started libcrun container. Nov 28 04:50:41 localhost podman[285925]: 2025-11-28 09:50:41.738528796 +0000 UTC m=+0.157902783 container init 2ac47655fc969d5ce9a9e9bcafeb74ec590f05de71137ce1b5c4d076d0a0dec8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_knuth, maintainer=Guillaume Abrioux , ceph=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, name=rhceph, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7) Nov 28 04:50:41 localhost podman[285925]: 2025-11-28 09:50:41.747107842 +0000 UTC m=+0.166481839 container start 2ac47655fc969d5ce9a9e9bcafeb74ec590f05de71137ce1b5c4d076d0a0dec8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_knuth, 
distribution-scope=public, version=7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, vcs-type=git, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, RELEASE=main, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_BRANCH=main, io.openshift.tags=rhceph ceph) Nov 28 04:50:41 localhost podman[285925]: 2025-11-28 09:50:41.747303078 +0000 UTC m=+0.166677105 container attach 2ac47655fc969d5ce9a9e9bcafeb74ec590f05de71137ce1b5c4d076d0a0dec8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_knuth, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, name=rhceph, io.openshift.tags=rhceph ceph, RELEASE=main, distribution-scope=public, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.openshift.expose-services=, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container) Nov 28 04:50:41 localhost nervous_knuth[285939]: 167 167 Nov 28 04:50:41 localhost systemd[1]: libpod-2ac47655fc969d5ce9a9e9bcafeb74ec590f05de71137ce1b5c4d076d0a0dec8.scope: Deactivated successfully. Nov 28 04:50:41 localhost podman[285925]: 2025-11-28 09:50:41.749392392 +0000 UTC m=+0.168766399 container died 2ac47655fc969d5ce9a9e9bcafeb74ec590f05de71137ce1b5c4d076d0a0dec8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_knuth, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, release=553, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, vendor=Red Hat, Inc., io.openshift.expose-services=, CEPH_POINT_RELEASE=, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, version=7) Nov 28 04:50:41 localhost podman[285944]: 2025-11-28 09:50:41.849671048 +0000 UTC m=+0.088097829 container remove 2ac47655fc969d5ce9a9e9bcafeb74ec590f05de71137ce1b5c4d076d0a0dec8 
(image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_knuth, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, CEPH_POINT_RELEASE=, vcs-type=git, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., version=7, io.openshift.expose-services=, GIT_CLEAN=True, RELEASE=main, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7) Nov 28 04:50:41 localhost systemd[1]: libpod-conmon-2ac47655fc969d5ce9a9e9bcafeb74ec590f05de71137ce1b5c4d076d0a0dec8.scope: Deactivated successfully. Nov 28 04:50:41 localhost systemd[1]: Reloading. Nov 28 04:50:42 localhost systemd-sysv-generator[285989]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:50:42 localhost systemd-rc-local-generator[285983]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 28 04:50:42 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:50:42 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:50:42 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:50:42 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:50:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:50:42 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:50:42 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:50:42 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:50:42 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:50:42 localhost systemd[1]: var-lib-containers-storage-overlay-bcd07bdf35e1bb658fa32b5ad0055ef57cca0afdc31ec71ba07544af3a3513aa-merged.mount: Deactivated successfully. Nov 28 04:50:42 localhost systemd[1]: Reloading. Nov 28 04:50:42 localhost systemd-rc-local-generator[286027]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:50:42 localhost systemd-sysv-generator[286031]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 28 04:50:42 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:50:42 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:50:42 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:50:42 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:50:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:50:42 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:50:42 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:50:42 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:50:42 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:50:42 localhost nova_compute[279673]: 2025-11-28 09:50:42.632 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:50:42 localhost systemd[1]: Starting Ceph mgr.np0005538513.dsfdlx for 2c5417c9-00eb-57d5-a565-ddecbc7995c1... 
Nov 28 04:50:43 localhost podman[286087]: Nov 28 04:50:43 localhost podman[286087]: 2025-11-28 09:50:43.067430111 +0000 UTC m=+0.064390676 container create 76f135805438fecc744b1ef2f588233be779b89d35556f2880623d2347301830 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx, GIT_BRANCH=main, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat Ceph Storage 7, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, version=7, vcs-type=git, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_CLEAN=True) Nov 28 04:50:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6be7c41c7335451eaf2119ac66116e5641f59d6dce63812a7e980786f51387cf/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 28 04:50:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6be7c41c7335451eaf2119ac66116e5641f59d6dce63812a7e980786f51387cf/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Nov 28 04:50:43 localhost kernel: xfs filesystem being remounted at 
/var/lib/containers/storage/overlay/6be7c41c7335451eaf2119ac66116e5641f59d6dce63812a7e980786f51387cf/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 28 04:50:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6be7c41c7335451eaf2119ac66116e5641f59d6dce63812a7e980786f51387cf/merged/var/lib/ceph/mgr/ceph-np0005538513.dsfdlx supports timestamps until 2038 (0x7fffffff) Nov 28 04:50:43 localhost podman[286087]: 2025-11-28 09:50:43.133840378 +0000 UTC m=+0.130800943 container init 76f135805438fecc744b1ef2f588233be779b89d35556f2880623d2347301830 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, release=553, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , name=rhceph, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 28 04:50:43 localhost podman[286087]: 2025-11-28 09:50:43.036942286 +0000 UTC m=+0.033902881 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:50:43 localhost podman[286087]: 2025-11-28 09:50:43.151646529 +0000 UTC 
m=+0.148607084 container start 76f135805438fecc744b1ef2f588233be779b89d35556f2880623d2347301830 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.buildah.version=1.33.12, distribution-scope=public, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vcs-type=git, GIT_CLEAN=True) Nov 28 04:50:43 localhost bash[286087]: 76f135805438fecc744b1ef2f588233be779b89d35556f2880623d2347301830 Nov 28 04:50:43 localhost systemd[1]: Started Ceph mgr.np0005538513.dsfdlx for 2c5417c9-00eb-57d5-a565-ddecbc7995c1. 
Nov 28 04:50:43 localhost ceph-mgr[286105]: set uid:gid to 167:167 (ceph:ceph) Nov 28 04:50:43 localhost ceph-mgr[286105]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mgr, pid 2 Nov 28 04:50:43 localhost ceph-mgr[286105]: pidfile_write: ignore empty --pid-file Nov 28 04:50:43 localhost ceph-mgr[286105]: mgr[py] Loading python module 'alerts' Nov 28 04:50:43 localhost ceph-mgr[286105]: mgr[py] Module alerts has missing NOTIFY_TYPES member Nov 28 04:50:43 localhost ceph-mgr[286105]: mgr[py] Loading python module 'balancer' Nov 28 04:50:43 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:43.324+0000 7fc543976140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member Nov 28 04:50:43 localhost ceph-mgr[286105]: mgr[py] Module balancer has missing NOTIFY_TYPES member Nov 28 04:50:43 localhost ceph-mgr[286105]: mgr[py] Loading python module 'cephadm' Nov 28 04:50:43 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:43.393+0000 7fc543976140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member Nov 28 04:50:43 localhost systemd[1]: tmp-crun.fC1J0g.mount: Deactivated successfully. 
Nov 28 04:50:43 localhost ceph-mgr[286105]: mgr[py] Loading python module 'crash' Nov 28 04:50:44 localhost ceph-mgr[286105]: mgr[py] Module crash has missing NOTIFY_TYPES member Nov 28 04:50:44 localhost ceph-mgr[286105]: mgr[py] Loading python module 'dashboard' Nov 28 04:50:44 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:44.026+0000 7fc543976140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member Nov 28 04:50:44 localhost nova_compute[279673]: 2025-11-28 09:50:44.445 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:50:44 localhost ceph-mgr[286105]: mgr[py] Loading python module 'devicehealth' Nov 28 04:50:44 localhost ceph-mgr[286105]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member Nov 28 04:50:44 localhost ceph-mgr[286105]: mgr[py] Loading python module 'diskprediction_local' Nov 28 04:50:44 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:44.653+0000 7fc543976140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member Nov 28 04:50:44 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode. Nov 28 04:50:44 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve. 
Nov 28 04:50:44 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: from numpy import show_config as show_numpy_config Nov 28 04:50:44 localhost ceph-mgr[286105]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member Nov 28 04:50:44 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:44.801+0000 7fc543976140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member Nov 28 04:50:44 localhost ceph-mgr[286105]: mgr[py] Loading python module 'influx' Nov 28 04:50:44 localhost ceph-mgr[286105]: mgr[py] Module influx has missing NOTIFY_TYPES member Nov 28 04:50:44 localhost ceph-mgr[286105]: mgr[py] Loading python module 'insights' Nov 28 04:50:44 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:44.867+0000 7fc543976140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member Nov 28 04:50:44 localhost ceph-mgr[286105]: mgr[py] Loading python module 'iostat' Nov 28 04:50:45 localhost ceph-mgr[286105]: mgr[py] Module iostat has missing NOTIFY_TYPES member Nov 28 04:50:45 localhost ceph-mgr[286105]: mgr[py] Loading python module 'k8sevents' Nov 28 04:50:45 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:45.002+0000 7fc543976140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member Nov 28 04:50:45 localhost ceph-mgr[286105]: mgr[py] Loading python module 'localpool' Nov 28 04:50:45 localhost ceph-mgr[286105]: mgr[py] Loading python module 'mds_autoscaler' Nov 28 04:50:45 localhost ceph-mgr[286105]: mgr[py] Loading python module 'mirroring' Nov 28 04:50:45 localhost ceph-mgr[286105]: mgr[py] Loading python module 'nfs' Nov 28 04:50:45 localhost ceph-mgr[286105]: mgr[py] Module nfs has missing NOTIFY_TYPES member Nov 28 04:50:45 localhost ceph-mgr[286105]: mgr[py] Loading python module 'orchestrator' Nov 28 04:50:45 localhost 
ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:45.796+0000 7fc543976140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member Nov 28 04:50:45 localhost ceph-mgr[286105]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member Nov 28 04:50:45 localhost ceph-mgr[286105]: mgr[py] Loading python module 'osd_perf_query' Nov 28 04:50:45 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:45.945+0000 7fc543976140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member Nov 28 04:50:46 localhost ceph-mgr[286105]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member Nov 28 04:50:46 localhost ceph-mgr[286105]: mgr[py] Loading python module 'osd_support' Nov 28 04:50:46 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:46.011+0000 7fc543976140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member Nov 28 04:50:46 localhost ceph-mgr[286105]: mgr[py] Module osd_support has missing NOTIFY_TYPES member Nov 28 04:50:46 localhost ceph-mgr[286105]: mgr[py] Loading python module 'pg_autoscaler' Nov 28 04:50:46 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:46.068+0000 7fc543976140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member Nov 28 04:50:46 localhost ceph-mgr[286105]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member Nov 28 04:50:46 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:46.139+0000 7fc543976140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member Nov 28 04:50:46 localhost ceph-mgr[286105]: mgr[py] Loading python module 'progress' Nov 28 04:50:46 localhost ceph-mgr[286105]: mgr[py] Module progress has missing NOTIFY_TYPES member Nov 28 04:50:46 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:46.197+0000 
7fc543976140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member Nov 28 04:50:46 localhost ceph-mgr[286105]: mgr[py] Loading python module 'prometheus' Nov 28 04:50:46 localhost ceph-mgr[286105]: mgr[py] Module prometheus has missing NOTIFY_TYPES member Nov 28 04:50:46 localhost ceph-mgr[286105]: mgr[py] Loading python module 'rbd_support' Nov 28 04:50:46 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:46.490+0000 7fc543976140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member Nov 28 04:50:46 localhost ceph-mgr[286105]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member Nov 28 04:50:46 localhost ceph-mgr[286105]: mgr[py] Loading python module 'restful' Nov 28 04:50:46 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:46.574+0000 7fc543976140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member Nov 28 04:50:46 localhost ceph-mgr[286105]: mgr[py] Loading python module 'rgw' Nov 28 04:50:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. 
Nov 28 04:50:46 localhost podman[286135]: 2025-11-28 09:50:46.84155068 +0000 UTC m=+0.077327546 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 04:50:46 localhost podman[286135]: 2025-11-28 09:50:46.852384446 +0000 UTC m=+0.088161282 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 
'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 04:50:46 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 04:50:46 localhost ceph-mgr[286105]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 28 04:50:46 localhost ceph-mgr[286105]: mgr[py] Loading python module 'rook'
Nov 28 04:50:46 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:46.898+0000 7fc543976140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 28 04:50:47 localhost ceph-mgr[286105]: mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 28 04:50:47 localhost ceph-mgr[286105]: mgr[py] Loading python module 'selftest'
Nov 28 04:50:47 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:47.331+0000 7fc543976140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 28 04:50:47 localhost ceph-mgr[286105]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 28 04:50:47 localhost ceph-mgr[286105]: mgr[py] Loading python module 'snap_schedule'
Nov 28 04:50:47 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:47.390+0000 7fc543976140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 28 04:50:47 localhost ceph-mgr[286105]: mgr[py] Loading python module 'stats'
Nov 28 04:50:47 localhost ceph-mgr[286105]: mgr[py] Loading python module 'status'
Nov 28 04:50:47 localhost ceph-mgr[286105]: mgr[py] Module status has missing NOTIFY_TYPES member
Nov 28 04:50:47 localhost ceph-mgr[286105]: mgr[py] Loading python module 'telegraf'
Nov 28 04:50:47 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:47.582+0000 7fc543976140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Nov 28 04:50:47 localhost nova_compute[279673]: 2025-11-28 09:50:47.636 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:50:47 localhost ceph-mgr[286105]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 28 04:50:47 localhost ceph-mgr[286105]: mgr[py] Loading python module 'telemetry'
Nov 28 04:50:47 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:47.642+0000 7fc543976140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 28 04:50:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 04:50:47 localhost ceph-mgr[286105]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 28 04:50:47 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:47.779+0000 7fc543976140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 28 04:50:47 localhost ceph-mgr[286105]: mgr[py] Loading python module 'test_orchestrator'
Nov 28 04:50:47 localhost podman[286158]: 2025-11-28 09:50:47.846866191 +0000 UTC m=+0.083436576 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 28 04:50:47 localhost podman[286158]: 2025-11-28 09:50:47.862398513 +0000 UTC m=+0.098968898 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 04:50:47 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 04:50:47 localhost ceph-mgr[286105]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 28 04:50:47 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:47.931+0000 7fc543976140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 28 04:50:47 localhost ceph-mgr[286105]: mgr[py] Loading python module 'volumes'
Nov 28 04:50:48 localhost openstack_network_exporter[240658]: ERROR 09:50:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 04:50:48 localhost openstack_network_exporter[240658]: ERROR 09:50:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 04:50:48 localhost openstack_network_exporter[240658]: ERROR 09:50:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 04:50:48 localhost openstack_network_exporter[240658]: ERROR 09:50:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 04:50:48 localhost openstack_network_exporter[240658]:
Nov 28 04:50:48 localhost openstack_network_exporter[240658]: ERROR 09:50:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 04:50:48 localhost 
openstack_network_exporter[240658]:
Nov 28 04:50:48 localhost ceph-mgr[286105]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 28 04:50:48 localhost ceph-mgr[286105]: mgr[py] Loading python module 'zabbix'
Nov 28 04:50:48 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:48.131+0000 7fc543976140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 28 04:50:48 localhost ceph-mgr[286105]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 28 04:50:48 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:50:48.192+0000 7fc543976140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 28 04:50:48 localhost ceph-mgr[286105]: ms_deliver_dispatch: unhandled message 0x560b917b91e0 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0
Nov 28 04:50:48 localhost ceph-mgr[286105]: client.0 ms_handle_reset on v2:172.18.0.103:6800/705940825
Nov 28 04:50:49 localhost podman[286303]: 2025-11-28 09:50:49.063700954 +0000 UTC m=+0.095959803 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , version=7, release=553, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, name=rhceph, com.redhat.component=rhceph-container, io.buildah.version=1.33.12) Nov 28 04:50:49 localhost podman[286303]: 2025-11-28 09:50:49.233601948 +0000 UTC m=+0.265860787 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., GIT_CLEAN=True, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, RELEASE=main, architecture=x86_64, GIT_BRANCH=main, vcs-type=git, distribution-scope=public, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 28 04:50:49 localhost nova_compute[279673]: 2025-11-28 09:50:49.448 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:50:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:50:50.828 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock 
"_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 04:50:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:50:50.828 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 04:50:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:50:50.830 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 04:50:52 localhost nova_compute[279673]: 2025-11-28 09:50:52.677 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:50:54 localhost nova_compute[279673]: 2025-11-28 09:50:54.485 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:50:57 localhost nova_compute[279673]: 2025-11-28 09:50:57.677 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:50:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 04:50:59 localhost nova_compute[279673]: 2025-11-28 09:50:59.488 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:50:59 localhost podman[287203]: 2025-11-28 09:50:59.547287611 +0000 UTC m=+0.100791783 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-type=git, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, build-date=2025-08-20T13:12:41, version=9.6, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.) 
Nov 28 04:50:59 localhost podman[287203]: 2025-11-28 09:50:59.592345627 +0000 UTC m=+0.145849789 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Nov 28 04:50:59 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 04:51:01 localhost sshd[287222]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 04:51:02 localhost nova_compute[279673]: 2025-11-28 09:51:02.711 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:51:04 localhost nova_compute[279673]: 2025-11-28 09:51:04.511 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:51:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 04:51:04 localhost podman[287224]: 2025-11-28 09:51:04.842359993 +0000 UTC m=+0.080454003 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 28 04:51:04 localhost podman[287224]: 2025-11-28 09:51:04.854396036 +0000 UTC m=+0.092490026 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 04:51:04 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 04:51:05 localhost ceph-mgr[286105]: ms_deliver_dispatch: unhandled message 0x560b917b91e0 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0
Nov 28 04:51:07 localhost nova_compute[279673]: 2025-11-28 09:51:07.717 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:51:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 04:51:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 04:51:07 localhost systemd[1]: tmp-crun.CwGzte.mount: Deactivated successfully.
Nov 28 04:51:07 localhost podman[287248]: 2025-11-28 09:51:07.864877669 +0000 UTC m=+0.097606284 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS) Nov 28 04:51:07 localhost podman[287248]: 2025-11-28 09:51:07.897328124 +0000 UTC 
m=+0.130056799 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 28 04:51:07 localhost systemd[1]: tmp-crun.wqgQVX.mount: Deactivated successfully. 
Nov 28 04:51:07 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:51:07 localhost podman[287247]: 2025-11-28 09:51:07.915355163 +0000 UTC m=+0.152819975 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:51:07 localhost podman[287247]: 2025-11-28 09:51:07.952434561 +0000 UTC m=+0.189899373 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Nov 28 04:51:07 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. Nov 28 04:51:09 localhost nova_compute[279673]: 2025-11-28 09:51:09.514 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:51:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. 
Nov 28 04:51:09 localhost podman[287290]: 2025-11-28 09:51:09.852680085 +0000 UTC m=+0.090420182 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:51:09 localhost podman[287290]: 2025-11-28 09:51:09.867421792 +0000 UTC m=+0.105161939 container exec_died 
1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, io.buildah.version=1.41.3) Nov 28 04:51:09 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 04:51:10 localhost podman[238687]: time="2025-11-28T09:51:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 04:51:10 localhost podman[238687]: @ - - [28/Nov/2025:09:51:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 151667 "" "Go-http-client/1.1" Nov 28 04:51:10 localhost podman[238687]: @ - - [28/Nov/2025:09:51:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18217 "" "Go-http-client/1.1" Nov 28 04:51:11 localhost podman[287387]: Nov 28 04:51:11 localhost podman[287387]: 2025-11-28 09:51:11.749063668 +0000 UTC m=+0.083889279 container create 2b7ebea883864fbecf1719db677961821c9d8f6c97a9d49eac32a21a081e4a98 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_shannon, GIT_CLEAN=True, com.redhat.component=rhceph-container, release=553, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , RELEASE=main, ceph=True, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12) Nov 28 04:51:11 localhost systemd[1]: Started 
libpod-conmon-2b7ebea883864fbecf1719db677961821c9d8f6c97a9d49eac32a21a081e4a98.scope. Nov 28 04:51:11 localhost podman[287387]: 2025-11-28 09:51:11.71392014 +0000 UTC m=+0.048745811 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:51:11 localhost systemd[1]: Started libcrun container. Nov 28 04:51:11 localhost podman[287387]: 2025-11-28 09:51:11.829998116 +0000 UTC m=+0.164823777 container init 2b7ebea883864fbecf1719db677961821c9d8f6c97a9d49eac32a21a081e4a98 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_shannon, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, GIT_CLEAN=True, GIT_BRANCH=main, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , RELEASE=main, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, vcs-type=git, ceph=True, version=7, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, architecture=x86_64) Nov 28 04:51:11 localhost systemd[1]: tmp-crun.l9r6y6.mount: Deactivated successfully. 
Nov 28 04:51:11 localhost podman[287387]: 2025-11-28 09:51:11.844051841 +0000 UTC m=+0.178877452 container start 2b7ebea883864fbecf1719db677961821c9d8f6c97a9d49eac32a21a081e4a98 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_shannon, name=rhceph, maintainer=Guillaume Abrioux , distribution-scope=public, version=7, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, release=553, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12) Nov 28 04:51:11 localhost podman[287387]: 2025-11-28 09:51:11.844404132 +0000 UTC m=+0.179229753 container attach 2b7ebea883864fbecf1719db677961821c9d8f6c97a9d49eac32a21a081e4a98 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_shannon, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, io.buildah.version=1.33.12, vcs-type=git, release=553, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, 
com.redhat.component=rhceph-container, architecture=x86_64, GIT_CLEAN=True, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, distribution-scope=public, name=rhceph) Nov 28 04:51:11 localhost infallible_shannon[287402]: 167 167 Nov 28 04:51:11 localhost systemd[1]: libpod-2b7ebea883864fbecf1719db677961821c9d8f6c97a9d49eac32a21a081e4a98.scope: Deactivated successfully. Nov 28 04:51:11 localhost podman[287387]: 2025-11-28 09:51:11.848673574 +0000 UTC m=+0.183499215 container died 2b7ebea883864fbecf1719db677961821c9d8f6c97a9d49eac32a21a081e4a98 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_shannon, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, release=553, GIT_BRANCH=main, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.buildah.version=1.33.12, RELEASE=main, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, 
name=rhceph, ceph=True, GIT_CLEAN=True) Nov 28 04:51:11 localhost podman[287407]: 2025-11-28 09:51:11.962438648 +0000 UTC m=+0.099262925 container remove 2b7ebea883864fbecf1719db677961821c9d8f6c97a9d49eac32a21a081e4a98 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_shannon, vcs-type=git, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, architecture=x86_64, CEPH_POINT_RELEASE=, release=553, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, GIT_BRANCH=main, ceph=True, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph) Nov 28 04:51:11 localhost systemd[1]: libpod-conmon-2b7ebea883864fbecf1719db677961821c9d8f6c97a9d49eac32a21a081e4a98.scope: Deactivated successfully. 
Nov 28 04:51:12 localhost podman[287426]: Nov 28 04:51:12 localhost podman[287426]: 2025-11-28 09:51:12.076478141 +0000 UTC m=+0.082042742 container create d9b8dcc9bd3c0f16abb7eaa0d0cd5d168d0da0d0c609dbade0773bc8e8869934 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_knuth, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, RELEASE=main, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_CLEAN=True, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, maintainer=Guillaume Abrioux , architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=) Nov 28 04:51:12 localhost systemd[1]: Started libpod-conmon-d9b8dcc9bd3c0f16abb7eaa0d0cd5d168d0da0d0c609dbade0773bc8e8869934.scope. Nov 28 04:51:12 localhost systemd[1]: Started libcrun container. 
Nov 28 04:51:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86b20cbfc849af5301c83db64e4cb36a417f447d9fd364d088495788c7efb629/merged/tmp/config supports timestamps until 2038 (0x7fffffff) Nov 28 04:51:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86b20cbfc849af5301c83db64e4cb36a417f447d9fd364d088495788c7efb629/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff) Nov 28 04:51:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86b20cbfc849af5301c83db64e4cb36a417f447d9fd364d088495788c7efb629/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 28 04:51:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/86b20cbfc849af5301c83db64e4cb36a417f447d9fd364d088495788c7efb629/merged/var/lib/ceph/mon/ceph-np0005538513 supports timestamps until 2038 (0x7fffffff) Nov 28 04:51:12 localhost podman[287426]: 2025-11-28 09:51:12.041102086 +0000 UTC m=+0.046666727 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:51:12 localhost podman[287426]: 2025-11-28 09:51:12.140089412 +0000 UTC m=+0.145654003 container init d9b8dcc9bd3c0f16abb7eaa0d0cd5d168d0da0d0c609dbade0773bc8e8869934 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_knuth, ceph=True, maintainer=Guillaume Abrioux , GIT_CLEAN=True, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, 
GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, architecture=x86_64) Nov 28 04:51:12 localhost podman[287426]: 2025-11-28 09:51:12.149548354 +0000 UTC m=+0.155112955 container start d9b8dcc9bd3c0f16abb7eaa0d0cd5d168d0da0d0c609dbade0773bc8e8869934 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_knuth, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, GIT_BRANCH=main, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, ceph=True) Nov 28 04:51:12 localhost podman[287426]: 2025-11-28 09:51:12.149867364 +0000 UTC m=+0.155431995 container attach d9b8dcc9bd3c0f16abb7eaa0d0cd5d168d0da0d0c609dbade0773bc8e8869934 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_knuth, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and 
supported base image., name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , RELEASE=main, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, vcs-type=git, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.expose-services=) Nov 28 04:51:12 localhost systemd[1]: libpod-d9b8dcc9bd3c0f16abb7eaa0d0cd5d168d0da0d0c609dbade0773bc8e8869934.scope: Deactivated successfully. 
Nov 28 04:51:12 localhost podman[287426]: 2025-11-28 09:51:12.253835114 +0000 UTC m=+0.259399755 container died d9b8dcc9bd3c0f16abb7eaa0d0cd5d168d0da0d0c609dbade0773bc8e8869934 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_knuth, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, name=rhceph, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, io.buildah.version=1.33.12, ceph=True, io.openshift.tags=rhceph ceph, RELEASE=main, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, GIT_CLEAN=True) Nov 28 04:51:12 localhost ceph-mgr[286105]: ms_deliver_dispatch: unhandled message 0x560b917b8f20 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0 Nov 28 04:51:12 localhost podman[287467]: 2025-11-28 09:51:12.351095657 +0000 UTC m=+0.085006183 container remove d9b8dcc9bd3c0f16abb7eaa0d0cd5d168d0da0d0c609dbade0773bc8e8869934 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_knuth, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, vcs-type=git, CEPH_POINT_RELEASE=, release=553, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat 
Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, name=rhceph, architecture=x86_64, com.redhat.component=rhceph-container, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_CLEAN=True, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , distribution-scope=public, RELEASE=main, vendor=Red Hat, Inc., version=7) Nov 28 04:51:12 localhost systemd[1]: libpod-conmon-d9b8dcc9bd3c0f16abb7eaa0d0cd5d168d0da0d0c609dbade0773bc8e8869934.scope: Deactivated successfully. Nov 28 04:51:12 localhost systemd[1]: Reloading. Nov 28 04:51:12 localhost systemd-rc-local-generator[287504]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:51:12 localhost systemd-sysv-generator[287509]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 28 04:51:12 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:51:12 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:51:12 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:51:12 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:51:12 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:51:12 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:51:12 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:51:12 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:51:12 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:51:12 localhost systemd[1]: var-lib-containers-storage-overlay-06e86172f19db509740997f9d4331839abf2fe8a6e515afd8687873657279d09-merged.mount: Deactivated successfully. Nov 28 04:51:12 localhost nova_compute[279673]: 2025-11-28 09:51:12.746 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:51:12 localhost systemd[1]: Reloading. Nov 28 04:51:12 localhost systemd-sysv-generator[287553]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:51:12 localhost systemd-rc-local-generator[287548]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:51:12 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:51:12 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:51:12 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:51:12 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:51:12 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:51:12 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:51:12 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:51:12 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:51:12 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:51:13 localhost systemd[1]: Starting Ceph mon.np0005538513 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1... 
Nov 28 04:51:13 localhost podman[287610]: Nov 28 04:51:13 localhost podman[287610]: 2025-11-28 09:51:13.52043683 +0000 UTC m=+0.085051386 container create b72bfc005269ade02879c161a498dd471b1542ee13d035dbcf0188cb36c61613 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mon-np0005538513, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, distribution-scope=public, com.redhat.component=rhceph-container, vcs-type=git, GIT_BRANCH=main, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, version=7, vendor=Red Hat, Inc., name=rhceph, ceph=True) Nov 28 04:51:13 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa77a2e7ddc389a4da392f7d457efdd2cb70596fdbafdc2c079e625848050438/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Nov 28 04:51:13 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa77a2e7ddc389a4da392f7d457efdd2cb70596fdbafdc2c079e625848050438/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 28 04:51:13 localhost kernel: xfs filesystem being remounted at 
/var/lib/containers/storage/overlay/aa77a2e7ddc389a4da392f7d457efdd2cb70596fdbafdc2c079e625848050438/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 28 04:51:13 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa77a2e7ddc389a4da392f7d457efdd2cb70596fdbafdc2c079e625848050438/merged/var/lib/ceph/mon/ceph-np0005538513 supports timestamps until 2038 (0x7fffffff) Nov 28 04:51:13 localhost podman[287610]: 2025-11-28 09:51:13.581509181 +0000 UTC m=+0.146123737 container init b72bfc005269ade02879c161a498dd471b1542ee13d035dbcf0188cb36c61613 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mon-np0005538513, vcs-type=git, io.buildah.version=1.33.12, release=553, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, build-date=2025-09-24T08:57:55, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.openshift.expose-services=) Nov 28 04:51:13 localhost podman[287610]: 2025-11-28 09:51:13.487976744 +0000 UTC m=+0.052591340 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:51:13 localhost podman[287610]: 2025-11-28 09:51:13.594101152 +0000 UTC m=+0.158715718 
container start b72bfc005269ade02879c161a498dd471b1542ee13d035dbcf0188cb36c61613 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mon-np0005538513, io.buildah.version=1.33.12, release=553, distribution-scope=public, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, version=7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux ) Nov 28 04:51:13 localhost bash[287610]: b72bfc005269ade02879c161a498dd471b1542ee13d035dbcf0188cb36c61613 Nov 28 04:51:13 localhost systemd[1]: Started Ceph mon.np0005538513 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1. 
Nov 28 04:51:13 localhost ceph-mon[287629]: set uid:gid to 167:167 (ceph:ceph)
Nov 28 04:51:13 localhost ceph-mon[287629]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mon, pid 2
Nov 28 04:51:13 localhost ceph-mon[287629]: pidfile_write: ignore empty --pid-file
Nov 28 04:51:13 localhost ceph-mon[287629]: load: jerasure load: lrc
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: RocksDB version: 7.9.2
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Git sha 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Compile date 2025-09-23 00:00:00
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: DB SUMMARY
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: DB Session ID: ND9860OIBZS6OJ35KIN7
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: CURRENT file: CURRENT
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: IDENTITY file: IDENTITY
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: MANIFEST file: MANIFEST-000005 size: 59 Bytes
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005538513/store.db dir, Total Num: 0, files:
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005538513/store.db: 000004.log size: 886 ;
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.error_if_exists: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.create_if_missing: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.paranoid_checks: 1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.flush_verify_memtable_count: 1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.track_and_verify_wals_in_manifest: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.env: 0x556f56dc39e0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.fs: PosixFileSystem
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.info_log: 0x556f59286d20
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.max_file_opening_threads: 16
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.statistics: (nil)
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.use_fsync: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.max_log_file_size: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.max_manifest_file_size: 1073741824
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.log_file_time_to_roll: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.keep_log_file_num: 1000
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.recycle_log_file_num: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.allow_fallocate: 1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.allow_mmap_reads: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.allow_mmap_writes: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.use_direct_reads: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.create_missing_column_families: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.db_log_dir:
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.wal_dir:
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.table_cache_numshardbits: 6
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.WAL_ttl_seconds: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.WAL_size_limit_MB: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.manifest_preallocation_size: 4194304
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.is_fd_close_on_exec: 1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.advise_random_on_open: 1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.db_write_buffer_size: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.write_buffer_manager: 0x556f59297540
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.access_hint_on_compaction_start: 1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.random_access_max_buffer_size: 1048576
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.use_adaptive_mutex: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.rate_limiter: (nil)
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.wal_recovery_mode: 2
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.enable_thread_tracking: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.enable_pipelined_write: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.unordered_write: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.allow_concurrent_memtable_write: 1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.enable_write_thread_adaptive_yield: 1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.write_thread_max_yield_usec: 100
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.write_thread_slow_yield_usec: 3
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.row_cache: None
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.wal_filter: None
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.avoid_flush_during_recovery: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.allow_ingest_behind: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.two_write_queues: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.manual_wal_flush: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.wal_compression: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.atomic_flush: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.avoid_unnecessary_blocking_io: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.persist_stats_to_disk: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.write_dbid_to_manifest: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.log_readahead_size: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.file_checksum_gen_factory: Unknown
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.best_efforts_recovery: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.max_bgerror_resume_count: 2147483647
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.bgerror_resume_retry_interval: 1000000
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.allow_data_in_errors: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.db_host_id: __hostname__
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.enforce_single_del_contracts: true
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.max_background_jobs: 2
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.max_background_compactions: -1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.max_subcompactions: 1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.avoid_flush_during_shutdown: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.writable_file_max_buffer_size: 1048576
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.delayed_write_rate : 16777216
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.max_total_wal_size: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.stats_dump_period_sec: 600
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.stats_persist_period_sec: 600
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.stats_history_buffer_size: 1048576
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.max_open_files: -1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.bytes_per_sync: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.wal_bytes_per_sync: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.strict_bytes_per_sync: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.compaction_readahead_size: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.max_background_flushes: -1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Compression algorithms supported:
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: #011kZSTD supported: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: #011kXpressCompression supported: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: #011kBZip2Compression supported: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: #011kLZ4Compression supported: 1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: #011kZlibCompression supported: 1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: #011kLZ4HCCompression supported: 1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: #011kSnappyCompression supported: 1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005538513/store.db/MANIFEST-000005
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.merge_operator:
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.compaction_filter: None
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.compaction_filter_factory: None
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.sst_partitioner_factory: None
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.memtable_factory: SkipListFactory
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.table_factory: BlockBasedTable
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x556f59286980)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x556f59283350#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.write_buffer_size: 33554432
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.max_write_buffer_number: 2
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.compression: NoCompression
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.bottommost_compression: Disabled
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.prefix_extractor: nullptr
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.num_levels: 7
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.min_write_buffer_number_to_merge: 1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.bottommost_compression_opts.level: 32767
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.bottommost_compression_opts.enabled: false
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.compression_opts.window_bits: -14
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.compression_opts.level: 32767
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.compression_opts.strategy: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.compression_opts.parallel_threads: 1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.compression_opts.enabled: false
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.level0_file_num_compaction_trigger: 4
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.level0_stop_writes_trigger: 36
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.target_file_size_base: 67108864
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.target_file_size_multiplier: 1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.max_bytes_for_level_base: 268435456
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.max_compaction_bytes: 1677721600
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.arena_block_size: 1048576
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.disable_auto_compactions: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.table_properties_collectors:
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.inplace_update_support: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.inplace_update_num_locks: 10000
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.memtable_whole_key_filtering: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.memtable_huge_page_size: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.bloom_locality: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.max_successive_merges: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.optimize_filters_for_hits: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.paranoid_file_checks: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.force_consistency_checks: 1
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.report_bg_io_stats: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.ttl: 2592000
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.periodic_compaction_seconds: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.preclude_last_level_data_seconds: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.preserve_internal_time_seconds: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.enable_blob_files: false
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.min_blob_size: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.blob_file_size: 268435456
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.blob_compression_type: NoCompression
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.enable_blob_garbage_collection: false
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.blob_compaction_readahead_size: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.blob_file_starting_level: 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005538513/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 53876326-3f1b-4342-b386-ddefe9bbd825
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323473641079, "job": 1, "event": "recovery_started", "wal_files": [4]}
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323473643656, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 2012, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 898, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 776, "raw_average_value_size": 155, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323473, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "53876326-3f1b-4342-b386-ddefe9bbd825", "db_session_id": "ND9860OIBZS6OJ35KIN7", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323473643813, "job": 1, "event": "recovery_finished"}
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x556f592aae00
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: DB pointer 0x556f593a0000
Nov 28 04:51:13 localhost ceph-mon[287629]: mon.np0005538513 does not exist in monmap, will attempt to join an existing cluster
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 04:51:13 localhost ceph-mon[287629]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 1/0 1.96 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.8 0.00 0.00 1 0.002 0 0 0.0 0.0#012 Sum 1/0 1.96 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.8 0.00 0.00 1 0.002 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.8 0.00 0.00 1 0.002 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.8 0.00 0.00 1 0.002 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x556f59283350#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 2.8e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Nov 28 04:51:13 localhost ceph-mon[287629]: using public_addr v2:172.18.0.106:0/0 -> [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0]
Nov 28 04:51:13 localhost ceph-mon[287629]: starting mon.np0005538513 rank -1 at public addrs [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] at bind addrs [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005538513 fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 04:51:13 localhost ceph-mon[287629]: mon.np0005538513@-1(???) e0 preinit fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 04:51:13 localhost ceph-mon[287629]: mon.np0005538513@-1(synchronizing) e5 sync_obtain_latest_monmap
Nov 28 04:51:13 localhost ceph-mon[287629]: mon.np0005538513@-1(synchronizing) e5 sync_obtain_latest_monmap obtained monmap e5
Nov 28 04:51:13 localhost ceph-mon[287629]: mon.np0005538513@-1(synchronizing).mds e17 new map
Nov 28 04:51:13 localhost ceph-mon[287629]: mon.np0005538513@-1(synchronizing).mds e17 print_map#012e17#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#01115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-11-28T08:07:30.958224+0000#012modified#0112025-11-28T09:49:53.259185+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#01183#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=26449}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[6]#012metadata_pool#0117#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 26449 members: 26449#012[mds.mds.np0005538514.umgtoy{0:26449} state up:active seq 12 addr [v2:172.18.0.107:6808/1969410151,v1:172.18.0.107:6809/1969410151] compat {c=[1],r=[1],i=[17ff]}]#012 #012 #012Standby daemons:#012 #012[mds.mds.np0005538513.yljthc{-1:16968} state up:standby seq 1 addr [v2:172.18.0.106:6808/2782735008,v1:172.18.0.106:6809/2782735008] compat {c=[1],r=[1],i=[17ff]}]#012[mds.mds.np0005538515.anvatb{-1:26446} state up:standby seq 1 addr [v2:172.18.0.108:6808/2640180,v1:172.18.0.108:6809/2640180] compat {c=[1],r=[1],i=[17ff]}]
Nov 28 04:51:13 localhost ceph-mon[287629]: mon.np0005538513@-1(synchronizing).osd e85 crush map has features 3314933000852226048, adjusting msgr requires
Nov 28 04:51:13 localhost ceph-mon[287629]: mon.np0005538513@-1(synchronizing).osd e85 crush map has features 288514051259236352, adjusting msgr requires
Nov 28 04:51:13 localhost ceph-mon[287629]: mon.np0005538513@-1(synchronizing).osd e85 crush map has features 288514051259236352, adjusting msgr requires
Nov 28 04:51:13 localhost ceph-mon[287629]: mon.np0005538513@-1(synchronizing).osd e85 crush map has features 288514051259236352, adjusting msgr requires
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: Added label mgr to host np0005538513.localdomain
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: Added label mgr to host np0005538514.localdomain
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: Added label mgr to host np0005538515.localdomain
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Nov 28 04:51:13 localhost ceph-mon[287629]: Saving service mgr spec with placement label:mgr
Nov 28 04:51:13 localhost ceph-mon[287629]: Deploying daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Nov 28 04:51:13 localhost ceph-mon[287629]: Deploying daemon mgr.np0005538514.djozup on np0005538514.localdomain
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: Added label mon to host np0005538510.localdomain
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 04:51:13 localhost ceph-mon[287629]: Added label _admin to host np0005538510.localdomain
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Nov 28 04:51:13 localhost ceph-mon[287629]: Deploying daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: Added label mon to host np0005538511.localdomain
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: Added label _admin to host np0005538511.localdomain
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: Added label mon to host np0005538512.localdomain
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: Added label _admin to host np0005538512.localdomain
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: Added label mon to host np0005538513.localdomain
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 04:51:13 localhost ceph-mon[287629]: Added label _admin to host np0005538513.localdomain
Nov 28 04:51:13 localhost ceph-mon[287629]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 04:51:13 localhost ceph-mon[287629]: Added label mon to host np0005538514.localdomain
Nov 28 04:51:13 localhost ceph-mon[287629]: Updating np0005538513.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 04:51:13 localhost ceph-mon[287629]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: Added label _admin to host np0005538514.localdomain
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 04:51:13 localhost ceph-mon[287629]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:13 localhost ceph-mon[287629]: Added label mon to host 
np0005538515.localdomain Nov 28 04:51:13 localhost ceph-mon[287629]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:51:13 localhost ceph-mon[287629]: Updating np0005538514.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:51:13 localhost ceph-mon[287629]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' Nov 28 04:51:13 localhost ceph-mon[287629]: Added label _admin to host np0005538515.localdomain Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' Nov 28 04:51:13 localhost ceph-mon[287629]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf Nov 28 04:51:13 localhost ceph-mon[287629]: Saving service mon spec with placement label:mon Nov 28 04:51:13 localhost ceph-mon[287629]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:51:13 localhost ceph-mon[287629]: Updating np0005538515.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:51:13 localhost ceph-mon[287629]: Updating 
np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 28 04:51:13 localhost ceph-mon[287629]: Deploying daemon mon.np0005538515 on np0005538515.localdomain Nov 28 04:51:13 localhost ceph-mon[287629]: Deploying daemon mon.np0005538514 on np0005538514.localdomain Nov 28 04:51:13 localhost ceph-mon[287629]: mon.np0005538510 calling monitor election Nov 28 04:51:13 localhost ceph-mon[287629]: mon.np0005538512 calling monitor election Nov 28 04:51:13 localhost ceph-mon[287629]: mon.np0005538511 calling monitor election Nov 28 04:51:13 localhost ceph-mon[287629]: mon.np0005538515 calling monitor election Nov 28 04:51:13 localhost ceph-mon[287629]: mon.np0005538510 is new leader, mons np0005538510,np0005538512,np0005538511,np0005538515 in quorum (ranks 0,1,2,3) Nov 28 04:51:13 localhost ceph-mon[287629]: overall HEALTH_OK Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' Nov 28 04:51:13 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 28 04:51:13 localhost 
ceph-mon[287629]: Deploying daemon mon.np0005538513 on np0005538513.localdomain Nov 28 04:51:13 localhost ceph-mon[287629]: mon.np0005538513@-1(synchronizing).paxosservice(auth 1..34) refresh upgraded, format 0 -> 3 Nov 28 04:51:14 localhost nova_compute[279673]: 2025-11-28 09:51:14.539 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:51:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 04:51:17 localhost nova_compute[279673]: 2025-11-28 09:51:17.752 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:51:17 localhost podman[287668]: 2025-11-28 09:51:17.830242143 +0000 UTC m=+0.073469257 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 
'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 28 04:51:17 localhost podman[287668]: 2025-11-28 09:51:17.835100324 +0000 UTC m=+0.078327368 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 04:51:17 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 04:51:17 localhost ceph-mgr[286105]: ms_deliver_dispatch: unhandled message 0x560b917b9600 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0
Nov 28 04:51:18 localhost openstack_network_exporter[240658]: ERROR 09:51:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 04:51:18 localhost openstack_network_exporter[240658]: ERROR 09:51:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 04:51:18 localhost openstack_network_exporter[240658]: ERROR 09:51:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 04:51:18 localhost openstack_network_exporter[240658]: ERROR 09:51:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 04:51:18 localhost openstack_network_exporter[240658]:
Nov 28 04:51:18 localhost openstack_network_exporter[240658]: ERROR 09:51:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 04:51:18 localhost openstack_network_exporter[240658]:
Nov 28 04:51:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 04:51:18 localhost podman[287726]: 2025-11-28 09:51:18.202222916 +0000 UTC m=+0.092139805 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 04:51:18 localhost podman[287726]: 2025-11-28 09:51:18.23947576 +0000 UTC m=+0.129392599 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125)
Nov 28 04:51:18 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 04:51:19 localhost systemd[1]: tmp-crun.G2Ewit.mount: Deactivated successfully.
Nov 28 04:51:19 localhost podman[287832]: 2025-11-28 09:51:19.080617355 +0000 UTC m=+0.105449727 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, vcs-type=git, vendor=Red Hat, Inc., release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, distribution-scope=public, io.openshift.tags=rhceph ceph, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 04:51:19 localhost podman[287832]: 2025-11-28 09:51:19.190711156 +0000 UTC m=+0.215543548 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, vcs-type=git, RELEASE=main, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, GIT_BRANCH=main, vendor=Red Hat, Inc., architecture=x86_64, name=rhceph, version=7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container)
Nov 28 04:51:19 localhost nova_compute[279673]: 2025-11-28 09:51:19.542 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:51:19 localhost ceph-mon[287629]: mon.np0005538513@-1(probing) e6 my rank is now 5 (was -1)
Nov 28 04:51:19 localhost ceph-mon[287629]: log_channel(cluster) log [INF] : mon.np0005538513 calling monitor election
Nov 28 04:51:19 localhost ceph-mon[287629]: paxos.5).electionLogic(0) init, first boot, initializing epoch at 1
Nov 28 04:51:19 localhost ceph-mon[287629]: mon.np0005538513@5(electing) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 28 04:51:22 localhost nova_compute[279673]: 2025-11-28 09:51:22.792 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:51:22 localhost ceph-mon[287629]: mon.np0005538513@5(electing) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 28 04:51:22 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Nov 28 04:51:22 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Nov 28 04:51:22 localhost ceph-mon[287629]: mon.np0005538511 calling monitor election
Nov 28 04:51:22 localhost ceph-mon[287629]: mon.np0005538510 calling monitor election
Nov 28 04:51:22 localhost ceph-mon[287629]: mon.np0005538515 calling monitor election
Nov 28 04:51:22 localhost ceph-mon[287629]: mon.np0005538512 calling monitor election
Nov 28 04:51:22 localhost ceph-mon[287629]: mon.np0005538514 calling monitor election
Nov 28 04:51:22 localhost ceph-mon[287629]: mon.np0005538510 is new leader, mons np0005538510,np0005538512,np0005538511,np0005538515,np0005538514 in quorum (ranks 0,1,2,3,4)
Nov 28 04:51:22 localhost ceph-mon[287629]: overall HEALTH_OK
Nov 28 04:51:22 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:22 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 28 04:51:22 localhost ceph-mon[287629]: mgrc update_daemon_metadata mon.np0005538513 metadata {addrs=[v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable),ceph_version_short=18.2.1-361.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005538513.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.6 (Plow),distro_version=9.6,hostname=np0005538513.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116612,os=Linux}
Nov 28 04:51:23 localhost ceph-mon[287629]: mon.np0005538511 calling monitor election
Nov 28 04:51:23 localhost ceph-mon[287629]: mon.np0005538515 calling monitor election
Nov 28 04:51:23 localhost ceph-mon[287629]: mon.np0005538510 calling monitor election
Nov 28 04:51:23 localhost ceph-mon[287629]: mon.np0005538512 calling monitor election
Nov 28 04:51:23 localhost ceph-mon[287629]: mon.np0005538514 calling monitor election
Nov 28 04:51:23 localhost ceph-mon[287629]: mon.np0005538513 calling monitor election
Nov 28 04:51:23 localhost ceph-mon[287629]: mon.np0005538510 is new leader, mons np0005538510,np0005538512,np0005538511,np0005538515,np0005538514,np0005538513 in quorum (ranks 0,1,2,3,4,5)
Nov 28 04:51:23 localhost ceph-mon[287629]: overall HEALTH_OK
Nov 28 04:51:23 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:23 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:24 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:24 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:24 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:24 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:24 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:24 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config rm", "who": "osd/host:np0005538511", "name": "osd_memory_target"} : dispatch
Nov 28 04:51:24 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:24 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "config rm", "who": "osd/host:np0005538510", "name": "osd_memory_target"} : dispatch
Nov 28 04:51:24 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:24 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:24 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:24 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 04:51:24 localhost ceph-mon[287629]: Updating np0005538510.localdomain:/etc/ceph/ceph.conf
Nov 28 04:51:24 localhost ceph-mon[287629]: Updating np0005538511.localdomain:/etc/ceph/ceph.conf
Nov 28 04:51:24 localhost ceph-mon[287629]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 04:51:24 localhost ceph-mon[287629]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 04:51:24 localhost ceph-mon[287629]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 04:51:24 localhost ceph-mon[287629]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 04:51:24 localhost nova_compute[279673]: 2025-11-28 09:51:24.580 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:51:25 localhost ceph-mon[287629]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 04:51:25 localhost ceph-mon[287629]: Updating np0005538510.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 04:51:25 localhost ceph-mon[287629]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 04:51:25 localhost ceph-mon[287629]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 04:51:25 localhost ceph-mon[287629]: Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 04:51:25 localhost ceph-mon[287629]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 04:51:25 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:25 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:25 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:25 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:25 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:25 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:25 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:25 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:25 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:25 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:25 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:25 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:25 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:25 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 04:51:26 localhost ceph-mon[287629]: Reconfiguring mon.np0005538510 (monmap changed)...
Nov 28 04:51:26 localhost ceph-mon[287629]: Reconfiguring daemon mon.np0005538510 on np0005538510.localdomain
Nov 28 04:51:26 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:26 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:26 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538510.nzitwz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 04:51:27 localhost ceph-mon[287629]: Reconfiguring mgr.np0005538510.nzitwz (monmap changed)...
Nov 28 04:51:27 localhost ceph-mon[287629]: Reconfiguring daemon mgr.np0005538510.nzitwz on np0005538510.localdomain
Nov 28 04:51:27 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:27 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:27 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538510.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 04:51:27 localhost nova_compute[279673]: 2025-11-28 09:51:27.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 04:51:27 localhost nova_compute[279673]: 2025-11-28 09:51:27.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 04:51:27 localhost nova_compute[279673]: 2025-11-28 09:51:27.795 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:51:28 localhost ceph-mon[287629]: Reconfiguring crash.np0005538510 (monmap changed)...
Nov 28 04:51:28 localhost ceph-mon[287629]: Reconfiguring daemon crash.np0005538510 on np0005538510.localdomain
Nov 28 04:51:28 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:28 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:28 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:28 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538511.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 04:51:29 localhost ceph-mon[287629]: Reconfiguring crash.np0005538511 (monmap changed)...
Nov 28 04:51:29 localhost ceph-mon[287629]: Reconfiguring daemon crash.np0005538511 on np0005538511.localdomain
Nov 28 04:51:29 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:29 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz'
Nov 28 04:51:29 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 04:51:29 localhost nova_compute[279673]: 2025-11-28 09:51:29.583 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:51:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 04:51:29 localhost nova_compute[279673]: 2025-11-28 09:51:29.767 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:51:29 localhost nova_compute[279673]: 2025-11-28 09:51:29.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:51:29 localhost podman[288289]: 2025-11-28 09:51:29.85954706 +0000 UTC m=+0.089711960 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, release=1755695350, config_id=edpm, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc.) 
Nov 28 04:51:29 localhost podman[288289]: 2025-11-28 09:51:29.903494022 +0000 UTC m=+0.133658932 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, release=1755695350, managed_by=edpm_ansible, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.expose-services=) Nov 28 04:51:29 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. Nov 28 04:51:30 localhost ceph-mon[287629]: Reconfiguring mon.np0005538511 (monmap changed)... 
Nov 28 04:51:30 localhost ceph-mon[287629]: Reconfiguring daemon mon.np0005538511 on np0005538511.localdomain Nov 28 04:51:30 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' Nov 28 04:51:30 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' Nov 28 04:51:30 localhost ceph-mon[287629]: Reconfiguring mgr.np0005538511.fvuybw (monmap changed)... Nov 28 04:51:30 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538511.fvuybw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:51:30 localhost ceph-mon[287629]: Reconfiguring daemon mgr.np0005538511.fvuybw on np0005538511.localdomain Nov 28 04:51:30 localhost nova_compute[279673]: 2025-11-28 09:51:30.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:51:30 localhost nova_compute[279673]: 2025-11-28 09:51:30.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:51:31 localhost nova_compute[279673]: 2025-11-28 09:51:31.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:51:31 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' Nov 28 04:51:31 localhost ceph-mon[287629]: from='mgr.14120 
172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' Nov 28 04:51:31 localhost ceph-mon[287629]: Reconfiguring mon.np0005538512 (monmap changed)... Nov 28 04:51:31 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 28 04:51:31 localhost ceph-mon[287629]: Reconfiguring daemon mon.np0005538512 on np0005538512.localdomain Nov 28 04:51:31 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' Nov 28 04:51:31 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' Nov 28 04:51:31 localhost ceph-mon[287629]: from='mgr.14120 172.18.0.103:0/3259224084' entity='mgr.np0005538510.nzitwz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mgr fail"} v 0) Nov 28 04:51:32 localhost ceph-mon[287629]: log_channel(audit) log [INF] : from='client.? 
172.18.0.103:0/3703486687' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 28 04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon).osd e85 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375 Nov 28 04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon).osd e85 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1 Nov 28 04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon).osd e86 e86: 6 total, 6 up, 6 in Nov 28 04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538510"} v 0) Nov 28 04:51:32 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538510"} : dispatch Nov 28 04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538511"} v 0) Nov 28 04:51:32 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538511"} : dispatch Nov 28 04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538512"} v 0) Nov 28 04:51:32 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538512"} : dispatch Nov 28 04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538513"} v 0) Nov 28 04:51:32 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch Nov 28 
04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538514"} v 0) Nov 28 04:51:32 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch Nov 28 04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538515"} v 0) Nov 28 04:51:32 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch Nov 28 04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005538513.yljthc"} v 0) Nov 28 04:51:32 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mds metadata", "who": "mds.np0005538513.yljthc"} : dispatch Nov 28 04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005538515.anvatb"} v 0) Nov 28 04:51:32 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mds metadata", "who": "mds.np0005538515.anvatb"} : dispatch Nov 28 04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005538514.umgtoy"} v 0) Nov 28 04:51:32 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mds metadata", "who": "mds.np0005538514.umgtoy"} : dispatch Nov 28 04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon).mds e17 all = 0 Nov 28 
04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon).mds e17 all = 0 Nov 28 04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon).mds e17 all = 0 Nov 28 04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005538512.zyhkxs", "id": "np0005538512.zyhkxs"} v 0) Nov 28 04:51:32 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr metadata", "who": "np0005538512.zyhkxs", "id": "np0005538512.zyhkxs"} : dispatch Nov 28 04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005538514.djozup", "id": "np0005538514.djozup"} v 0) Nov 28 04:51:32 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr metadata", "who": "np0005538514.djozup", "id": "np0005538514.djozup"} : dispatch Nov 28 04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005538515.yfkzhl", "id": "np0005538515.yfkzhl"} v 0) Nov 28 04:51:32 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr metadata", "who": "np0005538515.yfkzhl", "id": "np0005538515.yfkzhl"} : dispatch Nov 28 04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005538511.fvuybw", "id": "np0005538511.fvuybw"} v 0) Nov 28 04:51:32 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr metadata", "who": "np0005538511.fvuybw", "id": "np0005538511.fvuybw"} : dispatch Nov 28 04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon) 
e6 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005538513.dsfdlx", "id": "np0005538513.dsfdlx"} v 0) Nov 28 04:51:32 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr metadata", "who": "np0005538513.dsfdlx", "id": "np0005538513.dsfdlx"} : dispatch Nov 28 04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) Nov 28 04:51:32 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd metadata", "id": 0} : dispatch Nov 28 04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) Nov 28 04:51:32 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd metadata", "id": 1} : dispatch Nov 28 04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) Nov 28 04:51:32 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd metadata", "id": 2} : dispatch Nov 28 04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "osd metadata", "id": 3} v 0) Nov 28 04:51:32 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd metadata", "id": 3} : dispatch Nov 28 04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "osd metadata", "id": 4} v 0) Nov 28 04:51:32 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' 
entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd metadata", "id": 4} : dispatch Nov 28 04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "osd metadata", "id": 5} v 0) Nov 28 04:51:32 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd metadata", "id": 5} : dispatch Nov 28 04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mds metadata"} v 0) Nov 28 04:51:32 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mds metadata"} : dispatch Nov 28 04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon).mds e17 all = 1 Nov 28 04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "osd metadata"} v 0) Nov 28 04:51:32 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd metadata"} : dispatch Nov 28 04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mon metadata"} v 0) Nov 28 04:51:32 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon metadata"} : dispatch Nov 28 04:51:32 localhost systemd[1]: session-14.scope: Deactivated successfully. Nov 28 04:51:32 localhost systemd[1]: session-18.scope: Deactivated successfully. Nov 28 04:51:32 localhost systemd[1]: session-23.scope: Deactivated successfully. Nov 28 04:51:32 localhost systemd[1]: session-25.scope: Deactivated successfully. Nov 28 04:51:32 localhost systemd-logind[764]: Session 25 logged out. Waiting for processes to exit. Nov 28 04:51:32 localhost systemd[1]: session-16.scope: Deactivated successfully. 
Nov 28 04:51:32 localhost systemd-logind[764]: Session 23 logged out. Waiting for processes to exit. Nov 28 04:51:32 localhost systemd-logind[764]: Session 14 logged out. Waiting for processes to exit. Nov 28 04:51:32 localhost systemd-logind[764]: Session 18 logged out. Waiting for processes to exit. Nov 28 04:51:32 localhost systemd-logind[764]: Session 16 logged out. Waiting for processes to exit. Nov 28 04:51:32 localhost systemd[1]: session-17.scope: Deactivated successfully. Nov 28 04:51:32 localhost systemd[1]: session-26.scope: Deactivated successfully. Nov 28 04:51:32 localhost systemd[1]: session-26.scope: Consumed 3min 26.962s CPU time. Nov 28 04:51:32 localhost systemd[1]: session-21.scope: Deactivated successfully. Nov 28 04:51:32 localhost systemd[1]: session-24.scope: Deactivated successfully. Nov 28 04:51:32 localhost systemd-logind[764]: Session 21 logged out. Waiting for processes to exit. Nov 28 04:51:32 localhost systemd[1]: session-20.scope: Deactivated successfully. Nov 28 04:51:32 localhost systemd[1]: session-22.scope: Deactivated successfully. Nov 28 04:51:32 localhost systemd-logind[764]: Session 17 logged out. Waiting for processes to exit. Nov 28 04:51:32 localhost systemd-logind[764]: Session 26 logged out. Waiting for processes to exit. Nov 28 04:51:32 localhost systemd-logind[764]: Session 20 logged out. Waiting for processes to exit. Nov 28 04:51:32 localhost systemd-logind[764]: Session 22 logged out. Waiting for processes to exit. Nov 28 04:51:32 localhost systemd-logind[764]: Session 24 logged out. Waiting for processes to exit. Nov 28 04:51:32 localhost systemd-logind[764]: Removed session 14. Nov 28 04:51:32 localhost systemd-logind[764]: Removed session 18. Nov 28 04:51:32 localhost systemd[1]: session-19.scope: Deactivated successfully. Nov 28 04:51:32 localhost systemd-logind[764]: Session 19 logged out. Waiting for processes to exit. Nov 28 04:51:32 localhost systemd-logind[764]: Removed session 23. 
Nov 28 04:51:32 localhost systemd-logind[764]: Removed session 25. Nov 28 04:51:32 localhost systemd-logind[764]: Removed session 16. Nov 28 04:51:32 localhost systemd-logind[764]: Removed session 17. Nov 28 04:51:32 localhost systemd-logind[764]: Removed session 26. Nov 28 04:51:32 localhost systemd-logind[764]: Removed session 21. Nov 28 04:51:32 localhost systemd-logind[764]: Removed session 24. Nov 28 04:51:32 localhost systemd-logind[764]: Removed session 20. Nov 28 04:51:32 localhost systemd-logind[764]: Removed session 22. Nov 28 04:51:32 localhost systemd-logind[764]: Removed session 19. Nov 28 04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538512.zyhkxs/mirror_snapshot_schedule"} v 0) Nov 28 04:51:32 localhost ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538512.zyhkxs/mirror_snapshot_schedule"} : dispatch Nov 28 04:51:32 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538512.zyhkxs/trash_purge_schedule"} v 0) Nov 28 04:51:32 localhost ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538512.zyhkxs/trash_purge_schedule"} : dispatch Nov 28 04:51:32 localhost sshd[288311]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:51:32 localhost systemd-logind[764]: New session 64 of user ceph-admin. Nov 28 04:51:32 localhost systemd[1]: Started Session 64 of User ceph-admin. 
Nov 28 04:51:32 localhost nova_compute[279673]: 2025-11-28 09:51:32.799 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:51:32 localhost ceph-mon[287629]: from='client.? 172.18.0.103:0/3703486687' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 28 04:51:32 localhost ceph-mon[287629]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 28 04:51:32 localhost ceph-mon[287629]: Activating manager daemon np0005538512.zyhkxs Nov 28 04:51:32 localhost ceph-mon[287629]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Nov 28 04:51:32 localhost ceph-mon[287629]: Manager daemon np0005538512.zyhkxs is now available Nov 28 04:51:32 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538512.zyhkxs/mirror_snapshot_schedule"} : dispatch Nov 28 04:51:32 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538512.zyhkxs/mirror_snapshot_schedule"} : dispatch Nov 28 04:51:32 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538512.zyhkxs/trash_purge_schedule"} : dispatch Nov 28 04:51:32 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538512.zyhkxs/trash_purge_schedule"} : dispatch Nov 28 04:51:33 localhost ceph-mon[287629]: mon.np0005538513@5(peon).osd e86 _set_new_cache_sizes cache_size:1019691232 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:51:33 localhost nova_compute[279673]: 2025-11-28 09:51:33.767 279685 DEBUG oslo_service.periodic_task [None 
req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:51:33 localhost nova_compute[279673]: 2025-11-28 09:51:33.791 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:51:33 localhost nova_compute[279673]: 2025-11-28 09:51:33.792 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:51:33 localhost podman[288426]: 2025-11-28 09:51:33.809823405 +0000 UTC m=+0.108971777 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, vendor=Red Hat, Inc., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, vcs-type=git, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, RELEASE=main, 
io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 28 04:51:33 localhost nova_compute[279673]: 2025-11-28 09:51:33.817 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:51:33 localhost nova_compute[279673]: 2025-11-28 09:51:33.817 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:51:33 localhost nova_compute[279673]: 2025-11-28 09:51:33.818 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:51:33 localhost nova_compute[279673]: 2025-11-28 09:51:33.818 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 04:51:33 localhost nova_compute[279673]: 2025-11-28 09:51:33.819 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:51:33 localhost podman[288426]: 2025-11-28 09:51:33.913535488 +0000 UTC m=+0.212683800 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., GIT_CLEAN=True, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, CEPH_POINT_RELEASE=, io.openshift.expose-services=, RELEASE=main, ceph=True, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, release=553, name=rhceph, version=7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 28 04:51:34 localhost nova_compute[279673]: 2025-11-28 09:51:34.278 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:51:34 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538511.localdomain.devices.0}] v 0) Nov 28 04:51:34 localhost ceph-mon[287629]: 
mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538511.localdomain}] v 0) Nov 28 04:51:34 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538510.localdomain.devices.0}] v 0) Nov 28 04:51:34 localhost nova_compute[279673]: 2025-11-28 09:51:34.366 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:51:34 localhost nova_compute[279673]: 2025-11-28 09:51:34.367 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:51:34 localhost nova_compute[279673]: 2025-11-28 09:51:34.613 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:51:34 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain.devices.0}] v 0) Nov 28 04:51:34 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0) Nov 28 04:51:34 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0) Nov 28 04:51:34 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538510.localdomain}] v 0) Nov 28 04:51:34 localhost 
ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain}] v 0) Nov 28 04:51:34 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0) Nov 28 04:51:34 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0) Nov 28 04:51:34 localhost nova_compute[279673]: 2025-11-28 09:51:34.968 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 04:51:34 localhost nova_compute[279673]: 2025-11-28 09:51:34.969 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11793MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 04:51:34 localhost nova_compute[279673]: 2025-11-28 09:51:34.969 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:51:34 localhost nova_compute[279673]: 2025-11-28 09:51:34.969 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:51:34 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command 
mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0) Nov 28 04:51:35 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0) Nov 28 04:51:35 localhost nova_compute[279673]: 2025-11-28 09:51:35.054 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 04:51:35 localhost nova_compute[279673]: 2025-11-28 09:51:35.054 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 04:51:35 localhost nova_compute[279673]: 2025-11-28 09:51:35.054 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 04:51:35 localhost nova_compute[279673]: 2025-11-28 09:51:35.091 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:51:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 04:51:35 localhost podman[288587]: 2025-11-28 09:51:35.208060828 +0000 UTC m=+0.063227719 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 04:51:35 localhost podman[288587]: 2025-11-28 09:51:35.221251007 +0000 UTC m=+0.076417898 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 04:51:35 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 04:51:35 localhost ceph-mon[287629]: [28/Nov/2025:09:51:33] ENGINE Bus STARTING Nov 28 04:51:35 localhost ceph-mon[287629]: [28/Nov/2025:09:51:33] ENGINE Serving on https://172.18.0.105:7150 Nov 28 04:51:35 localhost ceph-mon[287629]: [28/Nov/2025:09:51:33] ENGINE Client ('172.18.0.105', 40464) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Nov 28 04:51:35 localhost ceph-mon[287629]: [28/Nov/2025:09:51:33] ENGINE Serving on http://172.18.0.105:8765 Nov 28 04:51:35 localhost ceph-mon[287629]: [28/Nov/2025:09:51:33] ENGINE Bus STARTED Nov 28 04:51:35 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:35 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:35 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:35 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:35 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:35 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:35 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:35 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:35 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:35 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:35 localhost ceph-mon[287629]: 
from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:35 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:35 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 04:51:35 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2421283346' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 04:51:35 localhost nova_compute[279673]: 2025-11-28 09:51:35.591 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:51:35 localhost nova_compute[279673]: 2025-11-28 09:51:35.601 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 04:51:35 localhost nova_compute[279673]: 2025-11-28 09:51:35.623 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 04:51:35 localhost nova_compute[279673]: 2025-11-28 09:51:35.627 279685 DEBUG 
nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 04:51:35 localhost nova_compute[279673]: 2025-11-28 09:51:35.628 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.659s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:51:36 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain.devices.0}] v 0) Nov 28 04:51:36 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0) Nov 28 04:51:36 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0) Nov 28 04:51:36 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0) Nov 28 04:51:36 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538511.localdomain.devices.0}] v 0) Nov 28 04:51:36 localhost nova_compute[279673]: 2025-11-28 09:51:36.608 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:51:36 localhost nova_compute[279673]: 2025-11-28 09:51:36.609 
279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 28 04:51:36 localhost nova_compute[279673]: 2025-11-28 09:51:36.609 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 28 04:51:36 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain}] v 0) Nov 28 04:51:37 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0) Nov 28 04:51:37 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0) Nov 28 04:51:37 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0) Nov 28 04:51:37 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538511.localdomain}] v 0) Nov 28 04:51:37 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config rm", "who": "osd/host:np0005538512", "name": "osd_memory_target"} v 0) Nov 28 04:51:37 localhost ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd/host:np0005538512", "name": "osd_memory_target"} : dispatch Nov 28 04:51:37 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": 
"osd_memory_target"} v 0) Nov 28 04:51:37 localhost ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Nov 28 04:51:37 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0) Nov 28 04:51:37 localhost ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Nov 28 04:51:37 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0) Nov 28 04:51:37 localhost ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Nov 28 04:51:37 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config rm", "who": "osd/host:np0005538511", "name": "osd_memory_target"} v 0) Nov 28 04:51:37 localhost ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd/host:np0005538511", "name": "osd_memory_target"} : dispatch Nov 28 04:51:37 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0) Nov 28 04:51:37 localhost ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Nov 28 04:51:37 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 
handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0) Nov 28 04:51:37 localhost ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Nov 28 04:51:37 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0) Nov 28 04:51:37 localhost ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Nov 28 04:51:37 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Nov 28 04:51:37 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Nov 28 04:51:37 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Nov 28 04:51:37 localhost nova_compute[279673]: 2025-11-28 09:51:37.324 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 04:51:37 localhost nova_compute[279673]: 2025-11-28 09:51:37.324 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 04:51:37 localhost nova_compute[279673]: 2025-11-28 09:51:37.325 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] 
[instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 28 04:51:37 localhost nova_compute[279673]: 2025-11-28 09:51:37.325 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 04:51:37 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538510.localdomain.devices.0}] v 0) Nov 28 04:51:37 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538510.localdomain}] v 0) Nov 28 04:51:37 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config rm", "who": "osd/host:np0005538510", "name": "osd_memory_target"} v 0) Nov 28 04:51:37 localhost ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd/host:np0005538510", "name": "osd_memory_target"} : dispatch Nov 28 04:51:37 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 28 04:51:37 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 28 04:51:37 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) Nov 28 04:51:37 localhost ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 04:51:37 localhost nova_compute[279673]: 2025-11-28 09:51:37.691 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 04:51:37 localhost nova_compute[279673]: 2025-11-28 09:51:37.710 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 04:51:37 localhost nova_compute[279673]: 2025-11-28 09:51:37.710 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - 
- - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 28 04:51:37 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:37 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:37 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:37 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:37 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:37 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd/host:np0005538512", "name": "osd_memory_target"} : dispatch Nov 28 04:51:37 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:37 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Nov 28 04:51:37 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:37 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd/host:np0005538512", "name": "osd_memory_target"} : dispatch Nov 28 04:51:37 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Nov 28 04:51:37 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:37 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Nov 28 04:51:37 localhost 
ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Nov 28 04:51:37 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Nov 28 04:51:37 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:37 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd/host:np0005538511", "name": "osd_memory_target"} : dispatch Nov 28 04:51:37 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Nov 28 04:51:37 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Nov 28 04:51:37 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Nov 28 04:51:37 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:37 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Nov 28 04:51:37 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd/host:np0005538511", "name": "osd_memory_target"} : dispatch Nov 28 04:51:37 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Nov 28 04:51:37 localhost ceph-mon[287629]: from='mgr.14190 ' 
entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Nov 28 04:51:37 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Nov 28 04:51:37 localhost ceph-mon[287629]: Adjusting osd_memory_target on np0005538514.localdomain to 836.6M Nov 28 04:51:37 localhost ceph-mon[287629]: Adjusting osd_memory_target on np0005538515.localdomain to 836.6M Nov 28 04:51:37 localhost ceph-mon[287629]: Adjusting osd_memory_target on np0005538513.localdomain to 836.6M Nov 28 04:51:37 localhost ceph-mon[287629]: Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 28 04:51:37 localhost ceph-mon[287629]: Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 28 04:51:37 localhost ceph-mon[287629]: Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 28 04:51:37 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:37 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd/host:np0005538510", "name": "osd_memory_target"} : dispatch Nov 28 04:51:37 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:37 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 04:51:37 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config rm", "who": "osd/host:np0005538510", "name": "osd_memory_target"} : dispatch Nov 28 04:51:37 localhost 
nova_compute[279673]: 2025-11-28 09:51:37.831 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:51:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:51:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:51:38 localhost podman[288823]: 2025-11-28 09:51:38.048733282 +0000 UTC m=+0.101658260 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent) Nov 28 04:51:38 localhost podman[288823]: 2025-11-28 09:51:38.084318684 +0000 UTC m=+0.137243642 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 28 04:51:38 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:51:38 localhost systemd[1]: tmp-crun.E00jbi.mount: Deactivated successfully. Nov 28 04:51:38 localhost podman[288872]: 2025-11-28 09:51:38.164075436 +0000 UTC m=+0.106167060 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, 
tcib_managed=true) Nov 28 04:51:38 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005538510.nzitwz", "id": "np0005538510.nzitwz"} v 0) Nov 28 04:51:38 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr metadata", "who": "np0005538510.nzitwz", "id": "np0005538510.nzitwz"} : dispatch Nov 28 04:51:38 localhost podman[288872]: 2025-11-28 09:51:38.241482133 +0000 UTC m=+0.183573767 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 28 04:51:38 localhost systemd[1]: 
9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. Nov 28 04:51:38 localhost ceph-mon[287629]: mon.np0005538513@5(peon).osd e86 _set_new_cache_sizes cache_size:1020047072 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:51:39 localhost ceph-mon[287629]: Updating np0005538510.localdomain:/etc/ceph/ceph.conf Nov 28 04:51:39 localhost ceph-mon[287629]: Updating np0005538511.localdomain:/etc/ceph/ceph.conf Nov 28 04:51:39 localhost ceph-mon[287629]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf Nov 28 04:51:39 localhost ceph-mon[287629]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf Nov 28 04:51:39 localhost ceph-mon[287629]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf Nov 28 04:51:39 localhost ceph-mon[287629]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf Nov 28 04:51:39 localhost ceph-mon[287629]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:51:39 localhost ceph-mon[287629]: Updating np0005538510.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:51:39 localhost ceph-mon[287629]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:51:39 localhost ceph-mon[287629]: Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:51:39 localhost ceph-mon[287629]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:51:39 localhost ceph-mon[287629]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:51:39 localhost nova_compute[279673]: 2025-11-28 09:51:39.615 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:51:39 localhost systemd[1]: 
Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 04:51:40 localhost podman[289290]: 2025-11-28 09:51:40.02046591 +0000 UTC m=+0.098815402 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 04:51:40 localhost 
ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0) Nov 28 04:51:40 localhost podman[289290]: 2025-11-28 09:51:40.041238504 +0000 UTC m=+0.119588036 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 
28 04:51:40 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0) Nov 28 04:51:40 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. Nov 28 04:51:40 localhost podman[238687]: time="2025-11-28T09:51:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 04:51:40 localhost podman[238687]: @ - - [28/Nov/2025:09:51:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 28 04:51:40 localhost podman[238687]: @ - - [28/Nov/2025:09:51:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18707 "" "Go-http-client/1.1" Nov 28 04:51:40 localhost ceph-mon[287629]: Updating np0005538514.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:51:40 localhost ceph-mon[287629]: Updating np0005538510.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:51:40 localhost ceph-mon[287629]: Updating np0005538512.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:51:40 localhost ceph-mon[287629]: Updating np0005538515.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:51:40 localhost ceph-mon[287629]: Updating np0005538511.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:51:40 localhost ceph-mon[287629]: Updating np0005538513.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:51:40 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:40 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:40 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538510.localdomain.devices.0}] v 0) Nov 28 04:51:40 localhost ceph-mon[287629]: 
mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0) Nov 28 04:51:40 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538510.localdomain}] v 0) Nov 28 04:51:40 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain.devices.0}] v 0) Nov 28 04:51:40 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0) Nov 28 04:51:40 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain}] v 0) Nov 28 04:51:40 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538511.localdomain.devices.0}] v 0) Nov 28 04:51:40 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538511.localdomain}] v 0) Nov 28 04:51:40 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0) Nov 28 04:51:40 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0) Nov 28 04:51:40 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Nov 28 04:51:40 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) Nov 28 04:51:40 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : 
from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch Nov 28 04:51:41 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Nov 28 04:51:41 localhost ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:51:41 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mgr services"} v 0) Nov 28 04:51:41 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch Nov 28 04:51:41 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 28 04:51:41 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 28 04:51:41 localhost ceph-mon[287629]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:51:41 localhost ceph-mon[287629]: Updating np0005538510.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:51:41 localhost ceph-mon[287629]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:51:41 localhost ceph-mon[287629]: Updating 
np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:51:41 localhost ceph-mon[287629]: Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:51:41 localhost ceph-mon[287629]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:51:41 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:41 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:41 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:41 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:41 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:41 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:41 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:41 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:41 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:41 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:41 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:41 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:51:41 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch 
Nov 28 04:51:41 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain.devices.0}] v 0) Nov 28 04:51:41 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain}] v 0) Nov 28 04:51:41 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Nov 28 04:51:41 localhost ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:51:41 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 28 04:51:41 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 28 04:51:42 localhost ceph-mon[287629]: Reconfiguring mgr.np0005538512.zyhkxs (monmap changed)... 
Nov 28 04:51:42 localhost ceph-mon[287629]: Reconfiguring daemon mgr.np0005538512.zyhkxs on np0005538512.localdomain Nov 28 04:51:42 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:42 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:51:42 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:42 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:51:42 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Nov 28 04:51:42 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain.devices.0}] v 0) Nov 28 04:51:42 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538512.localdomain}] v 0) Nov 28 04:51:42 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Nov 28 04:51:42 localhost ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:51:42 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command 
mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 28 04:51:42 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 28 04:51:42 localhost nova_compute[279673]: 2025-11-28 09:51:42.871 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:51:43 localhost ceph-mon[287629]: Reconfiguring crash.np0005538512 (monmap changed)... Nov 28 04:51:43 localhost ceph-mon[287629]: Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain Nov 28 04:51:43 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:43 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:43 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:51:43 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:43 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:51:43 localhost podman[289505]: Nov 28 04:51:43 localhost podman[289505]: 2025-11-28 09:51:43.418428657 +0000 UTC m=+0.081970500 container create f636510474bea4a4ff41782216cfdc69914ddb48a547cb10f3267c329b054180 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_bassi, build-date=2025-09-24T08:57:55, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, 
maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, name=rhceph, vcs-type=git, io.openshift.expose-services=, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, release=553) Nov 28 04:51:43 localhost systemd[1]: Started libpod-conmon-f636510474bea4a4ff41782216cfdc69914ddb48a547cb10f3267c329b054180.scope. Nov 28 04:51:43 localhost podman[289505]: 2025-11-28 09:51:43.385702773 +0000 UTC m=+0.049244646 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:51:43 localhost systemd[1]: Started libcrun container. 
Nov 28 04:51:43 localhost podman[289505]: 2025-11-28 09:51:43.513334997 +0000 UTC m=+0.176876840 container init f636510474bea4a4ff41782216cfdc69914ddb48a547cb10f3267c329b054180 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_bassi, ceph=True, release=553, RELEASE=main, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, architecture=x86_64, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, distribution-scope=public, GIT_BRANCH=main) Nov 28 04:51:43 localhost podman[289505]: 2025-11-28 09:51:43.525566626 +0000 UTC m=+0.189108459 container start f636510474bea4a4ff41782216cfdc69914ddb48a547cb10f3267c329b054180 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_bassi, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, name=rhceph, GIT_BRANCH=main, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, 
com.redhat.license_terms=https://www.redhat.com/agreements, version=7, io.openshift.expose-services=, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, distribution-scope=public, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12) Nov 28 04:51:43 localhost podman[289505]: 2025-11-28 09:51:43.525912167 +0000 UTC m=+0.189454000 container attach f636510474bea4a4ff41782216cfdc69914ddb48a547cb10f3267c329b054180 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_bassi, vcs-type=git, GIT_CLEAN=True, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, version=7, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, release=553, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 28 04:51:43 localhost heuristic_bassi[289520]: 167 167 Nov 28 04:51:43 localhost systemd[1]: 
libpod-f636510474bea4a4ff41782216cfdc69914ddb48a547cb10f3267c329b054180.scope: Deactivated successfully. Nov 28 04:51:43 localhost podman[289505]: 2025-11-28 09:51:43.531136658 +0000 UTC m=+0.194678541 container died f636510474bea4a4ff41782216cfdc69914ddb48a547cb10f3267c329b054180 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_bassi, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, version=7, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , architecture=x86_64, GIT_CLEAN=True) Nov 28 04:51:43 localhost podman[289525]: 2025-11-28 09:51:43.630207978 +0000 UTC m=+0.089597117 container remove f636510474bea4a4ff41782216cfdc69914ddb48a547cb10f3267c329b054180 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=heuristic_bassi, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on 
RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, CEPH_POINT_RELEASE=, RELEASE=main, architecture=x86_64, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_CLEAN=True, maintainer=Guillaume Abrioux , release=553, vendor=Red Hat, Inc., GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12) Nov 28 04:51:43 localhost systemd[1]: libpod-conmon-f636510474bea4a4ff41782216cfdc69914ddb48a547cb10f3267c329b054180.scope: Deactivated successfully. Nov 28 04:51:43 localhost ceph-mon[287629]: mon.np0005538513@5(peon).osd e86 _set_new_cache_sizes cache_size:1020054565 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:51:43 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0) Nov 28 04:51:43 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0) Nov 28 04:51:43 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0) Nov 28 04:51:43 localhost ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Nov 28 04:51:43 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 28 04:51:43 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 28 04:51:44 localhost ceph-mon[287629]: Reconfiguring crash.np0005538513 (monmap changed)... Nov 28 04:51:44 localhost ceph-mon[287629]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain Nov 28 04:51:44 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:44 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Nov 28 04:51:44 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:44 localhost podman[289595]: Nov 28 04:51:44 localhost podman[289595]: 2025-11-28 09:51:44.337987032 +0000 UTC m=+0.066896503 container create a8318569b00adb21a7ff9b5e8b773e0ddf9b66cd4acb70eba0c27a49a25f25c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_agnesi, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, io.openshift.tags=rhceph ceph, version=7, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, GIT_CLEAN=True) Nov 28 04:51:44 localhost systemd[1]: Started 
libpod-conmon-a8318569b00adb21a7ff9b5e8b773e0ddf9b66cd4acb70eba0c27a49a25f25c5.scope. Nov 28 04:51:44 localhost systemd[1]: Started libcrun container. Nov 28 04:51:44 localhost podman[289595]: 2025-11-28 09:51:44.306548108 +0000 UTC m=+0.035457599 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:51:44 localhost podman[289595]: 2025-11-28 09:51:44.411093026 +0000 UTC m=+0.140002497 container init a8318569b00adb21a7ff9b5e8b773e0ddf9b66cd4acb70eba0c27a49a25f25c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_agnesi, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, maintainer=Guillaume Abrioux , version=7, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, distribution-scope=public, CEPH_POINT_RELEASE=, architecture=x86_64, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, name=rhceph) Nov 28 04:51:44 localhost podman[289595]: 2025-11-28 09:51:44.420851139 +0000 UTC m=+0.149760600 container start a8318569b00adb21a7ff9b5e8b773e0ddf9b66cd4acb70eba0c27a49a25f25c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_agnesi, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, 
RELEASE=main, version=7, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, GIT_CLEAN=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , architecture=x86_64, vendor=Red Hat, Inc., name=rhceph, ceph=True, distribution-scope=public, vcs-type=git, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container) Nov 28 04:51:44 localhost amazing_agnesi[289610]: 167 167 Nov 28 04:51:44 localhost systemd[1]: var-lib-containers-storage-overlay-454c6499e0a1ac672d2907fbb2c5eab0ffa25180482709da65a54ecb80996177-merged.mount: Deactivated successfully. Nov 28 04:51:44 localhost systemd[1]: libpod-a8318569b00adb21a7ff9b5e8b773e0ddf9b66cd4acb70eba0c27a49a25f25c5.scope: Deactivated successfully. 
Nov 28 04:51:44 localhost podman[289595]: 2025-11-28 09:51:44.421162438 +0000 UTC m=+0.150071899 container attach a8318569b00adb21a7ff9b5e8b773e0ddf9b66cd4acb70eba0c27a49a25f25c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_agnesi, distribution-scope=public, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, version=7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, name=rhceph, architecture=x86_64, io.openshift.tags=rhceph ceph, RELEASE=main, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 28 04:51:44 localhost podman[289595]: 2025-11-28 09:51:44.4305835 +0000 UTC m=+0.159493011 container died a8318569b00adb21a7ff9b5e8b773e0ddf9b66cd4acb70eba0c27a49a25f25c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_agnesi, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, maintainer=Guillaume Abrioux , 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, version=7, vcs-type=git, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 28 04:51:44 localhost systemd[1]: var-lib-containers-storage-overlay-c461735e4425da9797d34137b527a69d41c73ac10c88cb0b457449cd5b86b220-merged.mount: Deactivated successfully. Nov 28 04:51:44 localhost podman[289615]: 2025-11-28 09:51:44.531864358 +0000 UTC m=+0.093116856 container remove a8318569b00adb21a7ff9b5e8b773e0ddf9b66cd4acb70eba0c27a49a25f25c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_agnesi, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_CLEAN=True, distribution-scope=public, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, architecture=x86_64, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , name=rhceph, CEPH_POINT_RELEASE=, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 28 04:51:44 localhost systemd[1]: libpod-conmon-a8318569b00adb21a7ff9b5e8b773e0ddf9b66cd4acb70eba0c27a49a25f25c5.scope: Deactivated successfully. Nov 28 04:51:44 localhost nova_compute[279673]: 2025-11-28 09:51:44.644 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:51:44 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0) Nov 28 04:51:44 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0) Nov 28 04:51:44 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "auth get", "entity": "osd.5"} v 0) Nov 28 04:51:44 localhost ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Nov 28 04:51:44 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 28 04:51:44 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 28 04:51:45 localhost ceph-mon[287629]: Reconfiguring osd.2 (monmap changed)... 
Nov 28 04:51:45 localhost ceph-mon[287629]: Reconfiguring daemon osd.2 on np0005538513.localdomain Nov 28 04:51:45 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:45 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Nov 28 04:51:45 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:45 localhost podman[289692]: Nov 28 04:51:45 localhost podman[289692]: 2025-11-28 09:51:45.47620354 +0000 UTC m=+0.086118379 container create 16b827de04db2d9f312e6d444b1ed835c22fa087de6316e3f086b51f5b62b9b9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_snyder, GIT_CLEAN=True, vendor=Red Hat, Inc., RELEASE=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, vcs-type=git, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, name=rhceph, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, version=7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64) Nov 28 04:51:45 localhost systemd[1]: Started libpod-conmon-16b827de04db2d9f312e6d444b1ed835c22fa087de6316e3f086b51f5b62b9b9.scope. Nov 28 04:51:45 localhost systemd[1]: Started libcrun container. 
Nov 28 04:51:45 localhost podman[289692]: 2025-11-28 09:51:45.441183305 +0000 UTC m=+0.051098194 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:51:45 localhost podman[289692]: 2025-11-28 09:51:45.54819692 +0000 UTC m=+0.158111759 container init 16b827de04db2d9f312e6d444b1ed835c22fa087de6316e3f086b51f5b62b9b9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_snyder, RELEASE=main, description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, version=7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, name=rhceph, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 28 04:51:45 localhost epic_snyder[289707]: 167 167 Nov 28 04:51:45 localhost systemd[1]: libpod-16b827de04db2d9f312e6d444b1ed835c22fa087de6316e3f086b51f5b62b9b9.scope: Deactivated successfully. 
Nov 28 04:51:45 localhost podman[289692]: 2025-11-28 09:51:45.557333383 +0000 UTC m=+0.167248222 container start 16b827de04db2d9f312e6d444b1ed835c22fa087de6316e3f086b51f5b62b9b9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_snyder, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, version=7, RELEASE=main, com.redhat.component=rhceph-container, architecture=x86_64, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph) Nov 28 04:51:45 localhost podman[289692]: 2025-11-28 09:51:45.562212244 +0000 UTC m=+0.172127083 container attach 16b827de04db2d9f312e6d444b1ed835c22fa087de6316e3f086b51f5b62b9b9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_snyder, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_BRANCH=main, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, 
com.redhat.component=rhceph-container, version=7, RELEASE=main, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, release=553, io.buildah.version=1.33.12, name=rhceph, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 28 04:51:45 localhost podman[289692]: 2025-11-28 09:51:45.565463525 +0000 UTC m=+0.175378394 container died 16b827de04db2d9f312e6d444b1ed835c22fa087de6316e3f086b51f5b62b9b9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_snyder, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , version=7, GIT_BRANCH=main, vendor=Red Hat, Inc., RELEASE=main, io.buildah.version=1.33.12, GIT_CLEAN=True, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, distribution-scope=public, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, vcs-type=git) Nov 28 04:51:45 localhost podman[289713]: 2025-11-28 09:51:45.631991385 +0000 UTC m=+0.063856778 container remove 
16b827de04db2d9f312e6d444b1ed835c22fa087de6316e3f086b51f5b62b9b9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_snyder, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.openshift.tags=rhceph ceph, distribution-scope=public, io.openshift.expose-services=, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, CEPH_POINT_RELEASE=, ceph=True, vendor=Red Hat, Inc., GIT_CLEAN=True) Nov 28 04:51:45 localhost systemd[1]: libpod-conmon-16b827de04db2d9f312e6d444b1ed835c22fa087de6316e3f086b51f5b62b9b9.scope: Deactivated successfully. 
Nov 28 04:51:45 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0) Nov 28 04:51:45 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0) Nov 28 04:51:45 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Nov 28 04:51:45 localhost ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:51:45 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 28 04:51:45 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 28 04:51:46 localhost ceph-mon[287629]: Reconfiguring osd.5 (monmap changed)... 
Nov 28 04:51:46 localhost ceph-mon[287629]: Reconfiguring daemon osd.5 on np0005538513.localdomain Nov 28 04:51:46 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:46 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:51:46 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:46 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:51:46 localhost systemd[1]: var-lib-containers-storage-overlay-9ed6b433b11fa7279eda7c68f33a34cbfadf33be427c982552b151346a3d90d8-merged.mount: Deactivated successfully. 
Nov 28 04:51:46 localhost podman[289789]: Nov 28 04:51:46 localhost podman[289789]: 2025-11-28 09:51:46.532195571 +0000 UTC m=+0.087212392 container create fd903b507ce9272c3fd7ac3854af4d713578195f6c4d59b6ecb022bc4bc7d495 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_brattain, ceph=True, distribution-scope=public, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, release=553, io.buildah.version=1.33.12, RELEASE=main) Nov 28 04:51:46 localhost systemd[1]: Started libpod-conmon-fd903b507ce9272c3fd7ac3854af4d713578195f6c4d59b6ecb022bc4bc7d495.scope. Nov 28 04:51:46 localhost systemd[1]: Started libcrun container. 
Nov 28 04:51:46 localhost podman[289789]: 2025-11-28 09:51:46.499853639 +0000 UTC m=+0.054870500 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:51:46 localhost podman[289789]: 2025-11-28 09:51:46.601291511 +0000 UTC m=+0.156308332 container init fd903b507ce9272c3fd7ac3854af4d713578195f6c4d59b6ecb022bc4bc7d495 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_brattain, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., release=553, architecture=x86_64, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, distribution-scope=public, maintainer=Guillaume Abrioux , version=7, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7) Nov 28 04:51:46 localhost podman[289789]: 2025-11-28 09:51:46.613380396 +0000 UTC m=+0.168397217 container start fd903b507ce9272c3fd7ac3854af4d713578195f6c4d59b6ecb022bc4bc7d495 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_brattain, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, architecture=x86_64, description=Red Hat Ceph Storage 7, 
CEPH_POINT_RELEASE=, distribution-scope=public, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, GIT_CLEAN=True, vcs-type=git, ceph=True) Nov 28 04:51:46 localhost podman[289789]: 2025-11-28 09:51:46.613651804 +0000 UTC m=+0.168668665 container attach fd903b507ce9272c3fd7ac3854af4d713578195f6c4d59b6ecb022bc4bc7d495 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_brattain, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, version=7, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, architecture=x86_64, io.buildah.version=1.33.12, distribution-scope=public, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., release=553, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , name=rhceph, RELEASE=main) Nov 28 04:51:46 
localhost relaxed_brattain[289804]: 167 167 Nov 28 04:51:46 localhost systemd[1]: libpod-fd903b507ce9272c3fd7ac3854af4d713578195f6c4d59b6ecb022bc4bc7d495.scope: Deactivated successfully. Nov 28 04:51:46 localhost podman[289789]: 2025-11-28 09:51:46.618343929 +0000 UTC m=+0.173360800 container died fd903b507ce9272c3fd7ac3854af4d713578195f6c4d59b6ecb022bc4bc7d495 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_brattain, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_BRANCH=main, version=7, distribution-scope=public, release=553, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, ceph=True, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 28 04:51:46 localhost podman[289809]: 2025-11-28 09:51:46.72135629 +0000 UTC m=+0.092457975 container remove fd903b507ce9272c3fd7ac3854af4d713578195f6c4d59b6ecb022bc4bc7d495 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_brattain, com.redhat.component=rhceph-container, version=7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vcs-type=git, release=553, architecture=x86_64, io.openshift.expose-services=, 
GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, ceph=True, GIT_BRANCH=main, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, io.buildah.version=1.33.12) Nov 28 04:51:46 localhost systemd[1]: libpod-conmon-fd903b507ce9272c3fd7ac3854af4d713578195f6c4d59b6ecb022bc4bc7d495.scope: Deactivated successfully. Nov 28 04:51:46 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0) Nov 28 04:51:46 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0) Nov 28 04:51:46 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Nov 28 04:51:46 localhost ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:51:46 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mgr services"} v 0) Nov 28 04:51:46 localhost ceph-mon[287629]: log_channel(audit) 
log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mgr services"} : dispatch Nov 28 04:51:46 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 28 04:51:46 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 28 04:51:47 localhost ceph-mon[287629]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)... Nov 28 04:51:47 localhost ceph-mon[287629]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain Nov 28 04:51:47 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:47 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:47 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:51:47 localhost ceph-mon[287629]: Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)... Nov 28 04:51:47 localhost ceph-mon[287629]: from='mgr.14190 ' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:51:47 localhost ceph-mon[287629]: Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain Nov 28 04:51:47 localhost podman[289880]: Nov 28 04:51:47 localhost systemd[1]: var-lib-containers-storage-overlay-9480669b0e5659c76a11a25687955aa4eeba5348392bec876060a31ecfba255c-merged.mount: Deactivated successfully. 
Nov 28 04:51:47 localhost podman[289880]: 2025-11-28 09:51:47.482835329 +0000 UTC m=+0.083176708 container create ff15fdf91c938fb824a7ad871b1630ff5bcfeb7ca99457e8ebb0865fc2fb4dbf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_fermi, name=rhceph, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, version=7, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, distribution-scope=public, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 28 04:51:47 localhost systemd[1]: Started libpod-conmon-ff15fdf91c938fb824a7ad871b1630ff5bcfeb7ca99457e8ebb0865fc2fb4dbf.scope. Nov 28 04:51:47 localhost systemd[1]: Started libcrun container. 
Nov 28 04:51:47 localhost podman[289880]: 2025-11-28 09:51:47.450166946 +0000 UTC m=+0.050508355 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:51:47 localhost podman[289880]: 2025-11-28 09:51:47.549605907 +0000 UTC m=+0.149947276 container init ff15fdf91c938fb824a7ad871b1630ff5bcfeb7ca99457e8ebb0865fc2fb4dbf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_fermi, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, com.redhat.component=rhceph-container, architecture=x86_64, version=7, RELEASE=main, vcs-type=git, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, release=553, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, io.openshift.expose-services=, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12) Nov 28 04:51:47 localhost podman[289880]: 2025-11-28 09:51:47.559952208 +0000 UTC m=+0.160293577 container start ff15fdf91c938fb824a7ad871b1630ff5bcfeb7ca99457e8ebb0865fc2fb4dbf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_fermi, description=Red Hat Ceph Storage 7, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, 
CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.buildah.version=1.33.12, GIT_CLEAN=True, ceph=True, RELEASE=main, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, build-date=2025-09-24T08:57:55, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 28 04:51:47 localhost podman[289880]: 2025-11-28 09:51:47.560620778 +0000 UTC m=+0.160962187 container attach ff15fdf91c938fb824a7ad871b1630ff5bcfeb7ca99457e8ebb0865fc2fb4dbf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_fermi, version=7, release=553, vcs-type=git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, vendor=Red Hat, Inc., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.tags=rhceph ceph, RELEASE=main, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7) Nov 28 
04:51:47 localhost recursing_fermi[289896]: 167 167 Nov 28 04:51:47 localhost systemd[1]: libpod-ff15fdf91c938fb824a7ad871b1630ff5bcfeb7ca99457e8ebb0865fc2fb4dbf.scope: Deactivated successfully. Nov 28 04:51:47 localhost podman[289880]: 2025-11-28 09:51:47.573118525 +0000 UTC m=+0.173459924 container died ff15fdf91c938fb824a7ad871b1630ff5bcfeb7ca99457e8ebb0865fc2fb4dbf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_fermi, build-date=2025-09-24T08:57:55, distribution-scope=public, version=7, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, io.buildah.version=1.33.12, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, io.openshift.expose-services=, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7) Nov 28 04:51:47 localhost podman[289901]: 2025-11-28 09:51:47.675172377 +0000 UTC m=+0.092902390 container remove ff15fdf91c938fb824a7ad871b1630ff5bcfeb7ca99457e8ebb0865fc2fb4dbf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_fermi, io.openshift.expose-services=, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, com.redhat.component=rhceph-container, RELEASE=main, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, 
ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, release=553, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, version=7, CEPH_POINT_RELEASE=, vcs-type=git) Nov 28 04:51:47 localhost systemd[1]: libpod-conmon-ff15fdf91c938fb824a7ad871b1630ff5bcfeb7ca99457e8ebb0865fc2fb4dbf.scope: Deactivated successfully. Nov 28 04:51:47 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0) Nov 28 04:51:47 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0) Nov 28 04:51:47 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0) Nov 28 04:51:47 localhost ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 28 04:51:47 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0) Nov 28 04:51:47 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : 
dispatch Nov 28 04:51:47 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 28 04:51:47 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 28 04:51:47 localhost nova_compute[279673]: 2025-11-28 09:51:47.871 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:51:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 04:51:47 localhost podman[289935]: 2025-11-28 09:51:47.98778258 +0000 UTC m=+0.078448771 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': 
'/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 04:51:48 localhost podman[289935]: 2025-11-28 09:51:48.023530937 +0000 UTC m=+0.114197108 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 04:51:48 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 04:51:48 localhost openstack_network_exporter[240658]: ERROR 09:51:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:51:48 localhost openstack_network_exporter[240658]: ERROR 09:51:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:51:48 localhost openstack_network_exporter[240658]: ERROR 09:51:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 04:51:48 localhost openstack_network_exporter[240658]: ERROR 09:51:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 04:51:48 localhost openstack_network_exporter[240658]: Nov 28 04:51:48 localhost openstack_network_exporter[240658]: ERROR 09:51:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 04:51:48 localhost openstack_network_exporter[240658]: Nov 28 04:51:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 04:51:48 localhost systemd[1]: var-lib-containers-storage-overlay-769641ae2eb320e288f9e1b0a52e8e321e86a6ef10a15e10b362bf052d6fb7a0-merged.mount: Deactivated successfully. 
Nov 28 04:51:48 localhost podman[289993]: 2025-11-28 09:51:48.541864434 +0000 UTC m=+0.099951068 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true) Nov 28 04:51:48 localhost podman[290001]: Nov 28 04:51:48 localhost podman[290001]: 2025-11-28 09:51:48.558959893 +0000 UTC m=+0.091052441 container create 
a67931833af89a99c1cbea8b9b9cb55e88105cbbe802bce147fd6ffdb8424e35 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_tu, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, GIT_CLEAN=True, maintainer=Guillaume Abrioux , architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, release=553, vcs-type=git, distribution-scope=public, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, version=7, CEPH_POINT_RELEASE=, ceph=True) Nov 28 04:51:48 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "quorum_status"} v 0) Nov 28 04:51:48 localhost ceph-mon[287629]: log_channel(audit) log [DBG] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "quorum_status"} : dispatch Nov 28 04:51:48 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e6 handle_command mon_command({"prefix": "mon rm", "name": "np0005538510"} v 0) Nov 28 04:51:48 localhost ceph-mon[287629]: log_channel(audit) log [INF] : from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon rm", "name": "np0005538510"} : dispatch Nov 28 04:51:48 localhost ceph-mgr[286105]: ms_deliver_dispatch: unhandled message 0x560b917b9600 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0 Nov 
28 04:51:48 localhost systemd[1]: Started libpod-conmon-a67931833af89a99c1cbea8b9b9cb55e88105cbbe802bce147fd6ffdb8424e35.scope. Nov 28 04:51:48 localhost ceph-mgr[286105]: client.0 ms_handle_reset on v2:172.18.0.105:3300/0 Nov 28 04:51:48 localhost ceph-mgr[286105]: client.0 ms_handle_reset on v2:172.18.0.105:3300/0 Nov 28 04:51:48 localhost ceph-mon[287629]: mon.np0005538513@5(peon) e7 my rank is now 4 (was 5) Nov 28 04:51:48 localhost podman[289993]: 2025-11-28 09:51:48.610256992 +0000 UTC m=+0.168343626 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', 
'/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125) Nov 28 04:51:48 localhost ceph-mgr[286105]: ms_deliver_dispatch: unhandled message 0x560b917b91e0 mon_map magic: 0 from mon.4 v2:172.18.0.106:3300/0 Nov 28 04:51:48 localhost podman[290001]: 2025-11-28 09:51:48.522241356 +0000 UTC m=+0.054333944 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:51:48 localhost ceph-mon[287629]: log_channel(cluster) log [INF] : mon.np0005538513 calling monitor election Nov 28 04:51:48 localhost ceph-mon[287629]: paxos.4).electionLogic(26) init, last seen epoch 26 Nov 28 04:51:48 localhost ceph-mon[287629]: mon.np0005538513@4(electing) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 28 04:51:48 localhost ceph-mon[287629]: mon.np0005538513@4(electing) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 28 04:51:48 localhost systemd[1]: Started libcrun container. Nov 28 04:51:48 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 04:51:48 localhost podman[290001]: 2025-11-28 09:51:48.66764393 +0000 UTC m=+0.199736478 container init a67931833af89a99c1cbea8b9b9cb55e88105cbbe802bce147fd6ffdb8424e35 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_tu, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, distribution-scope=public, GIT_CLEAN=True, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, architecture=x86_64, RELEASE=main, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, ceph=True, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 28 04:51:48 localhost podman[290001]: 2025-11-28 09:51:48.677571787 +0000 UTC m=+0.209664335 container start a67931833af89a99c1cbea8b9b9cb55e88105cbbe802bce147fd6ffdb8424e35 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_tu, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.component=rhceph-container, name=rhceph, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, 
maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, io.buildah.version=1.33.12, version=7, GIT_CLEAN=True, vcs-type=git) Nov 28 04:51:48 localhost podman[290001]: 2025-11-28 09:51:48.677891878 +0000 UTC m=+0.209984476 container attach a67931833af89a99c1cbea8b9b9cb55e88105cbbe802bce147fd6ffdb8424e35 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_tu, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, io.buildah.version=1.33.12, distribution-scope=public, build-date=2025-09-24T08:57:55, architecture=x86_64, RELEASE=main, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, release=553, version=7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 28 04:51:48 localhost condescending_tu[290030]: 167 167 Nov 28 04:51:48 localhost systemd[1]: 
libpod-a67931833af89a99c1cbea8b9b9cb55e88105cbbe802bce147fd6ffdb8424e35.scope: Deactivated successfully. Nov 28 04:51:48 localhost podman[290001]: 2025-11-28 09:51:48.685183053 +0000 UTC m=+0.217275611 container died a67931833af89a99c1cbea8b9b9cb55e88105cbbe802bce147fd6ffdb8424e35 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_tu, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, RELEASE=main, release=553, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public) Nov 28 04:51:48 localhost podman[290035]: 2025-11-28 09:51:48.801465125 +0000 UTC m=+0.099324518 container remove a67931833af89a99c1cbea8b9b9cb55e88105cbbe802bce147fd6ffdb8424e35 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_tu, CEPH_POINT_RELEASE=, release=553, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, vcs-type=git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, 
build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, distribution-scope=public, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, architecture=x86_64) Nov 28 04:51:48 localhost systemd[1]: libpod-conmon-a67931833af89a99c1cbea8b9b9cb55e88105cbbe802bce147fd6ffdb8424e35.scope: Deactivated successfully. Nov 28 04:51:49 localhost systemd[1]: var-lib-containers-storage-overlay-fd152b1b9fa663a649df1b21b39bf9efd4cd17e83bf10fec9fa10105450469b2-merged.mount: Deactivated successfully. Nov 28 04:51:49 localhost nova_compute[279673]: 2025-11-28 09:51:49.649 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:51:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:51:50.828 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:51:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:51:50.829 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:51:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:51:50.830 158130 DEBUG oslo_concurrency.lockutils [-] Lock 
"_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:51:52 localhost nova_compute[279673]: 2025-11-28 09:51:52.903 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:51:53 localhost ceph-mds[282744]: mds.beacon.mds.np0005538513.yljthc missed beacon ack from the monitors Nov 28 04:51:53 localhost ceph-mon[287629]: mon.np0005538513@4(peon) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 28 04:51:53 localhost ceph-mon[287629]: mon.np0005538513@4(peon).osd e86 _set_new_cache_sizes cache_size:1020054728 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:51:53 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 28 04:51:53 localhost ceph-mon[287629]: Reconfiguring daemon mon.np0005538513 on np0005538513.localdomain Nov 28 04:51:53 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "mon rm", "name": "np0005538510"} : dispatch Nov 28 04:51:53 localhost ceph-mon[287629]: Remove daemons mon.np0005538510 Nov 28 04:51:53 localhost ceph-mon[287629]: Safe to remove mon.np0005538510: new quorum should be ['np0005538512', 'np0005538511', 'np0005538515', 'np0005538514', 'np0005538513'] (from ['np0005538512', 'np0005538511', 'np0005538515', 'np0005538514', 'np0005538513']) Nov 28 04:51:53 localhost ceph-mon[287629]: Removing monitor np0005538510 from monmap... 
Nov 28 04:51:53 localhost ceph-mon[287629]: Removing daemon mon.np0005538510 from np0005538510.localdomain -- ports [] Nov 28 04:51:53 localhost ceph-mon[287629]: mon.np0005538515 calling monitor election Nov 28 04:51:53 localhost ceph-mon[287629]: mon.np0005538511 calling monitor election Nov 28 04:51:53 localhost ceph-mon[287629]: mon.np0005538512 calling monitor election Nov 28 04:51:53 localhost ceph-mon[287629]: mon.np0005538513 calling monitor election Nov 28 04:51:53 localhost ceph-mon[287629]: mon.np0005538512 is new leader, mons np0005538512,np0005538511,np0005538515,np0005538513 in quorum (ranks 0,1,2,4) Nov 28 04:51:53 localhost ceph-mon[287629]: Health check failed: 1/5 mons down, quorum np0005538512,np0005538511,np0005538515,np0005538513 (MON_DOWN) Nov 28 04:51:53 localhost ceph-mon[287629]: Health detail: HEALTH_WARN 1/5 mons down, quorum np0005538512,np0005538511,np0005538515,np0005538513 Nov 28 04:51:53 localhost ceph-mon[287629]: [WRN] MON_DOWN: 1/5 mons down, quorum np0005538512,np0005538511,np0005538515,np0005538513 Nov 28 04:51:53 localhost ceph-mon[287629]: mon.np0005538514 (rank 3) addr [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] is down (out of quorum) Nov 28 04:51:53 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:54 localhost nova_compute[279673]: 2025-11-28 09:51:54.675 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:51:54 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:54 localhost ceph-mon[287629]: Reconfiguring crash.np0005538514 (monmap changed)... 
Nov 28 04:51:54 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:51:54 localhost ceph-mon[287629]: Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain Nov 28 04:51:54 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:54 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:54 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Nov 28 04:51:55 localhost ceph-mon[287629]: mon.np0005538513@4(electing) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 28 04:51:55 localhost ceph-mon[287629]: mon.np0005538513@4(electing) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 28 04:51:55 localhost ceph-mon[287629]: mon.np0005538513@4(electing) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 28 04:51:55 localhost ceph-mon[287629]: mon.np0005538513@4(electing) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 28 04:51:55 localhost ceph-mon[287629]: mon.np0005538513@4(peon) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 28 04:51:56 localhost ceph-mon[287629]: mon.np0005538514 calling monitor election Nov 28 04:51:56 localhost ceph-mon[287629]: Removed label mon from host np0005538510.localdomain Nov 28 04:51:56 localhost ceph-mon[287629]: Reconfiguring osd.3 (monmap changed)... 
Nov 28 04:51:56 localhost ceph-mon[287629]: Reconfiguring daemon osd.3 on np0005538514.localdomain Nov 28 04:51:56 localhost ceph-mon[287629]: mon.np0005538511 calling monitor election Nov 28 04:51:56 localhost ceph-mon[287629]: mon.np0005538515 calling monitor election Nov 28 04:51:56 localhost ceph-mon[287629]: mon.np0005538512 calling monitor election Nov 28 04:51:56 localhost ceph-mon[287629]: mon.np0005538512 is new leader, mons np0005538512,np0005538511,np0005538515,np0005538514,np0005538513 in quorum (ranks 0,1,2,3,4) Nov 28 04:51:56 localhost ceph-mon[287629]: Health check cleared: MON_DOWN (was: 1/5 mons down, quorum np0005538512,np0005538511,np0005538515,np0005538513) Nov 28 04:51:56 localhost ceph-mon[287629]: Cluster is now healthy Nov 28 04:51:56 localhost ceph-mon[287629]: overall HEALTH_OK Nov 28 04:51:56 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:56 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:56 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:51:57 localhost ceph-mon[287629]: Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)... 
Nov 28 04:51:57 localhost ceph-mon[287629]: Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain Nov 28 04:51:57 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:57 localhost ceph-mon[287629]: Removed label mgr from host np0005538510.localdomain Nov 28 04:51:57 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:57 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:57 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:51:57 localhost nova_compute[279673]: 2025-11-28 09:51:57.903 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:51:58 localhost ceph-mon[287629]: mon.np0005538513@4(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:51:59 localhost ceph-mon[287629]: Reconfiguring mgr.np0005538514.djozup (monmap changed)... 
Nov 28 04:51:59 localhost ceph-mon[287629]: Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain Nov 28 04:51:59 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:59 localhost ceph-mon[287629]: Removed label _admin from host np0005538510.localdomain Nov 28 04:51:59 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:59 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:59 localhost ceph-mon[287629]: Reconfiguring mon.np0005538514 (monmap changed)... Nov 28 04:51:59 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 28 04:51:59 localhost ceph-mon[287629]: Reconfiguring daemon mon.np0005538514 on np0005538514.localdomain Nov 28 04:51:59 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:59 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:51:59 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:51:59 localhost nova_compute[279673]: 2025-11-28 09:51:59.678 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:52:00 localhost ceph-mon[287629]: Reconfiguring crash.np0005538515 (monmap changed)... 
Nov 28 04:52:00 localhost ceph-mon[287629]: Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain Nov 28 04:52:00 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:00 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:00 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.673 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.675 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.710 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1439344424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.711 12 DEBUG 
ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 63315481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '214922ef-f20b-4c42-9623-3da0c7a2abca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1439344424, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:52:00.675427', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e2b6b7c4-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.847341432, 'message_signature': 'fd77d4f54934f83fa4cbac869fe601f11b796cab24d5a7c3b0a9cecf576636de'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 
'counter_volume': 63315481, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:52:00.675427', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e2b6cdfe-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.847341432, 'message_signature': '9bb4eeda6708b1df838b793ffe1407ac3755aa6f04b415dfde54f8f9bf0fa311'}]}, 'timestamp': '2025-11-28 09:52:00.712471', '_unique_id': '13462806b4f04329a9bbb67daa989585'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:52:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging 
self.transport._send_notification(target, ctxt, message, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:52:00.714 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR 
oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.714 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.716 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.720 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 
04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4b4be75-98c2-444f-9e4c-e546e0077d4b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:52:00.716333', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'e2b810b0-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.888238179, 'message_signature': '0a2928a63a5045fe8272ef405eb37906c7030e8c10005c5594fa63ac440922d3'}]}, 'timestamp': '2025-11-28 09:52:00.720748', '_unique_id': '37897a71ba97470ebfadc45cdda473f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:52:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging conn = 
self.transport.establish_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: 
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:52:00.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 
04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.721 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.722 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.723 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.723 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 1124743927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.723 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 22218411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '725a2f19-ff6d-4aaa-bbcd-99473b57317f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1124743927, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:52:00.723157', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e2b88144-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.847341432, 'message_signature': '8523af116c815982d5e0746c252ace8d6c8d142d166726e39fe3e3f4a737ba2b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22218411, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:52:00.723157', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e2b89120-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.847341432, 'message_signature': 'e83868b96542a5d47d06dc8103b473a1fd2a87cfde6248ceea15b5ba0c364400'}]}, 'timestamp': '2025-11-28 09:52:00.723996', '_unique_id': '5531b26bc3664ff28293053614052cc1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:52:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:52:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.724 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.726 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.726 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9b747b9e-7b19-4a85-82b1-48e55c69ce3b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:52:00.726205', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'e2b8f836-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.888238179, 'message_signature': 'a019dc53cda9572f036bfa636820e93626fef1a3fbcd8ff73def75c1eda745b3'}]}, 'timestamp': '2025-11-28 09:52:00.726664', '_unique_id': '191198ea9dc3403e9097644f87bb89e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:52:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:52:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.727 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.728 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.746 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 12760000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7e8abe6-7a6f-48ce-9d2a-5df44ba0ba5a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12760000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:52:00.728747', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'e2bc2664-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.918732784, 
'message_signature': 'a7d69dfbe71ffa4545c74fe434b25e411be7fb5b3bc78ccd3a61308bd6277488'}]}, 'timestamp': '2025-11-28 09:52:00.747515', '_unique_id': '125284382b7a45e9800ce448b534abaa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:52:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:52:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.748 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.749 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.749 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.750 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'be635916-7a4f-4aef-a8c6-cf7de23c57d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:52:00.749710', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e2bc8e1a-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.847341432, 'message_signature': '6ca2d7cf98ddba9afc1679d4fafc4debf10de10cc4f91d5d516996afc0f1b579'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:52:00.749710', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e2bc9f4a-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.847341432, 'message_signature': '3c074aae2a8709a83fabb4654c20eef250cd91695bb7081bb2aa39b8ddc070fa'}]}, 'timestamp': '2025-11-28 09:52:00.750571', '_unique_id': 'b353bbef85f64753a596fabb6ee3b718'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:52:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:52:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.751 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.752 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 28 04:52:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. 
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.765 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.765 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a2bddbae-8053-4884-86b0-142fb99727fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:52:00.752745', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e2bee9bc-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.924641177, 'message_signature': '184e1374a048393c37f41d02e82685d6817cbc2dead9ebc7e28844c4aab760f9'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:52:00.752745', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e2befa1a-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.924641177, 'message_signature': '6aaf87739b1fb3c92607276f69869575072dedd5288a74a412d5f5e374f0a886'}]}, 'timestamp': '2025-11-28 09:52:00.766003', '_unique_id': 'd8c1ae5223994dceb4a2e1dd70b23767'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in 
establish_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.766 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.768 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.768 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.768 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 51.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c18eaa7b-6265-4d84-ab74-aaf0179af586', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.671875, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:52:00.768589', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'e2bf6fb8-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.918732784, 'message_signature': '838a7799f1611d83e1c08d6c1486ed69562f8d09caa609bfb06372aa0f210f48'}]}, 'timestamp': '2025-11-28 09:52:00.769098', '_unique_id': '5f1c98a334314c3a98ef912b017dc251'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.769 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.771 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.771 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.771 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73b78d13-7923-4a5b-8905-82cf55dfe60c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:52:00.771174', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e2bfd480-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.847341432, 'message_signature': 'fb003e18052d6fd5c72b9e3eeaf19686a391a92e3b583a234a72906190f0f463'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:52:00.771174', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e2bfe420-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.847341432, 'message_signature': '4002d00b6555963384870e1de63e3880e278073b841ff968d80423d6eb254a5a'}]}, 'timestamp': '2025-11-28 09:52:00.771994', '_unique_id': '1cfadfdde2f04f85bb1d7fd9f39b2e5a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.772 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.774 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.774 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9852a5ed-c029-4a75-a2a6-0a546b801e5b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:52:00.774161', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'e2c04a6e-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.888238179, 'message_signature': 'f89e19b6bcbdefd7e85dccf837803a41a7278212908af74d69ddaa1291b82305'}]}, 'timestamp': '2025-11-28 09:52:00.774670', '_unique_id': '51b469ed3e464b7f842a5d490d037ed1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:52:00 localhost
ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:52:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.775 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.776 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.776 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dbeac082-aed8-4899-bd39-0be963da7908', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:52:00.776940', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'e2c0b74c-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.888238179, 'message_signature': 'ab0d3499cd0b96c48dca4640cc93f01a0a31633422cba847f6aca9baad688387'}]}, 'timestamp': '2025-11-28 09:52:00.777426', '_unique_id': '50c33d85b3554e26b3158d4c3e20acd0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:52:00.778 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:52:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:52:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.778 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.779 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.779 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.779 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b2cc4b7e-f7c5-4580-9b2c-433d56c974de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:52:00.779466', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e2c11822-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.847341432, 'message_signature': 'fb9a708f91ece67813e0026a4aafaf193a4c1e69b6efa7777df09662c6cab9d0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:52:00.779466', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e2c12916-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.847341432, 'message_signature': '1a6628759ccbd1312b06ef3e13b26b09fc16a91ffd206cc77ab9804251a111ed'}]}, 'timestamp': '2025-11-28 09:52:00.780313', '_unique_id': '9779c8dad98d4cad969f90df6bb1c02f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:52:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.781 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.782 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.782 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.782 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '07dbbb58-c2f2-41b6-9525-5862ee3b1725', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:52:00.782526', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'e2c18fb4-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.888238179, 'message_signature': '7e0f74f6a84d269ee146070b4a15abb271e8bd67951d17563012f5627404bbb5'}]}, 'timestamp': '2025-11-28 09:52:00.782967', '_unique_id': 'c83970b3eff54a5686439b2f5ddcaaa9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.783 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.784 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.785 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd913e79-ae48-4aea-9044-e63de1b762dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:52:00.785033', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'e2c20d7c-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.888238179, 'message_signature': 'fbc7b428c875cf4bcfb95b6b1d2ffcdb2d0114ad845854a33b23fa40795f3e8a'}]}, 'timestamp': '2025-11-28 09:52:00.786421', '_unique_id': '2a3fb661ebfd49598707f5b3deca2ec1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.788 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.789 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.789 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bc72ff31-18ad-4a84-a9ce-8c6df7c63818', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:52:00.789838', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'e2c2afde-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.888238179, 'message_signature': '560a97d85fb5576dc2a7544723dc01549dd47297b5c63f18d0c0fad33c31562a'}]}, 'timestamp': '2025-11-28 09:52:00.790358', '_unique_id': '47a396969b364b438892aac4f98830a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:52:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.791 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.792 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.792 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.792 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a1738b2f-08e4-43b5-bcc5-5a143ad5d68d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:52:00.792716', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'e2c31e38-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.888238179, 'message_signature': 'c21ed207347dd29547c3d0ca48756bf4ed953ea56492113afa595a422f6ab424'}]}, 'timestamp': '2025-11-28 09:52:00.793206', '_unique_id': '932ef35aaa1f403e925549ba2efb45d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:52:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.794 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.795 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.795 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.795 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4a541e8a-707a-4f5a-8bf0-1038e457a6a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:52:00.795299', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e2c3830a-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.924641177, 'message_signature': '369ae8958d0478269e4305ef5624dabc561c1388aabb2426da84a2e6174c7c8d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:52:00.795299', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 
'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e2c392d2-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.924641177, 'message_signature': '6fda23888d5c22c995657772342be087cec7712e9872fe4073b1d01b6377c248'}]}, 'timestamp': '2025-11-28 09:52:00.796164', '_unique_id': '9ee53aee7905415f87ac73990564e84d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:52:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:52:00.797 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.797 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.798 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.798 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d135c63-c491-43a3-84b2-5cac4cfaf8d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:52:00.798288', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'e2c3f79a-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.888238179, 'message_signature': '3d2c535159bc5f22fa74eab4e44cdab74c4b521ce82314c116e868ef845c0a97'}]}, 'timestamp': '2025-11-28 09:52:00.798735', '_unique_id': 'b30308ab24f74d20878de9149b064949'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.799 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.801 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.801 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.801 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5c4c128-3d0a-47ca-bc88-ef4a73e18f4e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:52:00.801337', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e2c46ce8-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.924641177, 'message_signature': '4eb6dff05fc07fc583459ad0c0051cac14cac41b5962b8f8b672cc477d71513d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:52:00.801337', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e2c476d4-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.924641177, 'message_signature': 'b2fe282a7478fd002ca9821155cf7f86a13da985096ad53ad3f05c7d836bb5b9'}]}, 'timestamp': '2025-11-28 09:52:00.801887', '_unique_id': 'bebce2593dd14e20adf6f15e7f6f4679'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.802 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.803 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.803 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.803 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27f781ed-21e2-4e7f-afdb-8fa0925237c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:52:00.803206', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e2c4b496-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.847341432, 'message_signature': 'a5d6cfe97413103f2e5720eb08c59e01c2fe50a413ceec4200c1bd1ca0fa4089'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:52:00.803206', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e2c4be0a-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.847341432, 'message_signature': 'a559b5edb2c56c45096792907f37a78c026ee6f126a2d0c326539b26512397f5'}]}, 'timestamp': '2025-11-28 09:52:00.803709', '_unique_id': 'f0403d47cb53412aaf264e41ab09af0b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:52:00 localhost
ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:52:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.804 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '45adb67e-b8e1-4391-9229-eed29f91c743', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:52:00.805056', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'e2c4fcf8-cc3f-11f0-a370-fa163eb02593', 'monotonic_time': 11354.888238179, 'message_signature': '53521d9d05f0913a7c0893a46bb900db231c01e3c0e165222496ec6c96f404f5'}]}, 'timestamp': '2025-11-28 09:52:00.805339', '_unique_id': 'cd77c5da1cd74ca9b18c0c122a14bbb5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:52:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:52:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:52:00.805 12 ERROR oslo_messaging.notify.messaging Nov 28 04:52:00 localhost podman[290053]: 
2025-11-28 09:52:00.870610137 +0000 UTC m=+0.097344347 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., 
io.openshift.tags=minimal rhel9, vcs-type=git, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter) Nov 28 04:52:00 localhost podman[290053]: 2025-11-28 09:52:00.885609961 +0000 UTC m=+0.112344191 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm) Nov 28 04:52:00 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. Nov 28 04:52:01 localhost ceph-mon[287629]: Reconfiguring osd.1 (monmap changed)... 
Nov 28 04:52:01 localhost ceph-mon[287629]: Reconfiguring daemon osd.1 on np0005538515.localdomain Nov 28 04:52:01 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:01 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:01 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Nov 28 04:52:02 localhost ceph-mon[287629]: Reconfiguring osd.4 (monmap changed)... Nov 28 04:52:02 localhost ceph-mon[287629]: Reconfiguring daemon osd.4 on np0005538515.localdomain Nov 28 04:52:02 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:02 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:02 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:52:02 localhost nova_compute[279673]: 2025-11-28 09:52:02.944 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:52:03 localhost ceph-mon[287629]: Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)... 
Nov 28 04:52:03 localhost ceph-mon[287629]: Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain Nov 28 04:52:03 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:03 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:03 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:52:03 localhost ceph-mon[287629]: mon.np0005538513@4(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:52:04 localhost ceph-mon[287629]: Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)... Nov 28 04:52:04 localhost ceph-mon[287629]: Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain Nov 28 04:52:04 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:04 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:04 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 28 04:52:04 localhost nova_compute[279673]: 2025-11-28 09:52:04.713 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:52:05 localhost ceph-mon[287629]: Reconfiguring mon.np0005538515 (monmap changed)... 
Nov 28 04:52:05 localhost ceph-mon[287629]: Reconfiguring daemon mon.np0005538515 on np0005538515.localdomain Nov 28 04:52:05 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:05 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 04:52:05 localhost systemd[1]: tmp-crun.iZFTbB.mount: Deactivated successfully. Nov 28 04:52:05 localhost podman[290076]: 2025-11-28 09:52:05.489601107 +0000 UTC m=+0.100464914 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 28 04:52:05 localhost podman[290076]: 2025-11-28 09:52:05.526404777 +0000 UTC m=+0.137268554 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, 
container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 04:52:05 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 04:52:07 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:07 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:07 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 04:52:07 localhost ceph-mon[287629]: Removing np0005538510.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:52:07 localhost ceph-mon[287629]: Updating np0005538511.localdomain:/etc/ceph/ceph.conf Nov 28 04:52:07 localhost ceph-mon[287629]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf Nov 28 04:52:07 localhost ceph-mon[287629]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf Nov 28 04:52:07 localhost ceph-mon[287629]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf Nov 28 04:52:07 localhost ceph-mon[287629]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf Nov 28 04:52:07 localhost ceph-mon[287629]: from='mgr.14190 
172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:07 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:07 localhost nova_compute[279673]: 2025-11-28 09:52:07.949 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:52:08 localhost ceph-mon[287629]: Removing np0005538510.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:52:08 localhost ceph-mon[287629]: Removing np0005538510.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:52:08 localhost ceph-mon[287629]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:52:08 localhost ceph-mon[287629]: Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:52:08 localhost ceph-mon[287629]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:52:08 localhost ceph-mon[287629]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:52:08 localhost ceph-mon[287629]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:52:08 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:08 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:08 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:08 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:08 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' 
Nov 28 04:52:08 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:08 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:08 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:08 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:08 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:08 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:08 localhost ceph-mon[287629]: mon.np0005538513@4(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:52:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:52:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:52:08 localhost systemd[1]: tmp-crun.eouQjm.mount: Deactivated successfully. 
Nov 28 04:52:08 localhost podman[290420]: 2025-11-28 09:52:08.873353293 +0000 UTC m=+0.103419494 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 04:52:08 localhost podman[290420]: 2025-11-28 09:52:08.904765956 +0000 UTC 
m=+0.134832187 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:52:08 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 04:52:08 localhost systemd[1]: tmp-crun.GGu8yV.mount: Deactivated successfully. Nov 28 04:52:08 localhost podman[290419]: 2025-11-28 09:52:08.973285718 +0000 UTC m=+0.202333908 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0) Nov 28 04:52:09 localhost podman[290419]: 2025-11-28 09:52:09.04984776 +0000 UTC m=+0.278895970 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 04:52:09 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 04:52:09 localhost ceph-mon[287629]: Removing daemon mgr.np0005538510.nzitwz from np0005538510.localdomain -- ports [9283, 8765] Nov 28 04:52:09 localhost nova_compute[279673]: 2025-11-28 09:52:09.716 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:52:10 localhost podman[238687]: time="2025-11-28T09:52:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 04:52:10 localhost podman[238687]: @ - - [28/Nov/2025:09:52:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 28 04:52:10 localhost podman[238687]: @ - - [28/Nov/2025:09:52:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18709 "" "Go-http-client/1.1" Nov 28 04:52:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. 
Nov 28 04:52:10 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:10 localhost ceph-mon[287629]: Added label _no_schedule to host np0005538510.localdomain Nov 28 04:52:10 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:10 localhost ceph-mon[287629]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005538510.localdomain Nov 28 04:52:10 localhost ceph-mon[287629]: Removing key for mgr.np0005538510.nzitwz Nov 28 04:52:10 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth rm", "entity": "mgr.np0005538510.nzitwz"} : dispatch Nov 28 04:52:10 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005538510.nzitwz"}]': finished Nov 28 04:52:10 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:10 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:10 localhost systemd[1]: tmp-crun.lQFQ6z.mount: Deactivated successfully. 
Nov 28 04:52:10 localhost podman[290478]: 2025-11-28 09:52:10.674203147 +0000 UTC m=+0.100722250 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 28 04:52:10 localhost podman[290478]: 2025-11-28 09:52:10.687623183 +0000 UTC m=+0.114142326 container exec_died 
1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3) Nov 28 04:52:10 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 04:52:12 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:12 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:12 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 04:52:12 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:12 localhost nova_compute[279673]: 2025-11-28 09:52:12.984 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:52:13 localhost ceph-mon[287629]: Removing daemon crash.np0005538510 from np0005538510.localdomain -- ports [] Nov 28 04:52:13 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:13 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:13 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538510.localdomain"} : dispatch Nov 28 04:52:13 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538510.localdomain"}]': finished Nov 28 04:52:13 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth rm", "entity": "client.crash.np0005538510.localdomain"} : dispatch Nov 28 04:52:13 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd='[{"prefix": "auth rm", "entity": "client.crash.np0005538510.localdomain"}]': finished Nov 28 04:52:13 localhost 
ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:13 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:13 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 04:52:13 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:13 localhost ceph-mon[287629]: mon.np0005538513@4(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:52:14 localhost ceph-mon[287629]: Removing key for client.crash.np0005538510.localdomain Nov 28 04:52:14 localhost ceph-mon[287629]: Removed host np0005538510.localdomain Nov 28 04:52:14 localhost ceph-mon[287629]: Reconfiguring crash.np0005538511 (monmap changed)... Nov 28 04:52:14 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538511.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:52:14 localhost ceph-mon[287629]: Reconfiguring daemon crash.np0005538511 on np0005538511.localdomain Nov 28 04:52:14 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:14 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:14 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 28 04:52:14 localhost ceph-mon[287629]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0. 
Nov 28 04:52:14 localhost ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:14.429212) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 28 04:52:14 localhost ceph-mon[287629]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13 Nov 28 04:52:14 localhost ceph-mon[287629]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323534429305, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 12076, "num_deletes": 528, "total_data_size": 19062293, "memory_usage": 19777560, "flush_reason": "Manual Compaction"} Nov 28 04:52:14 localhost ceph-mon[287629]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started Nov 28 04:52:14 localhost ceph-mon[287629]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323534520476, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 12283092, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 12081, "table_properties": {"data_size": 12227037, "index_size": 29773, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25221, "raw_key_size": 262952, "raw_average_key_size": 26, "raw_value_size": 12056539, "raw_average_value_size": 1196, "num_data_blocks": 1136, "num_entries": 10074, "num_filter_entries": 10074, "num_deletions": 527, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323473, "oldest_key_time": 1764323473, "file_creation_time": 1764323534, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "53876326-3f1b-4342-b386-ddefe9bbd825", "db_session_id": "ND9860OIBZS6OJ35KIN7", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}} Nov 28 04:52:14 localhost ceph-mon[287629]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 91337 microseconds, and 25448 cpu microseconds. Nov 28 04:52:14 localhost ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:14.520551) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 12283092 bytes OK Nov 28 04:52:14 localhost ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:14.520581) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started Nov 28 04:52:14 localhost ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:14.522276) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done Nov 28 04:52:14 localhost ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:14.522306) EVENT_LOG_v1 {"time_micros": 1764323534522299, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0} Nov 28 04:52:14 localhost ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:14.522329) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50 Nov 28 04:52:14 localhost ceph-mon[287629]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 18983401, prev total WAL file size 18984150, number of live WAL files 2. 
Nov 28 04:52:14 localhost ceph-mon[287629]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 04:52:14 localhost ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:14.525713) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033353039' seq:72057594037927935, type:22 .. '6D6772737461740033373631' seq:0, type:0; will stop at (end) Nov 28 04:52:14 localhost ceph-mon[287629]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00 Nov 28 04:52:14 localhost ceph-mon[287629]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(11MB) 8(2012B)] Nov 28 04:52:14 localhost ceph-mon[287629]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323534525813, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 12285104, "oldest_snapshot_seqno": -1} Nov 28 04:52:14 localhost ceph-mon[287629]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 9550 keys, 12274999 bytes, temperature: kUnknown Nov 28 04:52:14 localhost ceph-mon[287629]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323534620119, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 12274999, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12220302, "index_size": 29700, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23941, "raw_key_size": 254014, "raw_average_key_size": 26, "raw_value_size": 12056597, "raw_average_value_size": 1262, "num_data_blocks": 1134, 
"num_entries": 9550, "num_filter_entries": 9550, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323473, "oldest_key_time": 0, "file_creation_time": 1764323534, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "53876326-3f1b-4342-b386-ddefe9bbd825", "db_session_id": "ND9860OIBZS6OJ35KIN7", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}} Nov 28 04:52:14 localhost ceph-mon[287629]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 28 04:52:14 localhost ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:14.620306) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 12274999 bytes Nov 28 04:52:14 localhost ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:14.621709) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 130.2 rd, 130.1 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(11.7, 0.0 +0.0 blob) out(11.7 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 10079, records dropped: 529 output_compression: NoCompression Nov 28 04:52:14 localhost ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:14.621724) EVENT_LOG_v1 {"time_micros": 1764323534621717, "job": 4, "event": "compaction_finished", "compaction_time_micros": 94365, "compaction_time_cpu_micros": 35922, "output_level": 6, "num_output_files": 1, "total_output_size": 12274999, "num_input_records": 10079, "num_output_records": 9550, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 28 04:52:14 localhost ceph-mon[287629]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 04:52:14 localhost ceph-mon[287629]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323534622605, "job": 4, "event": "table_file_deletion", "file_number": 14} Nov 28 04:52:14 localhost ceph-mon[287629]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 04:52:14 localhost ceph-mon[287629]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323534622634, "job": 4, 
"event": "table_file_deletion", "file_number": 8} Nov 28 04:52:14 localhost ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:14.525551) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:52:14 localhost sshd[290534]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:52:14 localhost nova_compute[279673]: 2025-11-28 09:52:14.755 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:52:14 localhost systemd-logind[764]: New session 65 of user tripleo-admin. Nov 28 04:52:14 localhost systemd[1]: Created slice User Slice of UID 1003. Nov 28 04:52:14 localhost systemd[1]: Starting User Runtime Directory /run/user/1003... Nov 28 04:52:14 localhost systemd[1]: Finished User Runtime Directory /run/user/1003. Nov 28 04:52:14 localhost systemd[1]: Starting User Manager for UID 1003... Nov 28 04:52:15 localhost systemd[290538]: Queued start job for default target Main User Target. Nov 28 04:52:15 localhost systemd[290538]: Created slice User Application Slice. Nov 28 04:52:15 localhost systemd[290538]: Started Mark boot as successful after the user session has run 2 minutes. Nov 28 04:52:15 localhost systemd[290538]: Started Daily Cleanup of User's Temporary Directories. Nov 28 04:52:15 localhost systemd[290538]: Reached target Paths. Nov 28 04:52:15 localhost systemd[290538]: Reached target Timers. Nov 28 04:52:15 localhost systemd[290538]: Starting D-Bus User Message Bus Socket... Nov 28 04:52:15 localhost systemd[290538]: Starting Create User's Volatile Files and Directories... Nov 28 04:52:15 localhost systemd[290538]: Listening on D-Bus User Message Bus Socket. Nov 28 04:52:15 localhost systemd[290538]: Reached target Sockets. Nov 28 04:52:15 localhost systemd[290538]: Finished Create User's Volatile Files and Directories. Nov 28 04:52:15 localhost systemd[290538]: Reached target Basic System. 
Nov 28 04:52:15 localhost systemd[290538]: Reached target Main User Target.
Nov 28 04:52:15 localhost systemd[290538]: Startup finished in 131ms.
Nov 28 04:52:15 localhost systemd[1]: Started User Manager for UID 1003.
Nov 28 04:52:15 localhost systemd[1]: Started Session 65 of User tripleo-admin.
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.205312) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323535205391, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 292, "num_deletes": 251, "total_data_size": 91200, "memory_usage": 96920, "flush_reason": "Manual Compaction"}
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323535208605, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 59562, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12086, "largest_seqno": 12373, "table_properties": {"data_size": 57583, "index_size": 218, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 5380, "raw_average_key_size": 19, "raw_value_size": 53620, "raw_average_value_size": 195, "num_data_blocks": 8, "num_entries": 274, "num_filter_entries": 274, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323534, "oldest_key_time": 1764323534, "file_creation_time": 1764323535, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "53876326-3f1b-4342-b386-ddefe9bbd825", "db_session_id": "ND9860OIBZS6OJ35KIN7", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 3333 microseconds, and 1071 cpu microseconds.
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.208655) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 59562 bytes OK
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.208677) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.210271) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.210293) EVENT_LOG_v1 {"time_micros": 1764323535210286, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.210315) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 89037, prev total WAL file size 89037, number of live WAL files 2.
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.210812) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130303430' seq:72057594037927935, type:22 .. '7061786F73003130323932' seq:0, type:0; will stop at (end)
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(58KB)], [15(11MB)]
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323535210844, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 12334561, "oldest_snapshot_seqno": -1}
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 9307 keys, 11191532 bytes, temperature: kUnknown
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323535289199, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 11191532, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11140072, "index_size": 27103, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23301, "raw_key_size": 249390, "raw_average_key_size": 26, "raw_value_size": 10982149, "raw_average_value_size": 1179, "num_data_blocks": 1021, "num_entries": 9307, "num_filter_entries": 9307, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323473, "oldest_key_time": 0, "file_creation_time": 1764323535, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "53876326-3f1b-4342-b386-ddefe9bbd825", "db_session_id": "ND9860OIBZS6OJ35KIN7", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.289519) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 11191532 bytes
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.291912) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 157.2 rd, 142.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 11.7 +0.0 blob) out(10.7 +0.0 blob), read-write-amplify(395.0) write-amplify(187.9) OK, records in: 9824, records dropped: 517 output_compression: NoCompression
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.291943) EVENT_LOG_v1 {"time_micros": 1764323535291929, "job": 6, "event": "compaction_finished", "compaction_time_micros": 78479, "compaction_time_cpu_micros": 32028, "output_level": 6, "num_output_files": 1, "total_output_size": 11191532, "num_input_records": 9824, "num_output_records": 9307, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323535292113, "job": 6, "event": "table_file_deletion", "file_number": 17}
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323535293995, "job": 6, "event": "table_file_deletion", "file_number": 15}
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.210708) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.294110) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.294118) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.294131) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.294134) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 04:52:15 localhost ceph-mon[287629]: rocksdb: (Original Log Time 2025/11/28-09:52:15.294140) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 28 04:52:15 localhost ceph-mon[287629]: Reconfiguring mon.np0005538511 (monmap changed)...
Nov 28 04:52:15 localhost ceph-mon[287629]: Reconfiguring daemon mon.np0005538511 on np0005538511.localdomain
Nov 28 04:52:15 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:15 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:15 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538511.fvuybw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 04:52:15 localhost python3[290681]: ansible-ansible.builtin.lineinfile Invoked with dest=/etc/os-net-config/tripleo_config.yaml insertafter=172.18.0 line= - ip_netmask: 172.18.0.103/24 backup=True path=/etc/os-net-config/tripleo_config.yaml state=present backrefs=False create=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 04:52:16 localhost ceph-mon[287629]: Reconfiguring mgr.np0005538511.fvuybw (monmap changed)...
Nov 28 04:52:16 localhost ceph-mon[287629]: Reconfiguring daemon mgr.np0005538511.fvuybw on np0005538511.localdomain
Nov 28 04:52:16 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:16 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:16 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 04:52:16 localhost python3[290827]: ansible-ansible.legacy.command Invoked with _raw_params=ip a add 172.18.0.103/24 dev vlan21 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 04:52:17 localhost ceph-mon[287629]: Reconfiguring mon.np0005538512 (monmap changed)...
Nov 28 04:52:17 localhost ceph-mon[287629]: Reconfiguring daemon mon.np0005538512 on np0005538512.localdomain
Nov 28 04:52:17 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:17 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:17 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 04:52:17 localhost python3[290972]: ansible-ansible.legacy.command Invoked with _raw_params=ping -W1 -c 3 172.18.0.103 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 04:52:17 localhost nova_compute[279673]: 2025-11-28 09:52:17.984 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:52:18 localhost openstack_network_exporter[240658]: ERROR 09:52:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 04:52:18 localhost openstack_network_exporter[240658]: ERROR 09:52:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 04:52:18 localhost openstack_network_exporter[240658]: ERROR 09:52:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 04:52:18 localhost openstack_network_exporter[240658]: ERROR 09:52:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 04:52:18 localhost openstack_network_exporter[240658]:
Nov 28 04:52:18 localhost openstack_network_exporter[240658]: ERROR 09:52:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 04:52:18 localhost openstack_network_exporter[240658]:
Nov 28 04:52:18 localhost ceph-mon[287629]: Reconfiguring mgr.np0005538512.zyhkxs (monmap changed)...
Nov 28 04:52:18 localhost ceph-mon[287629]: Reconfiguring daemon mgr.np0005538512.zyhkxs on np0005538512.localdomain
Nov 28 04:52:18 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:18 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:18 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:18 localhost ceph-mon[287629]: Reconfiguring crash.np0005538512 (monmap changed)...
Nov 28 04:52:18 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 04:52:18 localhost ceph-mon[287629]: Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 04:52:18 localhost ceph-mon[287629]: mon.np0005538513@4(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 04:52:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 04:52:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 04:52:18 localhost podman[290990]: 2025-11-28 09:52:18.869757298 +0000 UTC m=+0.097542303 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 04:52:18 localhost podman[290990]: 2025-11-28 09:52:18.883527684 +0000 UTC m=+0.111312709 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 04:52:18 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 04:52:18 localhost systemd[1]: tmp-crun.7Bs7yN.mount: Deactivated successfully.
Nov 28 04:52:18 localhost podman[290991]: 2025-11-28 09:52:18.981296903 +0000 UTC m=+0.209184131 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 28 04:52:18 localhost podman[290991]: 2025-11-28 09:52:18.992568432 +0000 UTC m=+0.220455670 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 04:52:19 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 04:52:19 localhost podman[291070]:
Nov 28 04:52:19 localhost podman[291070]: 2025-11-28 09:52:19.399191058 +0000 UTC m=+0.085551922 container create 9e78265ce0d00d6e4e8c80ad3876710da092d71bf07fe53c044efe1c57db97a3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_neumann, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, vcs-type=git, description=Red Hat Ceph Storage 7, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, release=553, maintainer=Guillaume Abrioux , ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, version=7)
Nov 28 04:52:19 localhost systemd[1]: Started libpod-conmon-9e78265ce0d00d6e4e8c80ad3876710da092d71bf07fe53c044efe1c57db97a3.scope.
Nov 28 04:52:19 localhost systemd[1]: Started libcrun container.
Nov 28 04:52:19 localhost podman[291070]: 2025-11-28 09:52:19.362229723 +0000 UTC m=+0.048590617 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 04:52:19 localhost podman[291070]: 2025-11-28 09:52:19.479052902 +0000 UTC m=+0.165413776 container init 9e78265ce0d00d6e4e8c80ad3876710da092d71bf07fe53c044efe1c57db97a3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_neumann, vcs-type=git, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, architecture=x86_64, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, GIT_CLEAN=True, io.openshift.expose-services=, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, release=553, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph)
Nov 28 04:52:19 localhost podman[291070]: 2025-11-28 09:52:19.487877095 +0000 UTC m=+0.174237959 container start 9e78265ce0d00d6e4e8c80ad3876710da092d71bf07fe53c044efe1c57db97a3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_neumann, vendor=Red Hat, Inc., io.buildah.version=1.33.12, vcs-type=git, GIT_BRANCH=main, release=553, GIT_CLEAN=True, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, distribution-scope=public, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 04:52:19 localhost podman[291070]: 2025-11-28 09:52:19.488120443 +0000 UTC m=+0.174481337 container attach 9e78265ce0d00d6e4e8c80ad3876710da092d71bf07fe53c044efe1c57db97a3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_neumann, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, name=rhceph, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, release=553)
Nov 28 04:52:19 localhost hungry_neumann[291085]: 167 167
Nov 28 04:52:19 localhost systemd[1]: libpod-9e78265ce0d00d6e4e8c80ad3876710da092d71bf07fe53c044efe1c57db97a3.scope: Deactivated successfully.
Nov 28 04:52:19 localhost podman[291070]: 2025-11-28 09:52:19.495269734 +0000 UTC m=+0.181630628 container died 9e78265ce0d00d6e4e8c80ad3876710da092d71bf07fe53c044efe1c57db97a3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_neumann, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, release=553, vendor=Red Hat, Inc., RELEASE=main, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, GIT_CLEAN=True, com.redhat.component=rhceph-container, architecture=x86_64, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 04:52:19 localhost podman[291090]: 2025-11-28 09:52:19.591933248 +0000 UTC m=+0.089258836 container remove 9e78265ce0d00d6e4e8c80ad3876710da092d71bf07fe53c044efe1c57db97a3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_neumann, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, RELEASE=main, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, name=rhceph, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, distribution-scope=public, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 04:52:19 localhost systemd[1]: libpod-conmon-9e78265ce0d00d6e4e8c80ad3876710da092d71bf07fe53c044efe1c57db97a3.scope: Deactivated successfully.
Nov 28 04:52:19 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:19 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:19 localhost ceph-mon[287629]: Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 04:52:19 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:52:19 localhost ceph-mon[287629]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain Nov 28 04:52:19 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:19 localhost nova_compute[279673]: 2025-11-28 09:52:19.758 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:52:19 localhost systemd[1]: var-lib-containers-storage-overlay-c7196ca7a0596408ad9b5a2c485d84bcc458f121f7d7d314aa3a45288d25201f-merged.mount: Deactivated successfully. Nov 28 04:52:20 localhost podman[291177]: Nov 28 04:52:20 localhost podman[291177]: 2025-11-28 09:52:20.338002729 +0000 UTC m=+0.079793193 container create be262855e2ac77df34f7c19bf4280a49624d8456b6640a98284654a5ca34597f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dazzling_turing, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, maintainer=Guillaume Abrioux , io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., 
architecture=x86_64, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True) Nov 28 04:52:20 localhost systemd[1]: Started libpod-conmon-be262855e2ac77df34f7c19bf4280a49624d8456b6640a98284654a5ca34597f.scope. Nov 28 04:52:20 localhost systemd[1]: Started libcrun container. Nov 28 04:52:20 localhost podman[291177]: 2025-11-28 09:52:20.304561113 +0000 UTC m=+0.046351617 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:52:20 localhost podman[291177]: 2025-11-28 09:52:20.408901675 +0000 UTC m=+0.150692139 container init be262855e2ac77df34f7c19bf4280a49624d8456b6640a98284654a5ca34597f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dazzling_turing, architecture=x86_64, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , name=rhceph, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, release=553, description=Red Hat Ceph Storage 7, ceph=True, vendor=Red Hat, Inc., GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container) Nov 28 04:52:20 localhost podman[291177]: 2025-11-28 09:52:20.419123282 +0000 UTC m=+0.160913746 container start 
be262855e2ac77df34f7c19bf4280a49624d8456b6640a98284654a5ca34597f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dazzling_turing, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., ceph=True, name=rhceph, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, architecture=x86_64, version=7, io.openshift.expose-services=, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 28 04:52:20 localhost podman[291177]: 2025-11-28 09:52:20.41938842 +0000 UTC m=+0.161178904 container attach be262855e2ac77df34f7c19bf4280a49624d8456b6640a98284654a5ca34597f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dazzling_turing, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , io.openshift.expose-services=, RELEASE=main, GIT_BRANCH=main, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_CLEAN=True, distribution-scope=public, CEPH_POINT_RELEASE=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., 
io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, ceph=True, io.buildah.version=1.33.12) Nov 28 04:52:20 localhost dazzling_turing[291193]: 167 167 Nov 28 04:52:20 localhost systemd[1]: libpod-be262855e2ac77df34f7c19bf4280a49624d8456b6640a98284654a5ca34597f.scope: Deactivated successfully. Nov 28 04:52:20 localhost podman[291177]: 2025-11-28 09:52:20.422695212 +0000 UTC m=+0.164485706 container died be262855e2ac77df34f7c19bf4280a49624d8456b6640a98284654a5ca34597f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dazzling_turing, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, build-date=2025-09-24T08:57:55, version=7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, name=rhceph, io.openshift.expose-services=, architecture=x86_64, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , ceph=True) Nov 28 04:52:20 localhost podman[291198]: 2025-11-28 09:52:20.508799949 +0000 
UTC m=+0.073932040 container remove be262855e2ac77df34f7c19bf4280a49624d8456b6640a98284654a5ca34597f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dazzling_turing, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , RELEASE=main, com.redhat.component=rhceph-container, release=553, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, vcs-type=git, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, ceph=True) Nov 28 04:52:20 localhost systemd[1]: libpod-conmon-be262855e2ac77df34f7c19bf4280a49624d8456b6640a98284654a5ca34597f.scope: Deactivated successfully. Nov 28 04:52:20 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:20 localhost ceph-mon[287629]: Reconfiguring osd.2 (monmap changed)... 
Nov 28 04:52:20 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Nov 28 04:52:20 localhost ceph-mon[287629]: Reconfiguring daemon osd.2 on np0005538513.localdomain Nov 28 04:52:20 localhost systemd[1]: var-lib-containers-storage-overlay-4b391dbd2ed654773b7277cd1822234aea0448970ba6b0d093433c6ac0de246b-merged.mount: Deactivated successfully. Nov 28 04:52:21 localhost podman[291273]: Nov 28 04:52:21 localhost podman[291273]: 2025-11-28 09:52:21.381670668 +0000 UTC m=+0.079365230 container create a05a5965bcf46a342759dccea69443af2d35257cbece258f45b18ec395427b19 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_nightingale, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, RELEASE=main, version=7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , architecture=x86_64, release=553, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, name=rhceph, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True) Nov 28 04:52:21 localhost systemd[1]: Started libpod-conmon-a05a5965bcf46a342759dccea69443af2d35257cbece258f45b18ec395427b19.scope. Nov 28 04:52:21 localhost systemd[1]: Started libcrun container. 
Nov 28 04:52:21 localhost podman[291273]: 2025-11-28 09:52:21.446806696 +0000 UTC m=+0.144501258 container init a05a5965bcf46a342759dccea69443af2d35257cbece258f45b18ec395427b19 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_nightingale, GIT_BRANCH=main, vcs-type=git, io.openshift.expose-services=, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_CLEAN=True, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container) Nov 28 04:52:21 localhost podman[291273]: 2025-11-28 09:52:21.348508161 +0000 UTC m=+0.046202783 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:52:21 localhost podman[291273]: 2025-11-28 09:52:21.456146585 +0000 UTC m=+0.153841197 container start a05a5965bcf46a342759dccea69443af2d35257cbece258f45b18ec395427b19 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_nightingale, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph 
Storage 7 on RHEL 9, release=553, distribution-scope=public, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vendor=Red Hat, Inc., ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_CLEAN=True, RELEASE=main, version=7, architecture=x86_64, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7) Nov 28 04:52:21 localhost podman[291273]: 2025-11-28 09:52:21.456426653 +0000 UTC m=+0.154121215 container attach a05a5965bcf46a342759dccea69443af2d35257cbece258f45b18ec395427b19 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_nightingale, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, io.openshift.expose-services=, release=553, architecture=x86_64, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Nov 28 04:52:21 localhost trusting_nightingale[291288]: 167 167 Nov 28 04:52:21 localhost systemd[1]: libpod-a05a5965bcf46a342759dccea69443af2d35257cbece258f45b18ec395427b19.scope: Deactivated successfully. Nov 28 04:52:21 localhost podman[291273]: 2025-11-28 09:52:21.459282652 +0000 UTC m=+0.156977244 container died a05a5965bcf46a342759dccea69443af2d35257cbece258f45b18ec395427b19 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_nightingale, io.buildah.version=1.33.12, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, architecture=x86_64, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, release=553, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, io.openshift.tags=rhceph ceph, name=rhceph, maintainer=Guillaume Abrioux , distribution-scope=public, RELEASE=main, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc.) 
Nov 28 04:52:21 localhost podman[291293]: 2025-11-28 09:52:21.551937892 +0000 UTC m=+0.085015374 container remove a05a5965bcf46a342759dccea69443af2d35257cbece258f45b18ec395427b19 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_nightingale, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, GIT_BRANCH=main, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, version=7, architecture=x86_64, io.openshift.tags=rhceph ceph, distribution-scope=public, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , ceph=True, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=) Nov 28 04:52:21 localhost systemd[1]: libpod-conmon-a05a5965bcf46a342759dccea69443af2d35257cbece258f45b18ec395427b19.scope: Deactivated successfully. Nov 28 04:52:21 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:21 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:21 localhost ceph-mon[287629]: Reconfiguring osd.5 (monmap changed)... 
Nov 28 04:52:21 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Nov 28 04:52:21 localhost ceph-mon[287629]: Reconfiguring daemon osd.5 on np0005538513.localdomain Nov 28 04:52:21 localhost ceph-mon[287629]: Saving service mon spec with placement label:mon Nov 28 04:52:21 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:21 localhost systemd[1]: var-lib-containers-storage-overlay-14a912ffd230ebe86e4792c06d4f8070f8a1280237a16266ce23f83fba99b912-merged.mount: Deactivated successfully. Nov 28 04:52:22 localhost podman[291370]: Nov 28 04:52:22 localhost podman[291370]: 2025-11-28 09:52:22.381343895 +0000 UTC m=+0.072210208 container create fe5109be67baeb775791f3c9aaea73c2aa276d979252911350b2a55ddbf35669 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_galileo, RELEASE=main, name=rhceph, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, vcs-type=git, version=7, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, architecture=x86_64, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 28 04:52:22 
localhost systemd[1]: Started libpod-conmon-fe5109be67baeb775791f3c9aaea73c2aa276d979252911350b2a55ddbf35669.scope. Nov 28 04:52:22 localhost systemd[1]: Started libcrun container. Nov 28 04:52:22 localhost podman[291370]: 2025-11-28 09:52:22.45062081 +0000 UTC m=+0.141487113 container init fe5109be67baeb775791f3c9aaea73c2aa276d979252911350b2a55ddbf35669 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_galileo, release=553, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, version=7, com.redhat.component=rhceph-container, architecture=x86_64, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, name=rhceph, distribution-scope=public, ceph=True, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 28 04:52:22 localhost podman[291370]: 2025-11-28 09:52:22.352306325 +0000 UTC m=+0.043172658 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:52:22 localhost podman[291370]: 2025-11-28 09:52:22.458863046 +0000 UTC m=+0.149729349 container start fe5109be67baeb775791f3c9aaea73c2aa276d979252911350b2a55ddbf35669 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_galileo, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, 
ceph=True, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, GIT_BRANCH=main, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, RELEASE=main, CEPH_POINT_RELEASE=, release=553, name=rhceph, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, architecture=x86_64, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 28 04:52:22 localhost podman[291370]: 2025-11-28 09:52:22.459120553 +0000 UTC m=+0.149986906 container attach fe5109be67baeb775791f3c9aaea73c2aa276d979252911350b2a55ddbf35669 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_galileo, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, architecture=x86_64, GIT_CLEAN=True, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, ceph=True, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.openshift.expose-services=, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, name=rhceph) Nov 28 04:52:22 localhost sweet_galileo[291385]: 167 167 Nov 28 04:52:22 localhost systemd[1]: libpod-fe5109be67baeb775791f3c9aaea73c2aa276d979252911350b2a55ddbf35669.scope: Deactivated successfully. Nov 28 04:52:22 localhost podman[291370]: 2025-11-28 09:52:22.461820997 +0000 UTC m=+0.152687340 container died fe5109be67baeb775791f3c9aaea73c2aa276d979252911350b2a55ddbf35669 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_galileo, release=553, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , architecture=x86_64, name=rhceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.buildah.version=1.33.12, vendor=Red Hat, Inc., distribution-scope=public, ceph=True, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=) Nov 28 04:52:22 localhost podman[291390]: 2025-11-28 09:52:22.559320648 +0000 UTC m=+0.085211651 container remove fe5109be67baeb775791f3c9aaea73c2aa276d979252911350b2a55ddbf35669 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_galileo, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, 
build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vcs-type=git, release=553, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 28 04:52:22 localhost systemd[1]: libpod-conmon-fe5109be67baeb775791f3c9aaea73c2aa276d979252911350b2a55ddbf35669.scope: Deactivated successfully. Nov 28 04:52:22 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:22 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:22 localhost ceph-mon[287629]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)... 
Nov 28 04:52:22 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:52:22 localhost ceph-mon[287629]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain Nov 28 04:52:22 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:22 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:22 localhost ceph-mon[287629]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:52:22 localhost systemd[1]: var-lib-containers-storage-overlay-76516cfe65d994939cb819dd7c2e351040b3c2cc9d88659c45a8e00945726c60-merged.mount: Deactivated successfully. 
Nov 28 04:52:23 localhost nova_compute[279673]: 2025-11-28 09:52:23.017 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:52:23 localhost podman[291459]: Nov 28 04:52:23 localhost podman[291459]: 2025-11-28 09:52:23.264877103 +0000 UTC m=+0.077629495 container create cc3b6900b442300513f7425b1eebcdd73e23e362a22a67fd0929583ba0539efe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_hofstadter, GIT_BRANCH=main, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, architecture=x86_64, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, ceph=True, distribution-scope=public) Nov 28 04:52:23 localhost systemd[1]: Started libpod-conmon-cc3b6900b442300513f7425b1eebcdd73e23e362a22a67fd0929583ba0539efe.scope. Nov 28 04:52:23 localhost systemd[1]: Started libcrun container. 
Nov 28 04:52:23 localhost podman[291459]: 2025-11-28 09:52:23.329157415 +0000 UTC m=+0.141909817 container init cc3b6900b442300513f7425b1eebcdd73e23e362a22a67fd0929583ba0539efe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_hofstadter, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_BRANCH=main, io.buildah.version=1.33.12, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_CLEAN=True, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, maintainer=Guillaume Abrioux , RELEASE=main, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7) Nov 28 04:52:23 localhost podman[291459]: 2025-11-28 09:52:23.233334647 +0000 UTC m=+0.046087069 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:52:23 localhost podman[291459]: 2025-11-28 09:52:23.338273757 +0000 UTC m=+0.151026149 container start cc3b6900b442300513f7425b1eebcdd73e23e362a22a67fd0929583ba0539efe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_hofstadter, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, vcs-type=git, io.k8s.display-name=Red Hat Ceph 
Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.openshift.expose-services=, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True) Nov 28 04:52:23 localhost podman[291459]: 2025-11-28 09:52:23.338603767 +0000 UTC m=+0.151356219 container attach cc3b6900b442300513f7425b1eebcdd73e23e362a22a67fd0929583ba0539efe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_hofstadter, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., ceph=True, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, version=7, description=Red Hat Ceph Storage 7, RELEASE=main, name=rhceph, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 28 04:52:23 localhost 
zen_hofstadter[291474]: 167 167 Nov 28 04:52:23 localhost systemd[1]: libpod-cc3b6900b442300513f7425b1eebcdd73e23e362a22a67fd0929583ba0539efe.scope: Deactivated successfully. Nov 28 04:52:23 localhost podman[291459]: 2025-11-28 09:52:23.343700535 +0000 UTC m=+0.156452987 container died cc3b6900b442300513f7425b1eebcdd73e23e362a22a67fd0929583ba0539efe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_hofstadter, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, version=7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, name=rhceph, maintainer=Guillaume Abrioux , RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, distribution-scope=public) Nov 28 04:52:23 localhost podman[291479]: 2025-11-28 09:52:23.4361982 +0000 UTC m=+0.083740465 container remove cc3b6900b442300513f7425b1eebcdd73e23e362a22a67fd0929583ba0539efe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_hofstadter, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 
7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, vcs-type=git, CEPH_POINT_RELEASE=, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7) Nov 28 04:52:23 localhost systemd[1]: libpod-conmon-cc3b6900b442300513f7425b1eebcdd73e23e362a22a67fd0929583ba0539efe.scope: Deactivated successfully. Nov 28 04:52:23 localhost ceph-mgr[286105]: ms_deliver_dispatch: unhandled message 0x560b917b8f20 mon_map magic: 0 from mon.4 v2:172.18.0.106:3300/0 Nov 28 04:52:23 localhost ceph-mgr[286105]: ms_deliver_dispatch: unhandled message 0x560b917b91e0 mon_map magic: 0 from mon.0 v2:172.18.0.105:3300/0 Nov 28 04:52:23 localhost ceph-mon[287629]: mon.np0005538513@4(peon) e8 removed from monmap, suicide. 
Nov 28 04:52:23 localhost ceph-mgr[286105]: ms_deliver_dispatch: unhandled message 0x560b917b9600 mon_map magic: 0 from mon.0 v2:172.18.0.105:3300/0 Nov 28 04:52:23 localhost podman[291506]: 2025-11-28 09:52:23.666997169 +0000 UTC m=+0.063885439 container died b72bfc005269ade02879c161a498dd471b1542ee13d035dbcf0188cb36c61613 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mon-np0005538513, distribution-scope=public, description=Red Hat Ceph Storage 7, name=rhceph, architecture=x86_64, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, ceph=True, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vcs-type=git, GIT_CLEAN=True) Nov 28 04:52:23 localhost podman[291506]: 2025-11-28 09:52:23.700679743 +0000 UTC m=+0.097567973 container remove b72bfc005269ade02879c161a498dd471b1542ee13d035dbcf0188cb36c61613 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mon-np0005538513, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, version=7, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, description=Red Hat Ceph 
Storage 7, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, GIT_CLEAN=True, release=553, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_BRANCH=main, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc.) Nov 28 04:52:23 localhost systemd[1]: var-lib-containers-storage-overlay-fa146ef048fc0f615b2043c07baf9f6ab786ee7824e42405da8d80c5752a9ca4-merged.mount: Deactivated successfully. Nov 28 04:52:23 localhost systemd[1]: var-lib-containers-storage-overlay-aa77a2e7ddc389a4da392f7d457efdd2cb70596fdbafdc2c079e625848050438-merged.mount: Deactivated successfully. Nov 28 04:52:24 localhost systemd[1]: ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1@mon.np0005538513.service: Deactivated successfully. Nov 28 04:52:24 localhost systemd[1]: Stopped Ceph mon.np0005538513 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1. Nov 28 04:52:24 localhost systemd[1]: ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1@mon.np0005538513.service: Consumed 4.361s CPU time. Nov 28 04:52:24 localhost systemd[1]: Reloading. Nov 28 04:52:24 localhost systemd-rc-local-generator[291811]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:52:24 localhost systemd-sysv-generator[291818]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Nov 28 04:52:24 localhost nova_compute[279673]: 2025-11-28 09:52:24.787 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:52:24 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:52:24 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:52:24 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:52:24 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:52:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:52:24 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:52:24 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:52:24 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:52:24 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:52:28 localhost nova_compute[279673]: 2025-11-28 09:52:28.018 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:52:29 localhost nova_compute[279673]: 2025-11-28 09:52:29.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:52:29 localhost nova_compute[279673]: 2025-11-28 09:52:29.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:52:29 localhost nova_compute[279673]: 2025-11-28 09:52:29.772 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 28 04:52:29 localhost nova_compute[279673]: 2025-11-28 09:52:29.790 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:52:30 localhost podman[292127]: Nov 28 04:52:30 localhost podman[292127]: 2025-11-28 09:52:30.805723523 +0000 UTC m=+0.080115613 container create 4b9448c347bbd22919dad67c0530ab2e43fd35625b446f389a8d2a69478e9caa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_tharp, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_CLEAN=True, maintainer=Guillaume Abrioux , io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, ceph=True, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, name=rhceph, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, version=7, 
build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 28 04:52:30 localhost systemd[1]: Started libpod-conmon-4b9448c347bbd22919dad67c0530ab2e43fd35625b446f389a8d2a69478e9caa.scope. Nov 28 04:52:30 localhost systemd[1]: Started libcrun container. Nov 28 04:52:30 localhost podman[292127]: 2025-11-28 09:52:30.771622687 +0000 UTC m=+0.046014817 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:52:30 localhost podman[292127]: 2025-11-28 09:52:30.880500639 +0000 UTC m=+0.154892729 container init 4b9448c347bbd22919dad67c0530ab2e43fd35625b446f389a8d2a69478e9caa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_tharp, ceph=True, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, RELEASE=main, GIT_CLEAN=True, io.buildah.version=1.33.12, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, name=rhceph, version=7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux ) Nov 28 04:52:30 localhost podman[292127]: 2025-11-28 09:52:30.892576034 +0000 UTC m=+0.166968124 container start 
4b9448c347bbd22919dad67c0530ab2e43fd35625b446f389a8d2a69478e9caa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_tharp, GIT_CLEAN=True, release=553, distribution-scope=public, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., version=7, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, vcs-type=git, RELEASE=main, io.openshift.expose-services=, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12) Nov 28 04:52:30 localhost podman[292127]: 2025-11-28 09:52:30.892843262 +0000 UTC m=+0.167235402 container attach 4b9448c347bbd22919dad67c0530ab2e43fd35625b446f389a8d2a69478e9caa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_tharp, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., version=7, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, release=553, ceph=True, name=rhceph, distribution-scope=public, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, RELEASE=main, 
CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 28 04:52:30 localhost frosty_tharp[292142]: 167 167 Nov 28 04:52:30 localhost systemd[1]: libpod-4b9448c347bbd22919dad67c0530ab2e43fd35625b446f389a8d2a69478e9caa.scope: Deactivated successfully. Nov 28 04:52:30 localhost podman[292127]: 2025-11-28 09:52:30.897099863 +0000 UTC m=+0.171492003 container died 4b9448c347bbd22919dad67c0530ab2e43fd35625b446f389a8d2a69478e9caa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_tharp, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, architecture=x86_64, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, ceph=True, maintainer=Guillaume Abrioux , distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, name=rhceph, vcs-type=git, version=7, io.buildah.version=1.33.12, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_BRANCH=main) Nov 28 04:52:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 04:52:31 localhost podman[292147]: 2025-11-28 09:52:31.013486589 +0000 UTC m=+0.108150261 container remove 4b9448c347bbd22919dad67c0530ab2e43fd35625b446f389a8d2a69478e9caa (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_tharp, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, distribution-scope=public, io.openshift.expose-services=, maintainer=Guillaume Abrioux , RELEASE=main, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, name=rhceph, version=7, release=553, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-type=git, description=Red Hat Ceph Storage 7) Nov 28 04:52:31 localhost systemd[1]: libpod-conmon-4b9448c347bbd22919dad67c0530ab2e43fd35625b446f389a8d2a69478e9caa.scope: Deactivated successfully. 
Nov 28 04:52:31 localhost podman[292161]: 2025-11-28 09:52:31.083438156 +0000 UTC m=+0.081665281 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, maintainer=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.tags=minimal rhel9, version=9.6, container_name=openstack_network_exporter, name=ubi9-minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, distribution-scope=public) Nov 28 04:52:31 localhost podman[292161]: 2025-11-28 09:52:31.093200628 +0000 UTC m=+0.091427793 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41) Nov 28 04:52:31 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 04:52:31 localhost podman[292234]: Nov 28 04:52:31 localhost podman[292234]: 2025-11-28 09:52:31.715223996 +0000 UTC m=+0.077574573 container create 32c43026a18399bc4d268edd864df58f77b474ef2cd51fcb8ece24acaae71bd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_brattain, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, version=7, architecture=x86_64, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, release=553, GIT_CLEAN=True, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , name=rhceph, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vcs-type=git, distribution-scope=public, ceph=True) Nov 28 04:52:31 localhost systemd[1]: Started libpod-conmon-32c43026a18399bc4d268edd864df58f77b474ef2cd51fcb8ece24acaae71bd6.scope. Nov 28 04:52:31 localhost systemd[1]: Started libcrun container. 
Nov 28 04:52:31 localhost nova_compute[279673]: 2025-11-28 09:52:31.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:52:31 localhost nova_compute[279673]: 2025-11-28 09:52:31.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:52:31 localhost podman[292234]: 2025-11-28 09:52:31.684718702 +0000 UTC m=+0.047069299 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:52:31 localhost podman[292234]: 2025-11-28 09:52:31.787069672 +0000 UTC m=+0.149420259 container init 32c43026a18399bc4d268edd864df58f77b474ef2cd51fcb8ece24acaae71bd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_brattain, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, distribution-scope=public, description=Red Hat Ceph Storage 7, RELEASE=main, version=7, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, 
vendor=Red Hat, Inc., release=553, maintainer=Guillaume Abrioux , architecture=x86_64) Nov 28 04:52:31 localhost podman[292234]: 2025-11-28 09:52:31.798124634 +0000 UTC m=+0.160475211 container start 32c43026a18399bc4d268edd864df58f77b474ef2cd51fcb8ece24acaae71bd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_brattain, io.openshift.expose-services=, GIT_CLEAN=True, io.buildah.version=1.33.12, GIT_BRANCH=main, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, name=rhceph, vendor=Red Hat, Inc., ceph=True, version=7, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=) Nov 28 04:52:31 localhost podman[292234]: 2025-11-28 09:52:31.798384412 +0000 UTC m=+0.160735029 container attach 32c43026a18399bc4d268edd864df58f77b474ef2cd51fcb8ece24acaae71bd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_brattain, name=rhceph, vcs-type=git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, ceph=True, maintainer=Guillaume Abrioux , architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, distribution-scope=public, io.openshift.expose-services=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, com.redhat.component=rhceph-container, RELEASE=main, io.openshift.tags=rhceph ceph) Nov 28 04:52:31 localhost stoic_brattain[292250]: 167 167 Nov 28 04:52:31 localhost systemd[1]: libpod-32c43026a18399bc4d268edd864df58f77b474ef2cd51fcb8ece24acaae71bd6.scope: Deactivated successfully. Nov 28 04:52:31 localhost podman[292234]: 2025-11-28 09:52:31.802101027 +0000 UTC m=+0.164451644 container died 32c43026a18399bc4d268edd864df58f77b474ef2cd51fcb8ece24acaae71bd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_brattain, distribution-scope=public, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_BRANCH=main, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.33.12, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, 
io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., release=553, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 28 04:52:31 localhost systemd[1]: var-lib-containers-storage-overlay-d2472805b6e65faea0a922838297e0a878fcac27f317ddbce17813762cb01898-merged.mount: Deactivated successfully. Nov 28 04:52:31 localhost systemd[1]: var-lib-containers-storage-overlay-8ca598cba0f76d13ff7187c0eeb32ff3a2065c42911d66d61cb0ab9a61e3a769-merged.mount: Deactivated successfully. Nov 28 04:52:31 localhost podman[292255]: 2025-11-28 09:52:31.915559952 +0000 UTC m=+0.097761469 container remove 32c43026a18399bc4d268edd864df58f77b474ef2cd51fcb8ece24acaae71bd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_brattain, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, distribution-scope=public, ceph=True, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_BRANCH=main, GIT_CLEAN=True, name=rhceph, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12) Nov 28 04:52:31 localhost systemd[1]: libpod-conmon-32c43026a18399bc4d268edd864df58f77b474ef2cd51fcb8ece24acaae71bd6.scope: Deactivated successfully. 
Nov 28 04:52:32 localhost nova_compute[279673]: 2025-11-28 09:52:32.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:52:32 localhost nova_compute[279673]: 2025-11-28 09:52:32.772 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:52:32 localhost podman[292332]: Nov 28 04:52:32 localhost podman[292332]: 2025-11-28 09:52:32.794161659 +0000 UTC m=+0.082732285 container create b02b84ecb3dcb870adce08c53d7583b5227dadc28aec69d789fbbb92dee46820 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_darwin, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, architecture=x86_64, com.redhat.component=rhceph-container, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vcs-type=git, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12) Nov 28 04:52:32 localhost 
systemd[1]: Started libpod-conmon-b02b84ecb3dcb870adce08c53d7583b5227dadc28aec69d789fbbb92dee46820.scope. Nov 28 04:52:32 localhost systemd[1]: Started libcrun container. Nov 28 04:52:32 localhost podman[292332]: 2025-11-28 09:52:32.857352465 +0000 UTC m=+0.145923081 container init b02b84ecb3dcb870adce08c53d7583b5227dadc28aec69d789fbbb92dee46820 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_darwin, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.openshift.tags=rhceph ceph, version=7, maintainer=Guillaume Abrioux , ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, RELEASE=main, distribution-scope=public, GIT_CLEAN=True, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7) Nov 28 04:52:32 localhost podman[292332]: 2025-11-28 09:52:32.75936062 +0000 UTC m=+0.047931246 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:52:32 localhost podman[292332]: 2025-11-28 09:52:32.870100981 +0000 UTC m=+0.158671607 container start b02b84ecb3dcb870adce08c53d7583b5227dadc28aec69d789fbbb92dee46820 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_darwin, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, ceph=True, release=553, 
CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, architecture=x86_64, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, name=rhceph, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, version=7) Nov 28 04:52:32 localhost compassionate_darwin[292347]: 167 167 Nov 28 04:52:32 localhost podman[292332]: 2025-11-28 09:52:32.87040683 +0000 UTC m=+0.158977486 container attach b02b84ecb3dcb870adce08c53d7583b5227dadc28aec69d789fbbb92dee46820 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_darwin, version=7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, ceph=True, description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, maintainer=Guillaume Abrioux , RELEASE=main, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, GIT_CLEAN=True, distribution-scope=public, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.buildah.version=1.33.12, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 28 04:52:32 localhost systemd[1]: libpod-b02b84ecb3dcb870adce08c53d7583b5227dadc28aec69d789fbbb92dee46820.scope: Deactivated successfully. Nov 28 04:52:32 localhost podman[292332]: 2025-11-28 09:52:32.876286642 +0000 UTC m=+0.164857258 container died b02b84ecb3dcb870adce08c53d7583b5227dadc28aec69d789fbbb92dee46820 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_darwin, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.component=rhceph-container, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, vcs-type=git, ceph=True) Nov 28 04:52:32 localhost podman[292352]: 2025-11-28 09:52:32.972255265 +0000 UTC m=+0.086439409 container remove b02b84ecb3dcb870adce08c53d7583b5227dadc28aec69d789fbbb92dee46820 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_darwin, GIT_REPO=https://github.com/ceph/ceph-container.git, 
description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, name=rhceph, vendor=Red Hat, Inc., ceph=True, io.buildah.version=1.33.12, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph) Nov 28 04:52:32 localhost systemd[1]: libpod-conmon-b02b84ecb3dcb870adce08c53d7583b5227dadc28aec69d789fbbb92dee46820.scope: Deactivated successfully. 
Nov 28 04:52:33 localhost nova_compute[279673]: 2025-11-28 09:52:33.058 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:52:33 localhost nova_compute[279673]: 2025-11-28 09:52:33.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:52:33 localhost nova_compute[279673]: 2025-11-28 09:52:33.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:52:33 localhost nova_compute[279673]: 2025-11-28 09:52:33.797 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:52:33 localhost nova_compute[279673]: 2025-11-28 09:52:33.797 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:52:33 localhost nova_compute[279673]: 2025-11-28 09:52:33.798 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 
04:52:33 localhost nova_compute[279673]: 2025-11-28 09:52:33.798 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 04:52:33 localhost nova_compute[279673]: 2025-11-28 09:52:33.799 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:52:33 localhost podman[292427]: Nov 28 04:52:33 localhost podman[292427]: 2025-11-28 09:52:33.836122014 +0000 UTC m=+0.080546485 container create f7ff2578c7a2e12d93fa3f943fc29fff90c5a9e48efd6b2a9dceff2f980753f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_mccarthy, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, release=553, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, architecture=x86_64, vcs-type=git, distribution-scope=public, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., 
com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, CEPH_POINT_RELEASE=) Nov 28 04:52:33 localhost systemd[1]: tmp-crun.FbRJ4q.mount: Deactivated successfully. Nov 28 04:52:33 localhost systemd[1]: var-lib-containers-storage-overlay-881907754b830c474470259834474c34dad40024dd2547f3ed3cbcf3fffcda2f-merged.mount: Deactivated successfully. Nov 28 04:52:33 localhost systemd[1]: Started libpod-conmon-f7ff2578c7a2e12d93fa3f943fc29fff90c5a9e48efd6b2a9dceff2f980753f8.scope. Nov 28 04:52:33 localhost podman[292427]: 2025-11-28 09:52:33.804121543 +0000 UTC m=+0.048546044 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:52:33 localhost systemd[1]: Started libcrun container. Nov 28 04:52:33 localhost podman[292427]: 2025-11-28 09:52:33.929333662 +0000 UTC m=+0.173758153 container init f7ff2578c7a2e12d93fa3f943fc29fff90c5a9e48efd6b2a9dceff2f980753f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_mccarthy, CEPH_POINT_RELEASE=, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, GIT_CLEAN=True, com.redhat.component=rhceph-container, release=553, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12) Nov 28 04:52:33 
localhost podman[292427]: 2025-11-28 09:52:33.941825958 +0000 UTC m=+0.186250429 container start f7ff2578c7a2e12d93fa3f943fc29fff90c5a9e48efd6b2a9dceff2f980753f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_mccarthy, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, CEPH_POINT_RELEASE=, GIT_CLEAN=True, version=7, maintainer=Guillaume Abrioux , name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, com.redhat.component=rhceph-container) Nov 28 04:52:33 localhost podman[292427]: 2025-11-28 09:52:33.942169059 +0000 UTC m=+0.186593530 container attach f7ff2578c7a2e12d93fa3f943fc29fff90c5a9e48efd6b2a9dceff2f980753f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_mccarthy, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, build-date=2025-09-24T08:57:55, name=rhceph, GIT_BRANCH=main, version=7, vcs-type=git, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 
on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.openshift.expose-services=, maintainer=Guillaume Abrioux , architecture=x86_64, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 28 04:52:33 localhost angry_mccarthy[292443]: 167 167 Nov 28 04:52:33 localhost systemd[1]: libpod-f7ff2578c7a2e12d93fa3f943fc29fff90c5a9e48efd6b2a9dceff2f980753f8.scope: Deactivated successfully. Nov 28 04:52:33 localhost podman[292427]: 2025-11-28 09:52:33.944968376 +0000 UTC m=+0.189392837 container died f7ff2578c7a2e12d93fa3f943fc29fff90c5a9e48efd6b2a9dceff2f980753f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_mccarthy, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.openshift.expose-services=, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, version=7, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, release=553, GIT_CLEAN=True, RELEASE=main, architecture=x86_64, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) 
Nov 28 04:52:34 localhost podman[292460]: 2025-11-28 09:52:34.029483385 +0000 UTC m=+0.074690265 container remove f7ff2578c7a2e12d93fa3f943fc29fff90c5a9e48efd6b2a9dceff2f980753f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_mccarthy, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, GIT_BRANCH=main, release=553, GIT_CLEAN=True, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., ceph=True) Nov 28 04:52:34 localhost systemd[1]: libpod-conmon-f7ff2578c7a2e12d93fa3f943fc29fff90c5a9e48efd6b2a9dceff2f980753f8.scope: Deactivated successfully. 
Nov 28 04:52:34 localhost nova_compute[279673]: 2025-11-28 09:52:34.325 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.526s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:52:34 localhost nova_compute[279673]: 2025-11-28 09:52:34.391 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:52:34 localhost nova_compute[279673]: 2025-11-28 09:52:34.392 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:52:34 localhost nova_compute[279673]: 2025-11-28 09:52:34.703 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 04:52:34 localhost nova_compute[279673]: 2025-11-28 09:52:34.706 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11838MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": 
"7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 04:52:34 localhost nova_compute[279673]: 2025-11-28 09:52:34.707 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:52:34 localhost nova_compute[279673]: 2025-11-28 09:52:34.707 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:52:34 localhost nova_compute[279673]: 2025-11-28 09:52:34.798 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 04:52:34 localhost nova_compute[279673]: 2025-11-28 09:52:34.799 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 04:52:34 localhost nova_compute[279673]: 2025-11-28 09:52:34.800 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 04:52:34 localhost nova_compute[279673]: 2025-11-28 09:52:34.823 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:52:34 localhost systemd[1]: tmp-crun.SiymzB.mount: Deactivated successfully. Nov 28 04:52:34 localhost systemd[1]: var-lib-containers-storage-overlay-820f235033de985fc02ee247ea603f4d95f34deb878d756e05cb2bed4a5ee24e-merged.mount: Deactivated successfully. 
Nov 28 04:52:34 localhost nova_compute[279673]: 2025-11-28 09:52:34.855 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:52:34 localhost podman[292540]: Nov 28 04:52:34 localhost podman[292540]: 2025-11-28 09:52:34.875576214 +0000 UTC m=+0.101336711 container create 1538f2835789f0d34897b6c2ae399d129cd9d6c1f9fac1b057f086961d956fe1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_greider, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, distribution-scope=public, ceph=True, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, name=rhceph, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, release=553, maintainer=Guillaume Abrioux , version=7, vcs-type=git) Nov 28 04:52:34 localhost systemd[1]: Started libpod-conmon-1538f2835789f0d34897b6c2ae399d129cd9d6c1f9fac1b057f086961d956fe1.scope. Nov 28 04:52:34 localhost systemd[1]: Started libcrun container. 
Nov 28 04:52:34 localhost podman[292540]: 2025-11-28 09:52:34.841573399 +0000 UTC m=+0.067333936 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:52:34 localhost podman[292540]: 2025-11-28 09:52:34.943552699 +0000 UTC m=+0.169313156 container init 1538f2835789f0d34897b6c2ae399d129cd9d6c1f9fac1b057f086961d956fe1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_greider, CEPH_POINT_RELEASE=, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, version=7, RELEASE=main, com.redhat.component=rhceph-container, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 28 04:52:34 localhost podman[292540]: 2025-11-28 09:52:34.951941109 +0000 UTC m=+0.177701566 container start 1538f2835789f0d34897b6c2ae399d129cd9d6c1f9fac1b057f086961d956fe1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_greider, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, io.k8s.description=Red Hat Ceph Storage 7, version=7, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 
in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, vcs-type=git, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, distribution-scope=public, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_CLEAN=True) Nov 28 04:52:34 localhost podman[292540]: 2025-11-28 09:52:34.952221397 +0000 UTC m=+0.177981854 container attach 1538f2835789f0d34897b6c2ae399d129cd9d6c1f9fac1b057f086961d956fe1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_greider, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , RELEASE=main, name=rhceph, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, architecture=x86_64, build-date=2025-09-24T08:57:55, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7) Nov 28 04:52:34 
localhost reverent_greider[292553]: 167 167 Nov 28 04:52:34 localhost systemd[1]: libpod-1538f2835789f0d34897b6c2ae399d129cd9d6c1f9fac1b057f086961d956fe1.scope: Deactivated successfully. Nov 28 04:52:34 localhost podman[292540]: 2025-11-28 09:52:34.958575874 +0000 UTC m=+0.184336421 container died 1538f2835789f0d34897b6c2ae399d129cd9d6c1f9fac1b057f086961d956fe1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_greider, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, ceph=True, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, GIT_BRANCH=main, version=7, name=rhceph) Nov 28 04:52:35 localhost podman[292559]: 2025-11-28 09:52:35.06882209 +0000 UTC m=+0.094039775 container remove 1538f2835789f0d34897b6c2ae399d129cd9d6c1f9fac1b057f086961d956fe1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_greider, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, ceph=True, release=553, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=) Nov 28 04:52:35 localhost systemd[1]: libpod-conmon-1538f2835789f0d34897b6c2ae399d129cd9d6c1f9fac1b057f086961d956fe1.scope: Deactivated successfully. Nov 28 04:52:35 localhost nova_compute[279673]: 2025-11-28 09:52:35.298 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:52:35 localhost nova_compute[279673]: 2025-11-28 09:52:35.306 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 04:52:35 localhost nova_compute[279673]: 2025-11-28 09:52:35.323 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 
'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 04:52:35 localhost nova_compute[279673]: 2025-11-28 09:52:35.326 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 04:52:35 localhost nova_compute[279673]: 2025-11-28 09:52:35.326 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:52:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 04:52:35 localhost systemd[1]: var-lib-containers-storage-overlay-cb887550d0d9e4663a530abb81fb3d5212ae9ad3989098cf7d46352fcd6c38a2-merged.mount: Deactivated successfully. 
Nov 28 04:52:35 localhost podman[292594]: 2025-11-28 09:52:35.858841372 +0000 UTC m=+0.087728250 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 04:52:35 localhost podman[292594]: 2025-11-28 09:52:35.867382186 +0000 UTC m=+0.096269114 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 28 04:52:35 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 04:52:37 localhost podman[292693]: Nov 28 04:52:37 localhost podman[292693]: 2025-11-28 09:52:37.538823021 +0000 UTC m=+0.079598806 container create 6223c4783206a29407ce0a4ba2c9124c2f63fab8e562c2a29fd5fd0bcab372fd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_blackburn, version=7, io.openshift.tags=rhceph ceph, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, GIT_BRANCH=main, GIT_CLEAN=True, com.redhat.component=rhceph-container, architecture=x86_64, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12) Nov 28 04:52:37 localhost systemd[1]: Started libpod-conmon-6223c4783206a29407ce0a4ba2c9124c2f63fab8e562c2a29fd5fd0bcab372fd.scope. Nov 28 04:52:37 localhost podman[292693]: 2025-11-28 09:52:37.503976102 +0000 UTC m=+0.044751917 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:52:37 localhost systemd[1]: Started libcrun container. 
Nov 28 04:52:37 localhost podman[292693]: 2025-11-28 09:52:37.61917897 +0000 UTC m=+0.159954755 container init 6223c4783206a29407ce0a4ba2c9124c2f63fab8e562c2a29fd5fd0bcab372fd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_blackburn, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, release=553, ceph=True, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, build-date=2025-09-24T08:57:55, name=rhceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, architecture=x86_64, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Nov 28 04:52:37 localhost podman[292693]: 2025-11-28 09:52:37.629144469 +0000 UTC m=+0.169920254 container start 6223c4783206a29407ce0a4ba2c9124c2f63fab8e562c2a29fd5fd0bcab372fd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_blackburn, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, version=7, release=553, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, RELEASE=main, ceph=True, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=) Nov 28 04:52:37 localhost podman[292693]: 2025-11-28 09:52:37.629436049 +0000 UTC m=+0.170211884 container attach 6223c4783206a29407ce0a4ba2c9124c2f63fab8e562c2a29fd5fd0bcab372fd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_blackburn, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, name=rhceph, release=553, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, distribution-scope=public, maintainer=Guillaume Abrioux , 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, ceph=True, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container) Nov 28 04:52:37 localhost keen_blackburn[292708]: 167 167 Nov 28 04:52:37 localhost systemd[1]: libpod-6223c4783206a29407ce0a4ba2c9124c2f63fab8e562c2a29fd5fd0bcab372fd.scope: Deactivated successfully. Nov 28 04:52:37 localhost podman[292693]: 2025-11-28 09:52:37.633504084 +0000 UTC m=+0.174279919 container died 6223c4783206a29407ce0a4ba2c9124c2f63fab8e562c2a29fd5fd0bcab372fd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_blackburn, architecture=x86_64, vcs-type=git, distribution-scope=public, maintainer=Guillaume Abrioux , name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 28 04:52:37 localhost podman[292713]: 2025-11-28 09:52:37.73151354 +0000 UTC m=+0.086262463 container remove 6223c4783206a29407ce0a4ba2c9124c2f63fab8e562c2a29fd5fd0bcab372fd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_blackburn, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, com.redhat.component=rhceph-container, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 28 04:52:37 localhost systemd[1]: libpod-conmon-6223c4783206a29407ce0a4ba2c9124c2f63fab8e562c2a29fd5fd0bcab372fd.scope: Deactivated successfully. 
Nov 28 04:52:37 localhost podman[292730]: Nov 28 04:52:37 localhost podman[292730]: 2025-11-28 09:52:37.842127296 +0000 UTC m=+0.070244467 container create bb869c3a0c394b4ec34158681c35ecd579415ddceadc5c323599a2a55cbc1f7c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_satoshi, ceph=True, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, distribution-scope=public, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, version=7, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 28 04:52:37 localhost systemd[1]: Started libpod-conmon-bb869c3a0c394b4ec34158681c35ecd579415ddceadc5c323599a2a55cbc1f7c.scope. Nov 28 04:52:37 localhost systemd[1]: Started libcrun container. 
Nov 28 04:52:37 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90dab33e2ca513faa2eb61fc496ab4abd6ccd8904e7ed39f236081f510491e11/merged/tmp/config supports timestamps until 2038 (0x7fffffff) Nov 28 04:52:37 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90dab33e2ca513faa2eb61fc496ab4abd6ccd8904e7ed39f236081f510491e11/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff) Nov 28 04:52:37 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90dab33e2ca513faa2eb61fc496ab4abd6ccd8904e7ed39f236081f510491e11/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 28 04:52:37 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90dab33e2ca513faa2eb61fc496ab4abd6ccd8904e7ed39f236081f510491e11/merged/var/lib/ceph/mon/ceph-np0005538513 supports timestamps until 2038 (0x7fffffff) Nov 28 04:52:37 localhost podman[292730]: 2025-11-28 09:52:37.894654263 +0000 UTC m=+0.122771474 container init bb869c3a0c394b4ec34158681c35ecd579415ddceadc5c323599a2a55cbc1f7c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_satoshi, RELEASE=main, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, version=7, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.component=rhceph-container, name=rhceph, architecture=x86_64, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, distribution-scope=public, maintainer=Guillaume Abrioux , vcs-type=git, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 28 04:52:37 localhost podman[292730]: 2025-11-28 09:52:37.901101804 +0000 UTC m=+0.129219005 container start bb869c3a0c394b4ec34158681c35ecd579415ddceadc5c323599a2a55cbc1f7c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_satoshi, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, vcs-type=git, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., ceph=True, name=rhceph, architecture=x86_64, io.buildah.version=1.33.12, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, maintainer=Guillaume Abrioux , GIT_CLEAN=True, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7) Nov 28 04:52:37 localhost podman[292730]: 2025-11-28 09:52:37.90130569 +0000 UTC m=+0.129422891 container attach bb869c3a0c394b4ec34158681c35ecd579415ddceadc5c323599a2a55cbc1f7c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_satoshi, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, ceph=True, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.openshift.tags=rhceph ceph, name=rhceph, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, distribution-scope=public, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 28 04:52:37 localhost podman[292730]: 2025-11-28 09:52:37.822050824 +0000 UTC m=+0.050168025 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:52:38 localhost systemd[1]: libpod-bb869c3a0c394b4ec34158681c35ecd579415ddceadc5c323599a2a55cbc1f7c.scope: Deactivated successfully. 
Nov 28 04:52:38 localhost podman[292730]: 2025-11-28 09:52:38.002608378 +0000 UTC m=+0.230725619 container died bb869c3a0c394b4ec34158681c35ecd579415ddceadc5c323599a2a55cbc1f7c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_satoshi, architecture=x86_64, release=553, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, distribution-scope=public, maintainer=Guillaume Abrioux , io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, vcs-type=git, description=Red Hat Ceph Storage 7) Nov 28 04:52:38 localhost nova_compute[279673]: 2025-11-28 09:52:38.057 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:52:38 localhost podman[292771]: 2025-11-28 09:52:38.083253526 +0000 UTC m=+0.072285270 container remove bb869c3a0c394b4ec34158681c35ecd579415ddceadc5c323599a2a55cbc1f7c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_satoshi, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, vendor=Red Hat, Inc., 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, name=rhceph, RELEASE=main, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, ceph=True, io.openshift.expose-services=, version=7, maintainer=Guillaume Abrioux , GIT_CLEAN=True, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55) Nov 28 04:52:38 localhost systemd[1]: libpod-conmon-bb869c3a0c394b4ec34158681c35ecd579415ddceadc5c323599a2a55cbc1f7c.scope: Deactivated successfully. Nov 28 04:52:38 localhost systemd[1]: Reloading. Nov 28 04:52:38 localhost systemd-sysv-generator[292817]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:52:38 localhost systemd-rc-local-generator[292811]: /etc/rc.d/rc.local is not marked executable, skipping. 
Nov 28 04:52:38 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:52:38 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:52:38 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:52:38 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:52:38 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Nov 28 04:52:38 localhost nova_compute[279673]: 2025-11-28 09:52:38.327 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:52:38 localhost nova_compute[279673]: 2025-11-28 09:52:38.328 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 28 04:52:38 localhost nova_compute[279673]: 2025-11-28 09:52:38.328 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 28 04:52:38 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:52:38 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:52:38 localhost 
systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:52:38 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:52:38 localhost nova_compute[279673]: 2025-11-28 09:52:38.402 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 04:52:38 localhost nova_compute[279673]: 2025-11-28 09:52:38.402 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 04:52:38 localhost nova_compute[279673]: 2025-11-28 09:52:38.402 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 28 04:52:38 localhost nova_compute[279673]: 2025-11-28 09:52:38.403 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 04:52:38 localhost systemd[1]: var-lib-containers-storage-overlay-36ebc93b29125860e1b39929ac9957a9d2872b5ba394384215c9eabc3709c0ef-merged.mount: Deactivated successfully. Nov 28 04:52:38 localhost systemd[1]: Reloading. Nov 28 04:52:38 localhost systemd-sysv-generator[292857]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Nov 28 04:52:38 localhost systemd-rc-local-generator[292849]: /etc/rc.d/rc.local is not marked executable, skipping. Nov 28 04:52:38 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:52:38 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Nov 28 04:52:38 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:52:38 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:52:38 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Nov 28 04:52:38 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Nov 28 04:52:38 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:52:38 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:52:38 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Nov 28 04:52:38 localhost nova_compute[279673]: 2025-11-28 09:52:38.784 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 04:52:38 localhost nova_compute[279673]: 2025-11-28 09:52:38.798 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 04:52:38 localhost nova_compute[279673]: 2025-11-28 09:52:38.799 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 28 04:52:38 localhost systemd[1]: Starting Ceph mon.np0005538513 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1... Nov 28 04:52:39 localhost podman[292915]: Nov 28 04:52:39 localhost podman[292915]: 2025-11-28 09:52:39.203394354 +0000 UTC m=+0.076246653 container create ca134fa658f8ef463fc51bf5d195486ba0b637d105c74e04079135961095ec50 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mon-np0005538513, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, maintainer=Guillaume Abrioux , release=553, vcs-type=git, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, CEPH_POINT_RELEASE=, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 
in a fully featured and supported base image., build-date=2025-09-24T08:57:55, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True) Nov 28 04:52:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:52:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:52:39 localhost systemd[1]: tmp-crun.4AzS7j.mount: Deactivated successfully. Nov 28 04:52:39 localhost podman[292915]: 2025-11-28 09:52:39.17262642 +0000 UTC m=+0.045478809 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:52:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/20ed6216cd9662b2a673d46f9bd63d7634576c3919ee4f8211cda62883165257/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Nov 28 04:52:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/20ed6216cd9662b2a673d46f9bd63d7634576c3919ee4f8211cda62883165257/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Nov 28 04:52:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/20ed6216cd9662b2a673d46f9bd63d7634576c3919ee4f8211cda62883165257/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Nov 28 04:52:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/20ed6216cd9662b2a673d46f9bd63d7634576c3919ee4f8211cda62883165257/merged/var/lib/ceph/mon/ceph-np0005538513 supports timestamps until 2038 (0x7fffffff) Nov 28 04:52:39 localhost podman[292915]: 2025-11-28 09:52:39.290569845 +0000 UTC m=+0.163422254 container init ca134fa658f8ef463fc51bf5d195486ba0b637d105c74e04079135961095ec50 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, 
name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mon-np0005538513, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, maintainer=Guillaume Abrioux , vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., release=553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, name=rhceph, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, version=7, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph) Nov 28 04:52:39 localhost podman[292929]: 2025-11-28 09:52:39.334378311 +0000 UTC m=+0.090140533 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 28 04:52:39 localhost ceph-mon[292954]: set uid:gid to 167:167 (ceph:ceph) Nov 28 04:52:39 localhost ceph-mon[292954]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mon, pid 2 Nov 28 04:52:39 localhost ceph-mon[292954]: pidfile_write: ignore empty --pid-file Nov 28 04:52:39 localhost podman[292929]: 2025-11-28 09:52:39.340303055 +0000 UTC m=+0.096065287 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible) Nov 28 04:52:39 localhost ceph-mon[292954]: load: jerasure load: lrc Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: RocksDB version: 7.9.2 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Git sha 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Compile date 2025-09-23 00:00:00 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: DB SUMMARY Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: DB Session ID: MM4LCQC4OTZXQR5A0TS6 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: CURRENT file: CURRENT Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: IDENTITY file: IDENTITY Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: MANIFEST file: MANIFEST-000005 size: 59 Bytes Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: SST files in 
/var/lib/ceph/mon/ceph-np0005538513/store.db dir, Total Num: 0, files: Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005538513/store.db: 000004.log size: 886 ; Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.error_if_exists: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.create_if_missing: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.paranoid_checks: 1 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.flush_verify_memtable_count: 1 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.env: 0x55b5b91039e0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.fs: PosixFileSystem Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.info_log: 0x55b5bb67cd20 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.max_file_opening_threads: 16 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.statistics: (nil) Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.use_fsync: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.max_log_file_size: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.max_manifest_file_size: 1073741824 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.log_file_time_to_roll: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.keep_log_file_num: 1000 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.recycle_log_file_num: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.allow_fallocate: 1 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.allow_mmap_reads: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.allow_mmap_writes: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.use_direct_reads: 0 Nov 
28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.create_missing_column_families: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.db_log_dir: Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.wal_dir: Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.table_cache_numshardbits: 6 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.WAL_ttl_seconds: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.WAL_size_limit_MB: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.manifest_preallocation_size: 4194304 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.is_fd_close_on_exec: 1 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.advise_random_on_open: 1 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.db_write_buffer_size: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.write_buffer_manager: 0x55b5bb68d540 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.access_hint_on_compaction_start: 1 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.random_access_max_buffer_size: 1048576 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.use_adaptive_mutex: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.rate_limiter: (nil) Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.wal_recovery_mode: 2 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.enable_thread_tracking: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.enable_pipelined_write: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.unordered_write: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: 
Options.allow_concurrent_memtable_write: 1 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.write_thread_max_yield_usec: 100 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.write_thread_slow_yield_usec: 3 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.row_cache: None Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.wal_filter: None Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.avoid_flush_during_recovery: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.allow_ingest_behind: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.two_write_queues: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.manual_wal_flush: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.wal_compression: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.atomic_flush: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.persist_stats_to_disk: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.write_dbid_to_manifest: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.log_readahead_size: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.file_checksum_gen_factory: Unknown Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.best_efforts_recovery: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.allow_data_in_errors: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.db_host_id: __hostname__ Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.enforce_single_del_contracts: true Nov 28 04:52:39 
localhost ceph-mon[292954]: rocksdb: Options.max_background_jobs: 2 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.max_background_compactions: -1 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.max_subcompactions: 1 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.avoid_flush_during_shutdown: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.writable_file_max_buffer_size: 1048576 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.delayed_write_rate : 16777216 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.max_total_wal_size: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.stats_dump_period_sec: 600 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.stats_persist_period_sec: 600 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.stats_history_buffer_size: 1048576 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.max_open_files: -1 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.bytes_per_sync: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.wal_bytes_per_sync: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.strict_bytes_per_sync: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.compaction_readahead_size: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.max_background_flushes: -1 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Compression algorithms supported: Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: #011kZSTD supported: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: #011kXpressCompression supported: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: #011kBZip2Compression supported: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: 
#011kLZ4Compression supported: 1 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: #011kZlibCompression supported: 1 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: #011kLZ4HCCompression supported: 1 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: #011kSnappyCompression supported: 1 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Fast CRC32 supported: Supported on x86 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: DMutex implementation: pthread_mutex_t Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005538513/store.db/MANIFEST-000005 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.comparator: leveldb.BytewiseComparator Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.merge_operator: Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.compaction_filter: None Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.compaction_filter_factory: None Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.sst_partitioner_factory: None Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.memtable_factory: SkipListFactory Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.table_factory: BlockBasedTable Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55b5bb67c980)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55b5bb679350#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 
capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.write_buffer_size: 33554432 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.max_write_buffer_number: 2 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.compression: NoCompression Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.bottommost_compression: Disabled Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.prefix_extractor: nullptr Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.num_levels: 7 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.min_write_buffer_number_to_merge: 1 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.bottommost_compression_opts.level: 32767 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: 
Options.bottommost_compression_opts.max_dict_bytes: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.bottommost_compression_opts.enabled: false Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.compression_opts.window_bits: -14 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.compression_opts.level: 32767 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.compression_opts.strategy: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.compression_opts.parallel_threads: 1 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.compression_opts.enabled: false Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.level0_file_num_compaction_trigger: 4 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.level0_stop_writes_trigger: 36 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.target_file_size_base: 67108864 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.target_file_size_multiplier: 1 Nov 28 04:52:39 localhost 
ceph-mon[292954]: rocksdb: Options.max_bytes_for_level_base: 268435456 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.max_compaction_bytes: 1677721600 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.arena_block_size: 1048576 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.disable_auto_compactions: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.compaction_style: kCompactionStyleLevel Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.compaction_options_universal.size_ratio: 1 
Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.table_properties_collectors: Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.inplace_update_support: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.inplace_update_num_locks: 10000 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.memtable_whole_key_filtering: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.memtable_huge_page_size: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.bloom_locality: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.max_successive_merges: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.optimize_filters_for_hits: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.paranoid_file_checks: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.force_consistency_checks: 1 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.report_bg_io_stats: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.ttl: 2592000 Nov 28 04:52:39 localhost 
ceph-mon[292954]: rocksdb: Options.periodic_compaction_seconds: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.preclude_last_level_data_seconds: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.preserve_internal_time_seconds: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.enable_blob_files: false Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.min_blob_size: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.blob_file_size: 268435456 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.blob_compression_type: NoCompression Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.enable_blob_garbage_collection: false Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.blob_compaction_readahead_size: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.blob_file_starting_level: 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005538513/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 49d3ae8b-2ff6-4713-88ed-5986b1f8221e Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323559348949, "job": 1, "event": "recovery_started", "wal_files": [4]} Nov 28 04:52:39 localhost 
ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2 Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323559353286, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 2012, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 898, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 776, "raw_average_value_size": 155, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}} Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323559353426, "job": 1, "event": "recovery_finished"} Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:5047] Creating manifest 10 Nov 28 04:52:39 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55b5bb6a0e00
Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: DB pointer 0x55b5bb796000
Nov 28 04:52:39 localhost ceph-mon[292954]: mon.np0005538513 does not exist in monmap, will attempt to join an existing cluster
Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 04:52:39 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 1/0 1.96 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.4 0.00 0.00 1 0.004 0 0 0.0 0.0#012 Sum 1/0 1.96 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.4 0.00 0.00 1 0.004 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.4 0.00 0.00 1 0.004 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.4 0.00 0.00 1 0.004 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.10 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.10 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55b5bb679350#2 capacity: 512.00 MB usage: 1.30 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 3.4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1,1.08 KB,0.000205636%) FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Nov 28 04:52:39 localhost ceph-mon[292954]: using public_addr v2:172.18.0.103:0/0 -> [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Nov 28 04:52:39 localhost ceph-mon[292954]: starting mon.np0005538513 rank -1 at public addrs [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] at bind addrs [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005538513 fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 04:52:39 localhost ceph-mon[292954]: mon.np0005538513@-1(???) e0 preinit fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 04:52:39 localhost ceph-mon[292954]: mon.np0005538513@-1(synchronizing) e8 sync_obtain_latest_monmap
Nov 28 04:52:39 localhost ceph-mon[292954]: mon.np0005538513@-1(synchronizing) e8 sync_obtain_latest_monmap obtained monmap e8
Nov 28 04:52:39 localhost podman[292927]: 2025-11-28 09:52:39.393873084 +0000 UTC m=+0.150744210 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 28 04:52:39 localhost podman[292915]: 2025-11-28 09:52:39.402049817 +0000 UTC m=+0.274902156 container start ca134fa658f8ef463fc51bf5d195486ba0b637d105c74e04079135961095ec50 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mon-np0005538513, com.redhat.component=rhceph-container, version=7, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, name=rhceph, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.buildah.version=1.33.12, release=553)
Nov 28 04:52:39 localhost bash[292915]: ca134fa658f8ef463fc51bf5d195486ba0b637d105c74e04079135961095ec50
Nov 28 04:52:39 localhost systemd[1]: Started Ceph mon.np0005538513 for 2c5417c9-00eb-57d5-a565-ddecbc7995c1.
Nov 28 04:52:39 localhost podman[292927]: 2025-11-28 09:52:39.458661402 +0000 UTC m=+0.215532528 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 04:52:39 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 04:52:39 localhost ceph-mon[292954]: mon.np0005538513@-1(synchronizing).mds e17 new map
Nov 28 04:52:39 localhost ceph-mon[292954]: mon.np0005538513@-1(synchronizing).mds e17 print_map#012e17#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#01115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-11-28T08:07:30.958224+0000#012modified#0112025-11-28T09:49:53.259185+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#01183#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=26449}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[6]#012metadata_pool#0117#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 26449 members: 26449#012[mds.mds.np0005538514.umgtoy{0:26449} state up:active seq 12 addr [v2:172.18.0.107:6808/1969410151,v1:172.18.0.107:6809/1969410151] compat {c=[1],r=[1],i=[17ff]}]#012 #012 #012Standby daemons:#012 #012[mds.mds.np0005538513.yljthc{-1:16968} state up:standby seq 1 addr [v2:172.18.0.106:6808/2782735008,v1:172.18.0.106:6809/2782735008] compat {c=[1],r=[1],i=[17ff]}]#012[mds.mds.np0005538515.anvatb{-1:26446} state up:standby seq 1 addr [v2:172.18.0.108:6808/2640180,v1:172.18.0.108:6809/2640180] compat {c=[1],r=[1],i=[17ff]}]
Nov 28 04:52:39 localhost ceph-mon[292954]: mon.np0005538513@-1(synchronizing).osd e86 crush map has features 3314933000852226048, adjusting msgr requires
Nov 28 04:52:39 localhost ceph-mon[292954]: mon.np0005538513@-1(synchronizing).osd e86 crush map has features 288514051259236352, adjusting msgr requires
Nov 28 04:52:39 localhost ceph-mon[292954]: mon.np0005538513@-1(synchronizing).osd e86 crush map has features 288514051259236352, adjusting msgr requires
Nov 28 04:52:39 localhost ceph-mon[292954]: mon.np0005538513@-1(synchronizing).osd e86 crush map has features 288514051259236352, adjusting msgr requires
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring crash.np0005538514 (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: mon.np0005538514 calling monitor election
Nov 28 04:52:39 localhost ceph-mon[292954]: Removed label mon from host np0005538510.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring osd.3 (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon osd.3 on np0005538514.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: mon.np0005538511 calling monitor election
Nov 28 04:52:39 localhost ceph-mon[292954]: mon.np0005538515 calling monitor election
Nov 28 04:52:39 localhost ceph-mon[292954]: mon.np0005538512 calling monitor election
Nov 28 04:52:39 localhost ceph-mon[292954]: mon.np0005538512 is new leader, mons np0005538512,np0005538511,np0005538515,np0005538514,np0005538513 in quorum (ranks 0,1,2,3,4)
Nov 28 04:52:39 localhost ceph-mon[292954]: Health check cleared: MON_DOWN (was: 1/5 mons down, quorum np0005538512,np0005538511,np0005538515,np0005538513)
Nov 28 04:52:39 localhost ceph-mon[292954]: Cluster is now healthy
Nov 28 04:52:39 localhost ceph-mon[292954]: overall HEALTH_OK
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: Removed label mgr from host np0005538510.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring mgr.np0005538514.djozup (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: Removed label _admin from host np0005538510.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring mon.np0005538514 (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon mon.np0005538514 on np0005538514.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring crash.np0005538515 (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring osd.1 (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon osd.1 on np0005538515.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring osd.4 (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon osd.4 on np0005538515.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring mon.np0005538515 (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon mon.np0005538515 on np0005538515.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Removing np0005538510.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 04:52:39 localhost ceph-mon[292954]: Updating np0005538511.localdomain:/etc/ceph/ceph.conf
Nov 28 04:52:39 localhost ceph-mon[292954]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 04:52:39 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 04:52:39 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 04:52:39 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: Removing np0005538510.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 04:52:39 localhost ceph-mon[292954]: Removing np0005538510.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 04:52:39 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 04:52:39 localhost ceph-mon[292954]: Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 04:52:39 localhost ceph-mon[292954]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 04:52:39 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 04:52:39 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: Removing daemon mgr.np0005538510.nzitwz from np0005538510.localdomain -- ports [9283, 8765]
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: Added label _no_schedule to host np0005538510.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005538510.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: Removing key for mgr.np0005538510.nzitwz
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth rm", "entity": "mgr.np0005538510.nzitwz"} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005538510.nzitwz"}]': finished
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: Removing daemon crash.np0005538510 from np0005538510.localdomain -- ports []
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538510.localdomain"} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538510.localdomain"}]': finished
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth rm", "entity": "client.crash.np0005538510.localdomain"} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd='[{"prefix": "auth rm", "entity": "client.crash.np0005538510.localdomain"}]': finished
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: Removing key for client.crash.np0005538510.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: Removed host np0005538510.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring crash.np0005538511 (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538511.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538511 on np0005538511.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring mon.np0005538511 (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon mon.np0005538511 on np0005538511.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538511.fvuybw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring mgr.np0005538511.fvuybw (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon mgr.np0005538511.fvuybw on np0005538511.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring mon.np0005538512 (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon mon.np0005538512 on np0005538512.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring mgr.np0005538512.zyhkxs (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon mgr.np0005538512.zyhkxs on np0005538512.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring crash.np0005538512 (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring osd.2 (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring osd.5 (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: Saving service mon spec with placement label:mon
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Remove daemons mon.np0005538513
Nov 28 04:52:39 localhost ceph-mon[292954]: Safe to remove mon.np0005538513: new quorum should be ['np0005538512', 'np0005538511', 'np0005538515', 'np0005538514'] (from ['np0005538512', 'np0005538511', 'np0005538515', 'np0005538514'])
Nov 28 04:52:39 localhost ceph-mon[292954]: Removing monitor np0005538513 from monmap...
Nov 28 04:52:39 localhost ceph-mon[292954]: Removing daemon mon.np0005538513 from np0005538513.localdomain -- ports []
Nov 28 04:52:39 localhost ceph-mon[292954]: mon.np0005538511 calling monitor election
Nov 28 04:52:39 localhost ceph-mon[292954]: mon.np0005538515 calling monitor election
Nov 28 04:52:39 localhost ceph-mon[292954]: mon.np0005538514 calling monitor election
Nov 28 04:52:39 localhost ceph-mon[292954]: mon.np0005538512 calling monitor election
Nov 28 04:52:39 localhost ceph-mon[292954]: mon.np0005538512 is new leader, mons np0005538512,np0005538511,np0005538515,np0005538514 in quorum (ranks 0,1,2,3)
Nov 28 04:52:39 localhost ceph-mon[292954]: overall HEALTH_OK
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Updating np0005538511.localdomain:/etc/ceph/ceph.conf
Nov 28 04:52:39 localhost ceph-mon[292954]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf
Nov 28 04:52:39 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 04:52:39 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 04:52:39 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 04:52:39 localhost ceph-mon[292954]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 04:52:39 localhost ceph-mon[292954]: Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 04:52:39 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 04:52:39 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 04:52:39 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538511.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring crash.np0005538511 (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538511 on np0005538511.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring mgr.np0005538511.fvuybw (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538511.fvuybw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon mgr.np0005538511.fvuybw on np0005538511.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring mgr.np0005538512.zyhkxs (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon mgr.np0005538512.zyhkxs on np0005538512.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring crash.np0005538512 (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring osd.2 (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring osd.5 (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring crash.np0005538514 (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring osd.0 (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon osd.0 on np0005538514.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Deploying daemon mon.np0005538513 on np0005538513.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring osd.3 (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon osd.3 on np0005538514.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)...
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:39 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 04:52:39 localhost ceph-mon[292954]: mon.np0005538513@-1(synchronizing).paxosservice(auth 1..37) refresh upgraded, format 0 -> 3
Nov 28 04:52:39 localhost ceph-mgr[286105]: ms_deliver_dispatch: unhandled message 0x560b917b8f20 mon_map magic: 0 from mon.0 v2:172.18.0.105:3300/0
Nov 28 04:52:39 localhost nova_compute[279673]: 2025-11-28 09:52:39.825 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:52:40 localhost podman[238687]: time="2025-11-28T09:52:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 04:52:40 localhost podman[238687]: @ - - [28/Nov/2025:09:52:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 28 04:52:40 localhost podman[238687]: @ - - [28/Nov/2025:09:52:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18696 "" "Go-http-client/1.1"
Nov 28 04:52:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 04:52:40 localhost systemd[1]: tmp-crun.eahCiq.mount: Deactivated successfully.
Nov 28 04:52:40 localhost podman[293015]: 2025-11-28 09:52:40.859577658 +0000 UTC m=+0.097061448 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 04:52:40 localhost podman[293015]: 2025-11-28 09:52:40.871415894 +0000 UTC m=+0.108899684 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible)
Nov 28 04:52:40 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 04:52:41 localhost ceph-mon[292954]: mon.np0005538513@-1(probing) e9 my rank is now 4 (was -1)
Nov 28 04:52:41 localhost ceph-mon[292954]: log_channel(cluster) log [INF] : mon.np0005538513 calling monitor election
Nov 28 04:52:41 localhost ceph-mon[292954]: paxos.4).electionLogic(0) init, first boot, initializing epoch at 1
Nov 28 04:52:41 localhost ceph-mon[292954]: mon.np0005538513@4(electing) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 28 04:52:43 localhost nova_compute[279673]: 2025-11-28 09:52:43.093 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:52:44 localhost nova_compute[279673]: 2025-11-28 09:52:44.870 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:52:46 localhost ceph-mon[292954]: log_channel(cluster) log [INF] : mon.np0005538513 calling monitor election
Nov 28 04:52:46 localhost ceph-mon[292954]: paxos.4).electionLogic(0) init, first boot, initializing epoch at 1
Nov 28 04:52:46 localhost ceph-mon[292954]: mon.np0005538513@4(electing) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 28 04:52:46 localhost ceph-mon[292954]: mon.np0005538513@4(electing) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 28 04:52:46 localhost ceph-mon[292954]: mon.np0005538513@4(peon) e9 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Nov 28 04:52:46 localhost ceph-mon[292954]: mon.np0005538513@4(peon) e9 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Nov 28 04:52:46 localhost ceph-mon[292954]: mon.np0005538511 calling monitor election
Nov 28 04:52:46 localhost ceph-mon[292954]: mon.np0005538515 calling monitor election
Nov 28 04:52:46 localhost ceph-mon[292954]: mon.np0005538514 calling monitor election
Nov 28 04:52:46 localhost ceph-mon[292954]: mon.np0005538512 calling monitor election
Nov 28 04:52:46 localhost ceph-mon[292954]: mon.np0005538512 is new leader, mons np0005538512,np0005538511,np0005538515,np0005538514 in quorum (ranks 0,1,2,3)
Nov 28 04:52:46 localhost ceph-mon[292954]: Health check failed: 1/5 mons down, quorum np0005538512,np0005538511,np0005538515,np0005538514 (MON_DOWN)
Nov 28 04:52:46 localhost ceph-mon[292954]: Health detail: HEALTH_WARN 1/5 mons down, quorum np0005538512,np0005538511,np0005538515,np0005538514
Nov 28 04:52:46 localhost ceph-mon[292954]: [WRN] MON_DOWN: 1/5 mons down, quorum np0005538512,np0005538511,np0005538515,np0005538514
Nov 28 04:52:46 localhost ceph-mon[292954]: mon.np0005538513 (rank 4) addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] is down (out of quorum)
Nov 28 04:52:46 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:46 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:46 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 04:52:46 localhost ceph-mon[292954]: Reconfiguring crash.np0005538515 (monmap changed)...
Nov 28 04:52:46 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain
Nov 28 04:52:46 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:46 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:46 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Nov 28 04:52:46 localhost ceph-mon[292954]: mon.np0005538513@4(peon) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 28 04:52:46 localhost ceph-mon[292954]: mgrc update_daemon_metadata mon.np0005538513 metadata {addrs=[v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable),ceph_version_short=18.2.1-361.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005538513.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.6 (Plow),distro_version=9.6,hostname=np0005538513.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116612,os=Linux}
Nov 28 04:52:47 localhost ceph-mon[292954]: mon.np0005538513 calling monitor election
Nov 28 04:52:47 localhost ceph-mon[292954]: mon.np0005538513 calling monitor election
Nov 28 04:52:47 localhost ceph-mon[292954]: mon.np0005538511 calling monitor election
Nov 28 04:52:47 localhost ceph-mon[292954]: mon.np0005538515 calling monitor election
Nov 28 04:52:47 localhost ceph-mon[292954]: mon.np0005538514 calling monitor election
Nov 28 04:52:47 localhost ceph-mon[292954]: mon.np0005538512 calling monitor election
Nov 28 04:52:47 localhost ceph-mon[292954]: mon.np0005538512 is new leader, mons np0005538512,np0005538511,np0005538515,np0005538514,np0005538513 in quorum (ranks 0,1,2,3,4)
Nov 28 04:52:47 localhost ceph-mon[292954]: Health check cleared: MON_DOWN (was: 1/5 mons down, quorum np0005538512,np0005538511,np0005538515,np0005538514)
Nov 28 04:52:47 localhost ceph-mon[292954]: Cluster is now healthy
Nov 28 04:52:47 localhost ceph-mon[292954]: overall HEALTH_OK
Nov 28 04:52:47 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:47 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:47 localhost ceph-mon[292954]: Reconfiguring osd.4 (monmap changed)...
Nov 28 04:52:47 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Nov 28 04:52:47 localhost ceph-mon[292954]: Reconfiguring daemon osd.4 on np0005538515.localdomain
Nov 28 04:52:48 localhost openstack_network_exporter[240658]: ERROR 09:52:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 04:52:48 localhost openstack_network_exporter[240658]: ERROR 09:52:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 04:52:48 localhost openstack_network_exporter[240658]: ERROR 09:52:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 04:52:48 localhost openstack_network_exporter[240658]: ERROR 09:52:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 04:52:48 localhost openstack_network_exporter[240658]:
Nov 28 04:52:48 localhost openstack_network_exporter[240658]: ERROR 09:52:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 04:52:48 localhost openstack_network_exporter[240658]:
Nov 28 04:52:48 localhost nova_compute[279673]: 2025-11-28 09:52:48.092 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:52:48 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:48 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs'
Nov 28 04:52:48 localhost ceph-mon[292954]: Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)...
Nov 28 04:52:48 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 04:52:48 localhost ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain
Nov 28 04:52:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 04:52:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 04:52:49 localhost podman[293035]: 2025-11-28 09:52:49.854197348 +0000 UTC m=+0.086272982 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3)
Nov 28 04:52:49 localhost podman[293035]: 2025-11-28 09:52:49.866054746 +0000 UTC m=+0.098130370 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 04:52:49 localhost nova_compute[279673]: 2025-11-28 09:52:49.872 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:52:49 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 04:52:49 localhost systemd[1]: tmp-crun.KIUrbA.mount: Deactivated successfully.
Nov 28 04:52:49 localhost podman[293034]: 2025-11-28 09:52:49.963775523 +0000 UTC m=+0.198868071 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 04:52:49 localhost podman[293034]: 2025-11-28 09:52:49.973967589 +0000 UTC m=+0.209060137 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 04:52:49 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. Nov 28 04:52:49 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:49 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:49 localhost ceph-mon[292954]: Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)... 
Nov 28 04:52:49 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:52:49 localhost ceph-mon[292954]: Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain Nov 28 04:52:49 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:49 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:52:50.830 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:52:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:52:50.831 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:52:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:52:50.833 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:52:50 localhost ceph-mon[292954]: mon.np0005538513@4(peon) e9 handle_command mon_command({"prefix": "status", "format": "json"} v 0) Nov 28 04:52:50 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.200:0/438518273' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch Nov 28 04:52:51 localhost podman[293181]: 2025-11-28 09:52:51.035313366 +0000 UTC m=+0.105369266 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, ceph=True, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vcs-type=git, release=553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, architecture=x86_64, io.buildah.version=1.33.12, version=7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=) Nov 28 04:52:51 localhost podman[293181]: 2025-11-28 09:52:51.137963776 +0000 UTC m=+0.208019636 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, version=7, vendor=Red Hat, Inc., release=553, name=rhceph, maintainer=Guillaume 
Abrioux , build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_CLEAN=True, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.openshift.expose-services=, ceph=True) Nov 28 04:52:52 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:52 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:52 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:52 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:52 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:52 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:52 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:52 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:52 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:52 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:52 localhost ceph-mon[292954]: 
from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:52 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:52 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:52 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:52 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 04:52:53 localhost nova_compute[279673]: 2025-11-28 09:52:53.136 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:52:53 localhost ceph-mon[292954]: Reconfig service osd.default_drive_group Nov 28 04:52:53 localhost ceph-mon[292954]: Updating np0005538511.localdomain:/etc/ceph/ceph.conf Nov 28 04:52:53 localhost ceph-mon[292954]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf Nov 28 04:52:53 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf Nov 28 04:52:53 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf Nov 28 04:52:53 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf Nov 28 04:52:54 localhost ceph-mon[292954]: mon.np0005538513@4(peon).osd e86 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375 Nov 28 04:52:54 localhost ceph-mon[292954]: mon.np0005538513@4(peon).osd e86 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1 Nov 28 04:52:54 localhost ceph-mon[292954]: mon.np0005538513@4(peon).osd e87 e87: 6 total, 6 up, 6 in Nov 28 04:52:54 localhost systemd-logind[764]: Session 64 logged out. Waiting for processes to exit. 
Nov 28 04:52:54 localhost systemd[1]: session-64.scope: Deactivated successfully. Nov 28 04:52:54 localhost systemd[1]: session-64.scope: Consumed 29.579s CPU time. Nov 28 04:52:54 localhost systemd-logind[764]: Removed session 64. Nov 28 04:52:54 localhost ceph-mon[292954]: mon.np0005538513@4(peon).osd e87 _set_new_cache_sizes cache_size:1019519450 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:52:54 localhost sshd[293690]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:52:54 localhost systemd-logind[764]: New session 67 of user ceph-admin. Nov 28 04:52:54 localhost systemd[1]: Started Session 67 of User ceph-admin. Nov 28 04:52:54 localhost nova_compute[279673]: 2025-11-28 09:52:54.901 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:52:54 localhost ceph-mon[292954]: from='client.? 172.18.0.200:0/1216930330' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 28 04:52:54 localhost ceph-mon[292954]: Activating manager daemon np0005538514.djozup Nov 28 04:52:54 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:54 localhost ceph-mon[292954]: from='client.? 
172.18.0.200:0/1216930330' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Nov 28 04:52:54 localhost ceph-mon[292954]: from='mgr.14190 172.18.0.105:0/2825788323' entity='mgr.np0005538512.zyhkxs' Nov 28 04:52:54 localhost ceph-mon[292954]: Manager daemon np0005538514.djozup is now available Nov 28 04:52:54 localhost ceph-mon[292954]: removing stray HostCache host record np0005538510.localdomain.devices.0 Nov 28 04:52:54 localhost ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538510.localdomain.devices.0"} : dispatch Nov 28 04:52:54 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538510.localdomain.devices.0"} : dispatch Nov 28 04:52:54 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538510.localdomain.devices.0"}]': finished Nov 28 04:52:54 localhost ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538510.localdomain.devices.0"} : dispatch Nov 28 04:52:54 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538510.localdomain.devices.0"} : dispatch Nov 28 04:52:54 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538510.localdomain.devices.0"}]': finished Nov 28 04:52:54 localhost ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/mirror_snapshot_schedule"} : dispatch Nov 28 04:52:54 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' 
cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/mirror_snapshot_schedule"} : dispatch Nov 28 04:52:54 localhost ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/trash_purge_schedule"} : dispatch Nov 28 04:52:54 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/trash_purge_schedule"} : dispatch Nov 28 04:52:55 localhost podman[293799]: 2025-11-28 09:52:55.71800585 +0000 UTC m=+0.077776641 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, distribution-scope=public, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, release=553, RELEASE=main, ceph=True, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , vcs-type=git) Nov 28 04:52:55 localhost podman[293799]: 2025-11-28 09:52:55.828120671 +0000 UTC m=+0.187891482 container exec_died 
bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, GIT_BRANCH=main, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, RELEASE=main, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, name=rhceph, distribution-scope=public, CEPH_POINT_RELEASE=) Nov 28 04:52:56 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:52:56 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:52:56 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:52:57 localhost ceph-mon[292954]: [28/Nov/2025:09:52:55] ENGINE Bus STARTING Nov 28 04:52:57 localhost ceph-mon[292954]: [28/Nov/2025:09:52:56] ENGINE Serving on http://172.18.0.107:8765 Nov 28 04:52:57 localhost ceph-mon[292954]: [28/Nov/2025:09:52:56] ENGINE Serving on https://172.18.0.107:7150 Nov 28 04:52:57 localhost ceph-mon[292954]: [28/Nov/2025:09:52:56] ENGINE Bus STARTED Nov 28 04:52:57 localhost ceph-mon[292954]: [28/Nov/2025:09:52:56] ENGINE Client ('172.18.0.107', 40776) lost — peer 
dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Nov 28 04:52:57 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:52:57 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:52:57 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:52:57 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:52:57 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:52:57 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:52:57 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:52:58 localhost nova_compute[279673]: 2025-11-28 09:52:58.141 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:52:58 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:52:58 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:52:58 localhost ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Nov 28 04:52:58 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Nov 28 04:52:58 localhost ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Nov 28 04:52:58 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Nov 28 04:52:58 localhost ceph-mon[292954]: 
from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:52:58 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:52:58 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:52:58 localhost ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd/host:np0005538512", "name": "osd_memory_target"} : dispatch Nov 28 04:52:58 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd/host:np0005538512", "name": "osd_memory_target"} : dispatch Nov 28 04:52:58 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:52:58 localhost ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd/host:np0005538511", "name": "osd_memory_target"} : dispatch Nov 28 04:52:58 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd/host:np0005538511", "name": "osd_memory_target"} : dispatch Nov 28 04:52:58 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:52:58 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:52:58 localhost ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Nov 28 04:52:58 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Nov 28 04:52:58 localhost ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Nov 28 04:52:58 localhost ceph-mon[292954]: from='mgr.17142 
' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Nov 28 04:52:58 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:52:58 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:52:58 localhost ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Nov 28 04:52:58 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Nov 28 04:52:58 localhost ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Nov 28 04:52:58 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Nov 28 04:52:58 localhost ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 04:52:59 localhost ceph-mon[292954]: Adjusting osd_memory_target on np0005538514.localdomain to 836.6M Nov 28 04:52:59 localhost ceph-mon[292954]: Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 28 04:52:59 localhost ceph-mon[292954]: Adjusting osd_memory_target on np0005538515.localdomain to 836.6M Nov 28 04:52:59 localhost ceph-mon[292954]: Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 28 04:52:59 localhost ceph-mon[292954]: Adjusting osd_memory_target on np0005538513.localdomain to 836.6M Nov 28 04:52:59 localhost ceph-mon[292954]: 
Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 28 04:52:59 localhost ceph-mon[292954]: Updating np0005538511.localdomain:/etc/ceph/ceph.conf Nov 28 04:52:59 localhost ceph-mon[292954]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf Nov 28 04:52:59 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf Nov 28 04:52:59 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf Nov 28 04:52:59 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf Nov 28 04:52:59 localhost ceph-mon[292954]: mon.np0005538513@4(peon).osd e87 _set_new_cache_sizes cache_size:1020040748 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:52:59 localhost nova_compute[279673]: 2025-11-28 09:52:59.904 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:53:00 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:53:00 localhost ceph-mon[292954]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:53:00 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:53:00 localhost ceph-mon[292954]: Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:53:00 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:53:00 localhost ceph-mon[292954]: Updating np0005538512.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:53:00 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:53:00 
localhost ceph-mon[292954]: Updating np0005538511.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:53:00 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:53:01 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:53:01 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:53:01 localhost ceph-mon[292954]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:53:01 localhost ceph-mon[292954]: Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:53:01 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:53:01 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:53:01 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:01 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:01 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:01 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:01 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:01 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:01 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:01 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:01 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 
04:53:01 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:01 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:01 localhost ceph-mon[292954]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON) Nov 28 04:53:01 localhost ceph-mon[292954]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST) Nov 28 04:53:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 04:53:01 localhost podman[294698]: 2025-11-28 09:53:01.574330248 +0000 UTC m=+0.093222378 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat 
Universal Base Image 9., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, architecture=x86_64, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible) Nov 28 04:53:01 localhost podman[294698]: 2025-11-28 09:53:01.620418226 +0000 UTC m=+0.139310386 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, distribution-scope=public, io.buildah.version=1.33.7, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, release=1755695350) Nov 28 04:53:01 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. Nov 28 04:53:02 localhost ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538511.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:53:02 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538511.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:53:03 localhost nova_compute[279673]: 2025-11-28 09:53:03.143 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:53:03 localhost ceph-mon[292954]: Reconfiguring crash.np0005538511 (monmap changed)... 
Nov 28 04:53:03 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538511 on np0005538511.localdomain
Nov 28 04:53:03 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup'
Nov 28 04:53:03 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup'
Nov 28 04:53:03 localhost ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538511.fvuybw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 04:53:03 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538511.fvuybw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 04:53:03 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup'
Nov 28 04:53:03 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup'
Nov 28 04:53:03 localhost ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 04:53:03 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 04:53:04 localhost ceph-mon[292954]: mon.np0005538513@4(peon).osd e87 _set_new_cache_sizes cache_size:1020054355 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 04:53:04 localhost ceph-mon[292954]: Reconfiguring mgr.np0005538511.fvuybw (monmap changed)...
Nov 28 04:53:04 localhost ceph-mon[292954]: Reconfiguring daemon mgr.np0005538511.fvuybw on np0005538511.localdomain
Nov 28 04:53:04 localhost ceph-mon[292954]: Reconfiguring mgr.np0005538512.zyhkxs (monmap changed)...
Nov 28 04:53:04 localhost ceph-mon[292954]: Reconfiguring daemon mgr.np0005538512.zyhkxs on np0005538512.localdomain
Nov 28 04:53:04 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup'
Nov 28 04:53:04 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup'
Nov 28 04:53:04 localhost ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 04:53:04 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 04:53:04 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup'
Nov 28 04:53:04 localhost nova_compute[279673]: 2025-11-28 09:53:04.928 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:53:05 localhost ceph-mon[292954]: Reconfiguring crash.np0005538512 (monmap changed)...
Nov 28 04:53:05 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 04:53:05 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup'
Nov 28 04:53:05 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup'
Nov 28 04:53:05 localhost ceph-mon[292954]: Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 04:53:05 localhost ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:53:05 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:53:05 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain Nov 28 04:53:05 localhost podman[294773]: Nov 28 04:53:05 localhost podman[294773]: 2025-11-28 09:53:05.860239403 +0000 UTC m=+0.095351775 container create db931d5ba21eeabd950812d709b1c23c6bd4df8f66693bea362abecf1aaf3285 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_chaplygin, GIT_CLEAN=True, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , name=rhceph, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, architecture=x86_64, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, CEPH_POINT_RELEASE=, vcs-type=git, release=553, io.buildah.version=1.33.12) Nov 28 04:53:05 localhost systemd[1]: 
Started libpod-conmon-db931d5ba21eeabd950812d709b1c23c6bd4df8f66693bea362abecf1aaf3285.scope. Nov 28 04:53:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 04:53:05 localhost podman[294773]: 2025-11-28 09:53:05.818128648 +0000 UTC m=+0.053240990 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:53:05 localhost systemd[1]: Started libcrun container. Nov 28 04:53:05 localhost podman[294773]: 2025-11-28 09:53:05.945370689 +0000 UTC m=+0.180483011 container init db931d5ba21eeabd950812d709b1c23c6bd4df8f66693bea362abecf1aaf3285 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_chaplygin, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_CLEAN=True, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, ceph=True, distribution-scope=public, vcs-type=git, release=553) Nov 28 04:53:05 localhost podman[294773]: 2025-11-28 09:53:05.957130164 +0000 UTC m=+0.192242476 container start db931d5ba21eeabd950812d709b1c23c6bd4df8f66693bea362abecf1aaf3285 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_chaplygin, 
vcs-type=git, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, GIT_CLEAN=True, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , version=7, distribution-scope=public, release=553, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, io.openshift.tags=rhceph ceph, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7) Nov 28 04:53:05 localhost podman[294773]: 2025-11-28 09:53:05.957394412 +0000 UTC m=+0.192506724 container attach db931d5ba21eeabd950812d709b1c23c6bd4df8f66693bea362abecf1aaf3285 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_chaplygin, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, name=rhceph, ceph=True, maintainer=Guillaume Abrioux , release=553, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, 
GIT_BRANCH=main, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc.) Nov 28 04:53:05 localhost naughty_chaplygin[294788]: 167 167 Nov 28 04:53:05 localhost systemd[1]: libpod-db931d5ba21eeabd950812d709b1c23c6bd4df8f66693bea362abecf1aaf3285.scope: Deactivated successfully. Nov 28 04:53:05 localhost podman[294773]: 2025-11-28 09:53:05.964827362 +0000 UTC m=+0.199939674 container died db931d5ba21eeabd950812d709b1c23c6bd4df8f66693bea362abecf1aaf3285 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_chaplygin, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, version=7, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.openshift.expose-services=, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, release=553, vendor=Red Hat, Inc., architecture=x86_64, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 28 04:53:06 localhost podman[294789]: 2025-11-28 09:53:06.034146429 +0000 UTC m=+0.109850153 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 28 04:53:06 localhost podman[294804]: 2025-11-28 09:53:06.082455146 +0000 UTC m=+0.106364266 container remove db931d5ba21eeabd950812d709b1c23c6bd4df8f66693bea362abecf1aaf3285 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_chaplygin, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, RELEASE=main, CEPH_POINT_RELEASE=, architecture=x86_64, release=553, ceph=True, version=7, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, name=rhceph, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., io.buildah.version=1.33.12, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 28 04:53:06 localhost systemd[1]: libpod-conmon-db931d5ba21eeabd950812d709b1c23c6bd4df8f66693bea362abecf1aaf3285.scope: Deactivated successfully. Nov 28 04:53:06 localhost podman[294789]: 2025-11-28 09:53:06.124497188 +0000 UTC m=+0.200200942 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 28 04:53:06 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 04:53:06 localhost systemd[1]: var-lib-containers-storage-overlay-68ef4e00ce326bb2ea1f7906d2df4a8e9fc1c97e404c1c9736f5f7a0bf556f1b-merged.mount: Deactivated successfully. 
Nov 28 04:53:06 localhost podman[294884]: Nov 28 04:53:06 localhost podman[294884]: 2025-11-28 09:53:06.927534984 +0000 UTC m=+0.089061260 container create 48a2fd060f1d09c2c03ef659a8d747f2a55c74f1cec2a6eacd0ba1ffec9424cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_clarke, vcs-type=git, io.buildah.version=1.33.12, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.openshift.expose-services=, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, release=553, com.redhat.component=rhceph-container, version=7, architecture=x86_64, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, RELEASE=main, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True) Nov 28 04:53:06 localhost systemd[1]: Started libpod-conmon-48a2fd060f1d09c2c03ef659a8d747f2a55c74f1cec2a6eacd0ba1ffec9424cd.scope. Nov 28 04:53:06 localhost podman[294884]: 2025-11-28 09:53:06.892460087 +0000 UTC m=+0.053986383 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:53:07 localhost systemd[1]: Started libcrun container. 
Nov 28 04:53:07 localhost podman[294884]: 2025-11-28 09:53:07.027462149 +0000 UTC m=+0.188988425 container init 48a2fd060f1d09c2c03ef659a8d747f2a55c74f1cec2a6eacd0ba1ffec9424cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_clarke, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, CEPH_POINT_RELEASE=, vcs-type=git, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, GIT_CLEAN=True, release=553, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 28 04:53:07 localhost podman[294884]: 2025-11-28 09:53:07.038174941 +0000 UTC m=+0.199701217 container start 48a2fd060f1d09c2c03ef659a8d747f2a55c74f1cec2a6eacd0ba1ffec9424cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_clarke, name=rhceph, GIT_BRANCH=main, vendor=Red Hat, Inc., distribution-scope=public, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, release=553, ceph=True, version=7, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12) Nov 28 04:53:07 localhost podman[294884]: 2025-11-28 09:53:07.038546833 +0000 UTC m=+0.200073169 container attach 48a2fd060f1d09c2c03ef659a8d747f2a55c74f1cec2a6eacd0ba1ffec9424cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_clarke, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, com.redhat.component=rhceph-container, version=7, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, GIT_CLEAN=True, release=553, io.openshift.expose-services=, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, distribution-scope=public, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 28 04:53:07 localhost blissful_clarke[294899]: 167 167 Nov 28 04:53:07 localhost systemd[1]: 
libpod-48a2fd060f1d09c2c03ef659a8d747f2a55c74f1cec2a6eacd0ba1ffec9424cd.scope: Deactivated successfully. Nov 28 04:53:07 localhost podman[294884]: 2025-11-28 09:53:07.044172877 +0000 UTC m=+0.205699173 container died 48a2fd060f1d09c2c03ef659a8d747f2a55c74f1cec2a6eacd0ba1ffec9424cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_clarke, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, GIT_CLEAN=True, com.redhat.component=rhceph-container, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, release=553, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, GIT_BRANCH=main) Nov 28 04:53:07 localhost podman[294904]: 2025-11-28 09:53:07.153946527 +0000 UTC m=+0.091245798 container remove 48a2fd060f1d09c2c03ef659a8d747f2a55c74f1cec2a6eacd0ba1ffec9424cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_clarke, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, distribution-scope=public, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, architecture=x86_64, version=7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, name=rhceph, com.redhat.component=rhceph-container, io.openshift.expose-services=, RELEASE=main, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 28 04:53:07 localhost systemd[1]: libpod-conmon-48a2fd060f1d09c2c03ef659a8d747f2a55c74f1cec2a6eacd0ba1ffec9424cd.scope: Deactivated successfully. Nov 28 04:53:07 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:07 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:07 localhost ceph-mon[292954]: Reconfiguring osd.2 (monmap changed)... Nov 28 04:53:07 localhost ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Nov 28 04:53:07 localhost ceph-mon[292954]: Reconfiguring daemon osd.2 on np0005538513.localdomain Nov 28 04:53:07 localhost systemd[1]: var-lib-containers-storage-overlay-5034f1fed43bbc6712c686731f1f23a696002f82c4d98fba781293102e0388ff-merged.mount: Deactivated successfully. 
Nov 28 04:53:08 localhost podman[294981]: Nov 28 04:53:08 localhost podman[294981]: 2025-11-28 09:53:08.093972726 +0000 UTC m=+0.091467694 container create f6be8b90b0e8a48745edaf2e8ac9d5fae6084777353c680abb82bf261065f4cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_edison, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, release=553, vcs-type=git, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, RELEASE=main, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , GIT_BRANCH=main, distribution-scope=public, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True) Nov 28 04:53:08 localhost systemd[1]: Started libpod-conmon-f6be8b90b0e8a48745edaf2e8ac9d5fae6084777353c680abb82bf261065f4cb.scope. Nov 28 04:53:08 localhost systemd[1]: Started libcrun container. 
Nov 28 04:53:08 localhost nova_compute[279673]: 2025-11-28 09:53:08.150 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:53:08 localhost podman[294981]: 2025-11-28 09:53:08.053302056 +0000 UTC m=+0.050797094 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:53:08 localhost podman[294981]: 2025-11-28 09:53:08.157005789 +0000 UTC m=+0.154500787 container init f6be8b90b0e8a48745edaf2e8ac9d5fae6084777353c680abb82bf261065f4cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_edison, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, GIT_CLEAN=True, ceph=True, io.openshift.expose-services=, GIT_BRANCH=main, RELEASE=main, name=rhceph, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, io.openshift.tags=rhceph ceph, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux ) Nov 28 04:53:08 localhost podman[294981]: 2025-11-28 09:53:08.167131772 +0000 UTC m=+0.164626800 container start f6be8b90b0e8a48745edaf2e8ac9d5fae6084777353c680abb82bf261065f4cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_edison, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , GIT_CLEAN=True, ceph=True, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, name=rhceph, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, release=553, version=7, io.buildah.version=1.33.12) Nov 28 04:53:08 localhost podman[294981]: 2025-11-28 09:53:08.167496474 +0000 UTC m=+0.164991512 container attach f6be8b90b0e8a48745edaf2e8ac9d5fae6084777353c680abb82bf261065f4cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_edison, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, release=553, vcs-type=git, version=7, distribution-scope=public, ceph=True, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, 
GIT_BRANCH=main, GIT_CLEAN=True, architecture=x86_64, CEPH_POINT_RELEASE=, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7) Nov 28 04:53:08 localhost confident_edison[294996]: 167 167 Nov 28 04:53:08 localhost systemd[1]: libpod-f6be8b90b0e8a48745edaf2e8ac9d5fae6084777353c680abb82bf261065f4cb.scope: Deactivated successfully. Nov 28 04:53:08 localhost podman[294981]: 2025-11-28 09:53:08.171420245 +0000 UTC m=+0.168915233 container died f6be8b90b0e8a48745edaf2e8ac9d5fae6084777353c680abb82bf261065f4cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_edison, io.buildah.version=1.33.12, version=7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, name=rhceph, vendor=Red Hat, Inc., ceph=True, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, com.redhat.component=rhceph-container, distribution-scope=public, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=) Nov 28 04:53:08 localhost podman[295001]: 2025-11-28 09:53:08.269120892 +0000 UTC m=+0.087589604 container remove f6be8b90b0e8a48745edaf2e8ac9d5fae6084777353c680abb82bf261065f4cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_edison, com.redhat.component=rhceph-container, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, RELEASE=main, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, vcs-type=git, vendor=Red Hat, Inc.) Nov 28 04:53:08 localhost systemd[1]: libpod-conmon-f6be8b90b0e8a48745edaf2e8ac9d5fae6084777353c680abb82bf261065f4cb.scope: Deactivated successfully. Nov 28 04:53:08 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:08 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:08 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:08 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:08 localhost ceph-mon[292954]: Reconfiguring osd.5 (monmap changed)... 
Nov 28 04:53:08 localhost ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Nov 28 04:53:08 localhost ceph-mon[292954]: Reconfiguring daemon osd.5 on np0005538513.localdomain Nov 28 04:53:08 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:08 localhost systemd[1]: var-lib-containers-storage-overlay-d4182eb10206f7f6ec0d226e6b68a747a63dd49045a7946fb9aeffeaa9affe85-merged.mount: Deactivated successfully. Nov 28 04:53:09 localhost podman[295077]: Nov 28 04:53:09 localhost podman[295077]: 2025-11-28 09:53:09.24915852 +0000 UTC m=+0.084162338 container create c31ccd1e5f47c4e67f6cf8225cf3cbb75821f4d3f06909a920088350b66a73af (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_clarke, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, distribution-scope=public, vcs-type=git, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, version=7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, name=rhceph, maintainer=Guillaume Abrioux , GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 28 04:53:09 localhost systemd[1]: Started libpod-conmon-c31ccd1e5f47c4e67f6cf8225cf3cbb75821f4d3f06909a920088350b66a73af.scope. 
Nov 28 04:53:09 localhost podman[295077]: 2025-11-28 09:53:09.21334576 +0000 UTC m=+0.048349628 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:53:09 localhost systemd[1]: Started libcrun container. Nov 28 04:53:09 localhost podman[295077]: 2025-11-28 09:53:09.328884499 +0000 UTC m=+0.163888327 container init c31ccd1e5f47c4e67f6cf8225cf3cbb75821f4d3f06909a920088350b66a73af (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_clarke, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, name=rhceph, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, version=7, CEPH_POINT_RELEASE=, ceph=True, com.redhat.component=rhceph-container) Nov 28 04:53:09 localhost podman[295077]: 2025-11-28 09:53:09.340060075 +0000 UTC m=+0.175063943 container start c31ccd1e5f47c4e67f6cf8225cf3cbb75821f4d3f06909a920088350b66a73af (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_clarke, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, build-date=2025-09-24T08:57:55, release=553, io.buildah.version=1.33.12, vendor=Red Hat, Inc., 
io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, ceph=True, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_CLEAN=True, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, maintainer=Guillaume Abrioux , GIT_BRANCH=main, vcs-type=git, name=rhceph) Nov 28 04:53:09 localhost podman[295077]: 2025-11-28 09:53:09.341186051 +0000 UTC m=+0.176189929 container attach c31ccd1e5f47c4e67f6cf8225cf3cbb75821f4d3f06909a920088350b66a73af (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_clarke, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, distribution-scope=public, io.openshift.tags=rhceph ceph, RELEASE=main, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , release=553, description=Red Hat Ceph 
Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 28 04:53:09 localhost distracted_clarke[295092]: 167 167 Nov 28 04:53:09 localhost systemd[1]: libpod-c31ccd1e5f47c4e67f6cf8225cf3cbb75821f4d3f06909a920088350b66a73af.scope: Deactivated successfully. Nov 28 04:53:09 localhost podman[295077]: 2025-11-28 09:53:09.344589216 +0000 UTC m=+0.179593124 container died c31ccd1e5f47c4e67f6cf8225cf3cbb75821f4d3f06909a920088350b66a73af (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_clarke, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., RELEASE=main, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, version=7, architecture=x86_64, GIT_CLEAN=True, io.buildah.version=1.33.12, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph) Nov 28 04:53:09 localhost ceph-mon[292954]: mon.np0005538513@4(peon).osd e87 _set_new_cache_sizes cache_size:1020054721 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:53:09 localhost ceph-mon[292954]: Saving service mon spec with placement label:mon Nov 28 04:53:09 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:09 localhost ceph-mon[292954]: from='mgr.17142 ' 
entity='mgr.np0005538514.djozup' Nov 28 04:53:09 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:09 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:09 localhost ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:53:09 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:53:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:53:09 localhost podman[295097]: 2025-11-28 09:53:09.446691648 +0000 UTC m=+0.093759034 container remove c31ccd1e5f47c4e67f6cf8225cf3cbb75821f4d3f06909a920088350b66a73af (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_clarke, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_CLEAN=True, com.redhat.component=rhceph-container, ceph=True, release=553, distribution-scope=public, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, version=7, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 28 04:53:09 localhost systemd[1]: libpod-conmon-c31ccd1e5f47c4e67f6cf8225cf3cbb75821f4d3f06909a920088350b66a73af.scope: Deactivated successfully. Nov 28 04:53:09 localhost podman[295112]: 2025-11-28 09:53:09.563640311 +0000 UTC m=+0.103874898 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 28 04:53:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:53:09 localhost podman[295112]: 2025-11-28 09:53:09.598505241 +0000 UTC m=+0.138739828 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 04:53:09 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:53:09 localhost podman[295132]: 2025-11-28 09:53:09.696522967 +0000 UTC m=+0.095720356 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 28 04:53:09 localhost podman[295132]: 2025-11-28 09:53:09.770799419 +0000 UTC m=+0.169996798 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 28 04:53:09 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. Nov 28 04:53:09 localhost systemd[1]: var-lib-containers-storage-overlay-5ea726546635a277b6fee427c926bf381607320f22e8d5a1b8aa56e1cdc6a33d-merged.mount: Deactivated successfully. 
Nov 28 04:53:09 localhost nova_compute[279673]: 2025-11-28 09:53:09.929 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:53:10 localhost podman[238687]: time="2025-11-28T09:53:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 04:53:10 localhost podman[238687]: @ - - [28/Nov/2025:09:53:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 28 04:53:10 localhost podman[238687]: @ - - [28/Nov/2025:09:53:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18716 "" "Go-http-client/1.1" Nov 28 04:53:10 localhost podman[295211]: Nov 28 04:53:10 localhost podman[295211]: 2025-11-28 09:53:10.274417869 +0000 UTC m=+0.093618891 container create fc9e2d5abd2267ba769fc12257a666518a2d5020f2685105e8ed725192fbc1f6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_ride, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_CLEAN=True, com.redhat.component=rhceph-container, architecture=x86_64, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, release=553, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_BRANCH=main, distribution-scope=public, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., 
io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12) Nov 28 04:53:10 localhost systemd[1]: Started libpod-conmon-fc9e2d5abd2267ba769fc12257a666518a2d5020f2685105e8ed725192fbc1f6.scope. Nov 28 04:53:10 localhost podman[295211]: 2025-11-28 09:53:10.235906646 +0000 UTC m=+0.055107698 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:53:10 localhost systemd[1]: Started libcrun container. Nov 28 04:53:10 localhost podman[295211]: 2025-11-28 09:53:10.362003742 +0000 UTC m=+0.181204764 container init fc9e2d5abd2267ba769fc12257a666518a2d5020f2685105e8ed725192fbc1f6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_ride, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, vcs-type=git, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, name=rhceph, RELEASE=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7) Nov 28 04:53:10 localhost podman[295211]: 2025-11-28 09:53:10.371256298 +0000 UTC m=+0.190457350 container start fc9e2d5abd2267ba769fc12257a666518a2d5020f2685105e8ed725192fbc1f6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_ride, 
GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vendor=Red Hat, Inc., GIT_BRANCH=main, name=rhceph, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, architecture=x86_64, ceph=True, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_CLEAN=True, version=7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git) Nov 28 04:53:10 localhost podman[295211]: 2025-11-28 09:53:10.371484175 +0000 UTC m=+0.190685197 container attach fc9e2d5abd2267ba769fc12257a666518a2d5020f2685105e8ed725192fbc1f6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_ride, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, maintainer=Guillaume Abrioux , vcs-type=git, version=7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, name=rhceph, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, com.redhat.component=rhceph-container, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, release=553, io.buildah.version=1.33.12, ceph=True, CEPH_POINT_RELEASE=) Nov 28 04:53:10 localhost sweet_ride[295228]: 167 167 Nov 28 04:53:10 localhost systemd[1]: libpod-fc9e2d5abd2267ba769fc12257a666518a2d5020f2685105e8ed725192fbc1f6.scope: Deactivated successfully. Nov 28 04:53:10 localhost podman[295211]: 2025-11-28 09:53:10.378242105 +0000 UTC m=+0.197443157 container died fc9e2d5abd2267ba769fc12257a666518a2d5020f2685105e8ed725192fbc1f6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_ride, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, RELEASE=main, ceph=True, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, release=553, vcs-type=git, io.openshift.expose-services=, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True) Nov 28 04:53:10 localhost ceph-mon[292954]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)... 
Nov 28 04:53:10 localhost ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain Nov 28 04:53:10 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:10 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:10 localhost ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:53:10 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:53:10 localhost podman[295233]: 2025-11-28 09:53:10.482715621 +0000 UTC m=+0.092620820 container remove fc9e2d5abd2267ba769fc12257a666518a2d5020f2685105e8ed725192fbc1f6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_ride, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, architecture=x86_64, name=rhceph, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, RELEASE=main, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, ceph=True, com.redhat.component=rhceph-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, 
GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 28 04:53:10 localhost systemd[1]: libpod-conmon-fc9e2d5abd2267ba769fc12257a666518a2d5020f2685105e8ed725192fbc1f6.scope: Deactivated successfully. Nov 28 04:53:10 localhost systemd[1]: var-lib-containers-storage-overlay-3d0caea60d6719bb0da89d6e5067c0e3539c804930c77fcb1727715237e8de84-merged.mount: Deactivated successfully. Nov 28 04:53:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 04:53:11 localhost podman[295300]: 2025-11-28 09:53:11.324047552 +0000 UTC m=+0.105841259 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute) Nov 28 04:53:11 localhost podman[295308]: Nov 28 04:53:11 localhost podman[295308]: 2025-11-28 09:53:11.339296105 +0000 UTC m=+0.095036085 container create 256b114422cb5e0745162aee5d5dbaccc54894c8791b369eb38d47243d47a3d0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_lewin, build-date=2025-09-24T08:57:55, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, version=7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.buildah.version=1.33.12) Nov 28 04:53:11 localhost podman[295300]: 2025-11-28 09:53:11.364064603 +0000 UTC m=+0.145858290 container exec_died 
1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:53:11 localhost systemd[1]: Started libpod-conmon-256b114422cb5e0745162aee5d5dbaccc54894c8791b369eb38d47243d47a3d0.scope. Nov 28 04:53:11 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 04:53:11 localhost podman[295308]: 2025-11-28 09:53:11.29879709 +0000 UTC m=+0.054537120 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:53:11 localhost systemd[1]: Started libcrun container. Nov 28 04:53:11 localhost podman[295308]: 2025-11-28 09:53:11.451637135 +0000 UTC m=+0.207377115 container init 256b114422cb5e0745162aee5d5dbaccc54894c8791b369eb38d47243d47a3d0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_lewin, GIT_CLEAN=True, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, release=553, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, distribution-scope=public, version=7, com.redhat.component=rhceph-container, ceph=True, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, architecture=x86_64) Nov 28 04:53:11 localhost ceph-mon[292954]: Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)... 
Nov 28 04:53:11 localhost ceph-mon[292954]: Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain Nov 28 04:53:11 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:11 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:11 localhost ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 28 04:53:11 localhost systemd[1]: libpod-256b114422cb5e0745162aee5d5dbaccc54894c8791b369eb38d47243d47a3d0.scope: Deactivated successfully. Nov 28 04:53:11 localhost hopeful_lewin[295337]: 167 167 Nov 28 04:53:11 localhost podman[295308]: 2025-11-28 09:53:11.483316156 +0000 UTC m=+0.239056146 container start 256b114422cb5e0745162aee5d5dbaccc54894c8791b369eb38d47243d47a3d0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_lewin, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, version=7) Nov 28 04:53:11 localhost podman[295308]: 
2025-11-28 09:53:11.483963286 +0000 UTC m=+0.239703286 container attach 256b114422cb5e0745162aee5d5dbaccc54894c8791b369eb38d47243d47a3d0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_lewin, distribution-scope=public, vcs-type=git, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, release=553, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, io.openshift.expose-services=, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 28 04:53:11 localhost podman[295308]: 2025-11-28 09:53:11.490454888 +0000 UTC m=+0.246194898 container died 256b114422cb5e0745162aee5d5dbaccc54894c8791b369eb38d47243d47a3d0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_lewin, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, release=553, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, version=7, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, ceph=True, name=rhceph, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git) Nov 28 04:53:11 localhost podman[295342]: 2025-11-28 09:53:11.583610443 +0000 UTC m=+0.098356148 container remove 256b114422cb5e0745162aee5d5dbaccc54894c8791b369eb38d47243d47a3d0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_lewin, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, RELEASE=main, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, architecture=x86_64, io.openshift.expose-services=, GIT_BRANCH=main, ceph=True, version=7, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, GIT_CLEAN=True, vcs-type=git, io.buildah.version=1.33.12) Nov 28 04:53:11 localhost systemd[1]: libpod-conmon-256b114422cb5e0745162aee5d5dbaccc54894c8791b369eb38d47243d47a3d0.scope: Deactivated successfully. 
Nov 28 04:53:11 localhost systemd[1]: var-lib-containers-storage-overlay-f4cfdb30d543e6f8d1b36776da3435febeefc1b8f2a8f2002a4ea3d127bd1268-merged.mount: Deactivated successfully. Nov 28 04:53:12 localhost ceph-mon[292954]: Reconfiguring mon.np0005538513 (monmap changed)... Nov 28 04:53:12 localhost ceph-mon[292954]: Reconfiguring daemon mon.np0005538513 on np0005538513.localdomain Nov 28 04:53:12 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:12 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:12 localhost ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:53:12 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:53:13 localhost nova_compute[279673]: 2025-11-28 09:53:13.173 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:53:13 localhost ceph-mon[292954]: Reconfiguring crash.np0005538514 (monmap changed)... 
Nov 28 04:53:13 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain Nov 28 04:53:13 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:13 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:13 localhost ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Nov 28 04:53:13 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0. Nov 28 04:53:13 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:13.483383) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 28 04:53:13 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13 Nov 28 04:53:13 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323593483550, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 11496, "num_deletes": 257, "total_data_size": 19523211, "memory_usage": 20414488, "flush_reason": "Manual Compaction"} Nov 28 04:53:13 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started Nov 28 04:53:13 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323593580596, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 14818754, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 11501, "table_properties": {"data_size": 14760009, "index_size": 31695, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, 
"filter_size": 25989, "raw_key_size": 274264, "raw_average_key_size": 26, "raw_value_size": 14583379, "raw_average_value_size": 1404, "num_data_blocks": 1216, "num_entries": 10383, "num_filter_entries": 10383, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 1764323559, "file_creation_time": 1764323593, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}} Nov 28 04:53:13 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 97302 microseconds, and 31725 cpu microseconds. 
Nov 28 04:53:13 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:13.580693) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 14818754 bytes OK Nov 28 04:53:13 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:13.580736) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started Nov 28 04:53:13 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:13.582572) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done Nov 28 04:53:13 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:13.582597) EVENT_LOG_v1 {"time_micros": 1764323593582590, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0} Nov 28 04:53:13 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:13.582626) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50 Nov 28 04:53:13 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 19446154, prev total WAL file size 19446154, number of live WAL files 2. Nov 28 04:53:13 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 04:53:13 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:13.586633) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130323931' seq:72057594037927935, type:22 .. 
'7061786F73003130353433' seq:0, type:0; will stop at (end) Nov 28 04:53:13 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00 Nov 28 04:53:13 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(14MB) 8(2012B)] Nov 28 04:53:13 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323593586787, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 14820766, "oldest_snapshot_seqno": -1} Nov 28 04:53:13 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 10132 keys, 14815360 bytes, temperature: kUnknown Nov 28 04:53:13 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323593701307, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 14815360, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14757320, "index_size": 31635, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25349, "raw_key_size": 269421, "raw_average_key_size": 26, "raw_value_size": 14584034, "raw_average_value_size": 1439, "num_data_blocks": 1215, "num_entries": 10132, "num_filter_entries": 10132, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; 
zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764323593, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}} Nov 28 04:53:13 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 28 04:53:13 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:13.701957) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 14815360 bytes Nov 28 04:53:13 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:13.703976) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 129.0 rd, 129.0 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(14.1, 0.0 +0.0 blob) out(14.1 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 10388, records dropped: 256 output_compression: NoCompression Nov 28 04:53:13 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:13.704010) EVENT_LOG_v1 {"time_micros": 1764323593703994, "job": 4, "event": "compaction_finished", "compaction_time_micros": 114858, "compaction_time_cpu_micros": 42264, "output_level": 6, "num_output_files": 1, "total_output_size": 14815360, "num_input_records": 10388, "num_output_records": 10132, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 28 04:53:13 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000014.sst 
immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 04:53:13 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323593707689, "job": 4, "event": "table_file_deletion", "file_number": 14} Nov 28 04:53:13 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 04:53:13 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323593707968, "job": 4, "event": "table_file_deletion", "file_number": 8} Nov 28 04:53:13 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:13.586432) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:53:14 localhost ceph-mon[292954]: mon.np0005538513@4(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:53:14 localhost ceph-mon[292954]: Reconfiguring osd.0 (monmap changed)... Nov 28 04:53:14 localhost ceph-mon[292954]: Reconfiguring daemon osd.0 on np0005538514.localdomain Nov 28 04:53:14 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:14 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:14 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:14 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:14 localhost ceph-mon[292954]: Reconfiguring osd.3 (monmap changed)... 
Nov 28 04:53:14 localhost ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Nov 28 04:53:14 localhost ceph-mon[292954]: Reconfiguring daemon osd.3 on np0005538514.localdomain Nov 28 04:53:14 localhost nova_compute[279673]: 2025-11-28 09:53:14.966 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:53:15 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:15 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:15 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:15 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:15 localhost ceph-mon[292954]: Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)... Nov 28 04:53:15 localhost ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:53:15 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:53:15 localhost ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain Nov 28 04:53:15 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:15 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:15 localhost ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", 
"entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:53:15 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:53:16 localhost ceph-mon[292954]: Reconfiguring mgr.np0005538514.djozup (monmap changed)... Nov 28 04:53:16 localhost ceph-mon[292954]: Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain Nov 28 04:53:16 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:16 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:16 localhost ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 28 04:53:17 localhost ceph-mon[292954]: Reconfiguring mon.np0005538514 (monmap changed)... Nov 28 04:53:17 localhost ceph-mon[292954]: Reconfiguring daemon mon.np0005538514 on np0005538514.localdomain Nov 28 04:53:17 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:17 localhost ceph-mon[292954]: from='mgr.17142 ' entity='mgr.np0005538514.djozup' Nov 28 04:53:17 localhost ceph-mon[292954]: from='mgr.17142 172.18.0.107:0/992118114' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Nov 28 04:53:17 localhost ceph-mon[292954]: mon.np0005538513@4(peon).osd e88 e88: 6 total, 6 up, 6 in Nov 28 04:53:18 localhost systemd[1]: session-67.scope: Deactivated successfully. Nov 28 04:53:18 localhost systemd[1]: session-67.scope: Consumed 11.737s CPU time. Nov 28 04:53:18 localhost systemd-logind[764]: Session 67 logged out. Waiting for processes to exit. Nov 28 04:53:18 localhost systemd-logind[764]: Removed session 67. 
Nov 28 04:53:18 localhost openstack_network_exporter[240658]: ERROR 09:53:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 04:53:18 localhost openstack_network_exporter[240658]: ERROR 09:53:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:53:18 localhost openstack_network_exporter[240658]: ERROR 09:53:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:53:18 localhost openstack_network_exporter[240658]: ERROR 09:53:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 04:53:18 localhost openstack_network_exporter[240658]: Nov 28 04:53:18 localhost openstack_network_exporter[240658]: ERROR 09:53:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 04:53:18 localhost openstack_network_exporter[240658]: Nov 28 04:53:18 localhost nova_compute[279673]: 2025-11-28 09:53:18.175 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:53:18 localhost sshd[295359]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:53:18 localhost systemd-logind[764]: New session 68 of user ceph-admin. Nov 28 04:53:18 localhost systemd[1]: Started Session 68 of User ceph-admin. Nov 28 04:53:18 localhost ceph-mon[292954]: from='client.? 172.18.0.200:0/1038640921' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 28 04:53:18 localhost ceph-mon[292954]: Activating manager daemon np0005538515.yfkzhl Nov 28 04:53:18 localhost ceph-mon[292954]: from='client.? 
172.18.0.200:0/1038640921' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Nov 28 04:53:18 localhost ceph-mon[292954]: Manager daemon np0005538515.yfkzhl is now available Nov 28 04:53:18 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/mirror_snapshot_schedule"} : dispatch Nov 28 04:53:18 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/mirror_snapshot_schedule"} : dispatch Nov 28 04:53:18 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/trash_purge_schedule"} : dispatch Nov 28 04:53:18 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/trash_purge_schedule"} : dispatch Nov 28 04:53:19 localhost systemd[1]: session-65.scope: Deactivated successfully. Nov 28 04:53:19 localhost systemd[1]: session-65.scope: Consumed 1.776s CPU time. Nov 28 04:53:19 localhost systemd-logind[764]: Session 65 logged out. Waiting for processes to exit. Nov 28 04:53:19 localhost systemd-logind[764]: Removed session 65. 
Nov 28 04:53:19 localhost ceph-mon[292954]: mon.np0005538513@4(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:53:19 localhost podman[295472]: 2025-11-28 09:53:19.474549777 +0000 UTC m=+0.097408259 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, vendor=Red Hat, Inc., RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, name=rhceph, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, ceph=True, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55) Nov 28 04:53:19 localhost podman[295472]: 2025-11-28 09:53:19.609374493 +0000 UTC m=+0.232232955 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, version=7, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, io.buildah.version=1.33.12, 
architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, name=rhceph, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.expose-services=, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public) Nov 28 04:53:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 04:53:19 localhost nova_compute[279673]: 2025-11-28 09:53:19.969 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:53:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. 
Nov 28 04:53:20 localhost podman[295548]: 2025-11-28 09:53:20.027608628 +0000 UTC m=+0.117193542 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) Nov 28 04:53:20 localhost podman[295548]: 2025-11-28 09:53:20.067106451 +0000 UTC m=+0.156691385 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2) Nov 28 04:53:20 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 04:53:20 localhost podman[295576]: 2025-11-28 09:53:20.119880816 +0000 UTC m=+0.091126113 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 28 04:53:20 localhost podman[295576]: 2025-11-28 09:53:20.161464305 +0000 UTC m=+0.132709582 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 28 04:53:20 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 04:53:21 localhost ceph-mon[292954]: [28/Nov/2025:09:53:19] ENGINE Bus STARTING Nov 28 04:53:21 localhost ceph-mon[292954]: [28/Nov/2025:09:53:19] ENGINE Serving on https://172.18.0.108:7150 Nov 28 04:53:21 localhost ceph-mon[292954]: [28/Nov/2025:09:53:19] ENGINE Client ('172.18.0.108', 34804) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Nov 28 04:53:21 localhost ceph-mon[292954]: [28/Nov/2025:09:53:19] ENGINE Serving on http://172.18.0.108:8765 Nov 28 04:53:21 localhost ceph-mon[292954]: [28/Nov/2025:09:53:19] ENGINE Bus STARTED Nov 28 04:53:21 localhost ceph-mon[292954]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm) Nov 28 04:53:21 localhost ceph-mon[292954]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm) Nov 28 04:53:21 localhost ceph-mon[292954]: Cluster is now healthy Nov 28 04:53:21 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:21 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:21 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:21 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:21 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:21 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:21 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:21 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:21 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:21 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:22 localhost ceph-mon[292954]: from='mgr.17154 ' 
entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:22 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:22 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd/host:np0005538511", "name": "osd_memory_target"} : dispatch Nov 28 04:53:22 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd/host:np0005538511", "name": "osd_memory_target"} : dispatch Nov 28 04:53:22 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:22 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:22 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd/host:np0005538512", "name": "osd_memory_target"} : dispatch Nov 28 04:53:22 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd/host:np0005538512", "name": "osd_memory_target"} : dispatch Nov 28 04:53:22 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:22 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:22 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Nov 28 04:53:22 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Nov 28 04:53:22 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Nov 28 04:53:22 localhost ceph-mon[292954]: Adjusting osd_memory_target on 
np0005538514.localdomain to 836.6M Nov 28 04:53:22 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Nov 28 04:53:22 localhost ceph-mon[292954]: Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 28 04:53:22 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:22 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:22 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Nov 28 04:53:22 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Nov 28 04:53:22 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Nov 28 04:53:22 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Nov 28 04:53:22 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:22 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:22 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Nov 28 04:53:22 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Nov 28 04:53:22 localhost ceph-mon[292954]: 
from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Nov 28 04:53:22 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Nov 28 04:53:22 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 04:53:23 localhost nova_compute[279673]: 2025-11-28 09:53:23.209 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:53:23 localhost ceph-mon[292954]: Adjusting osd_memory_target on np0005538513.localdomain to 836.6M Nov 28 04:53:23 localhost ceph-mon[292954]: Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 28 04:53:23 localhost ceph-mon[292954]: Adjusting osd_memory_target on np0005538515.localdomain to 836.6M Nov 28 04:53:23 localhost ceph-mon[292954]: Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 28 04:53:23 localhost ceph-mon[292954]: Updating np0005538511.localdomain:/etc/ceph/ceph.conf Nov 28 04:53:23 localhost ceph-mon[292954]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf Nov 28 04:53:23 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf Nov 28 04:53:23 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf Nov 28 04:53:23 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf Nov 28 04:53:23 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:53:23 localhost ceph-mon[292954]: Updating 
np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:53:23 localhost ceph-mon[292954]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:53:23 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:53:23 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:53:24 localhost ceph-mon[292954]: mon.np0005538513@4(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:53:24 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:53:24 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:53:24 localhost ceph-mon[292954]: Updating np0005538512.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:53:24 localhost ceph-mon[292954]: Updating np0005538511.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:53:24 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:53:24 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:24 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:24 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:24 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0. 
Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.522738) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16 Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323604522868, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 973, "num_deletes": 276, "total_data_size": 5014026, "memory_usage": 5158592, "flush_reason": "Manual Compaction"} Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323604548296, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 3201627, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 11506, "largest_seqno": 12474, "table_properties": {"data_size": 3196696, "index_size": 2334, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11990, "raw_average_key_size": 20, "raw_value_size": 3186115, "raw_average_value_size": 5372, "num_data_blocks": 96, "num_entries": 593, "num_filter_entries": 593, "num_deletions": 275, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323593, "oldest_key_time": 1764323593, "file_creation_time": 1764323604, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}} Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 25626 microseconds, and 8212 cpu microseconds. Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.548367) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 3201627 bytes OK Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.548404) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.550912) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.550944) EVENT_LOG_v1 {"time_micros": 1764323604550935, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.550972) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 5008595, prev total WAL file size 
5008595, number of live WAL files 2. Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.552551) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031303231' seq:72057594037927935, type:22 .. '6B760031323936' seq:0, type:0; will stop at (end) Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(3126KB)], [15(14MB)] Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323604552639, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 18016987, "oldest_snapshot_seqno": -1} Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 10142 keys, 16854943 bytes, temperature: kUnknown Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323604669911, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 16854943, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16797320, "index_size": 31154, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25413, "raw_key_size": 271820, "raw_average_key_size": 26, "raw_value_size": 16624127, 
"raw_average_value_size": 1639, "num_data_blocks": 1176, "num_entries": 10142, "num_filter_entries": 10142, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764323604, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}} Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.670482) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 16854943 bytes Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.673425) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 153.2 rd, 143.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.1, 14.1 +0.0 blob) out(16.1 +0.0 blob), read-write-amplify(10.9) write-amplify(5.3) OK, records in: 10725, records dropped: 583 output_compression: NoCompression Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.673464) EVENT_LOG_v1 {"time_micros": 1764323604673448, "job": 6, "event": "compaction_finished", "compaction_time_micros": 117591, "compaction_time_cpu_micros": 46771, "output_level": 6, "num_output_files": 1, "total_output_size": 16854943, "num_input_records": 10725, "num_output_records": 10142, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323604674326, "job": 6, "event": "table_file_deletion", "file_number": 17} Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323604677170, 
"job": 6, "event": "table_file_deletion", "file_number": 15} Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.552416) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.677303) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.677312) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.677316) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.677319) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:53:24 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:24.677323) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:53:25 localhost nova_compute[279673]: 2025-11-28 09:53:25.002 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:53:25 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:53:25 localhost ceph-mon[292954]: Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:53:25 localhost ceph-mon[292954]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:53:25 localhost ceph-mon[292954]: Updating 
np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:53:25 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:53:25 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:25 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:25 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:25 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:25 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:25 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:25 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:25 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:25 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 28 04:53:26 localhost ceph-mon[292954]: Reconfiguring mon.np0005538511 (monmap changed)... Nov 28 04:53:26 localhost ceph-mon[292954]: Reconfiguring daemon mon.np0005538511 on np0005538511.localdomain Nov 28 04:53:26 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:26 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:26 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 28 04:53:27 localhost ceph-mon[292954]: Reconfiguring mon.np0005538512 (monmap changed)... 
Nov 28 04:53:27 localhost ceph-mon[292954]: Reconfiguring daemon mon.np0005538512 on np0005538512.localdomain Nov 28 04:53:27 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:27 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:27 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Nov 28 04:53:28 localhost nova_compute[279673]: 2025-11-28 09:53:28.210 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:53:28 localhost ceph-mon[292954]: Reconfiguring daemon osd.1 on np0005538515.localdomain Nov 28 04:53:28 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:28 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:28 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:28 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:28 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Nov 28 04:53:28 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:29 localhost ceph-mon[292954]: mon.np0005538513@4(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:53:29 localhost systemd[1]: Stopping User Manager for UID 1003... Nov 28 04:53:29 localhost systemd[290538]: Activating special unit Exit the Session... Nov 28 04:53:29 localhost systemd[290538]: Stopped target Main User Target. Nov 28 04:53:29 localhost systemd[290538]: Stopped target Basic System. Nov 28 04:53:29 localhost systemd[290538]: Stopped target Paths. 
Nov 28 04:53:29 localhost systemd[290538]: Stopped target Sockets. Nov 28 04:53:29 localhost systemd[290538]: Stopped target Timers. Nov 28 04:53:29 localhost systemd[290538]: Stopped Mark boot as successful after the user session has run 2 minutes. Nov 28 04:53:29 localhost systemd[290538]: Stopped Daily Cleanup of User's Temporary Directories. Nov 28 04:53:29 localhost systemd[290538]: Closed D-Bus User Message Bus Socket. Nov 28 04:53:29 localhost systemd[290538]: Stopped Create User's Volatile Files and Directories. Nov 28 04:53:29 localhost systemd[290538]: Removed slice User Application Slice. Nov 28 04:53:29 localhost systemd[290538]: Reached target Shutdown. Nov 28 04:53:29 localhost systemd[290538]: Finished Exit the Session. Nov 28 04:53:29 localhost systemd[290538]: Reached target Exit the Session. Nov 28 04:53:29 localhost systemd[1]: user@1003.service: Deactivated successfully. Nov 28 04:53:29 localhost systemd[1]: Stopped User Manager for UID 1003. Nov 28 04:53:29 localhost systemd[1]: Stopping User Runtime Directory /run/user/1003... Nov 28 04:53:29 localhost systemd[1]: run-user-1003.mount: Deactivated successfully. Nov 28 04:53:29 localhost systemd[1]: user-runtime-dir@1003.service: Deactivated successfully. Nov 28 04:53:29 localhost systemd[1]: Stopped User Runtime Directory /run/user/1003. Nov 28 04:53:29 localhost systemd[1]: Removed slice User Slice of UID 1003. Nov 28 04:53:29 localhost systemd[1]: user-1003.slice: Consumed 2.408s CPU time. 
Nov 28 04:53:29 localhost ceph-mon[292954]: Reconfiguring daemon osd.4 on np0005538515.localdomain Nov 28 04:53:29 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:29 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:29 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:29 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:29 localhost ceph-mon[292954]: Reconfiguring mon.np0005538515 (monmap changed)... Nov 28 04:53:29 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 28 04:53:29 localhost ceph-mon[292954]: Reconfiguring daemon mon.np0005538515 on np0005538515.localdomain Nov 28 04:53:30 localhost nova_compute[279673]: 2025-11-28 09:53:30.004 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:53:30 localhost nova_compute[279673]: 2025-11-28 09:53:30.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:53:30 localhost nova_compute[279673]: 2025-11-28 09:53:30.770 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 28 04:53:31 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:31 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:31 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 04:53:31 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 04:53:31 localhost nova_compute[279673]: 2025-11-28 09:53:31.767 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:53:31 localhost nova_compute[279673]: 2025-11-28 09:53:31.769 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:53:31 localhost systemd[1]: tmp-crun.TyCxMF.mount: Deactivated successfully. 
Nov 28 04:53:31 localhost podman[296437]: 2025-11-28 09:53:31.860039345 +0000 UTC m=+0.095045075 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, version=9.6, config_id=edpm, vcs-type=git, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.buildah.version=1.33.7, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Nov 28 04:53:31 localhost podman[296437]: 2025-11-28 09:53:31.876533126 +0000 UTC m=+0.111538816 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, container_name=openstack_network_exporter, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.33.7, release=1755695350, distribution-scope=public, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Nov 28 04:53:31 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 04:53:32 localhost nova_compute[279673]: 2025-11-28 09:53:32.772 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:53:33 localhost nova_compute[279673]: 2025-11-28 09:53:33.247 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:53:33 localhost nova_compute[279673]: 2025-11-28 09:53:33.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:53:33 localhost nova_compute[279673]: 2025-11-28 09:53:33.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:53:33 localhost nova_compute[279673]: 2025-11-28 09:53:33.772 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:53:33 localhost ceph-mgr[286105]: ms_deliver_dispatch: unhandled message 0x560b917b9600 mon_map magic: 0 from mon.0 v2:172.18.0.105:3300/0 Nov 28 04:53:33 localhost ceph-mon[292954]: mon.np0005538513@4(peon) e10 my rank is now 3 (was 4) Nov 28 04:53:33 localhost ceph-mon[292954]: log_channel(cluster) log [INF] : mon.np0005538513 calling monitor election Nov 28 04:53:33 localhost ceph-mon[292954]: paxos.3).electionLogic(38) init, last seen epoch 38 Nov 28 04:53:33 
localhost ceph-mon[292954]: mon.np0005538513@3(electing) e10 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 28 04:53:33 localhost ceph-mon[292954]: mon.np0005538513@3(electing) e10 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 28 04:53:34 localhost ceph-mon[292954]: mon.np0005538513@3(electing) e10 handle_auth_request failed to assign global_id Nov 28 04:53:34 localhost ceph-mon[292954]: mon.np0005538513@3(electing) e10 handle_auth_request failed to assign global_id Nov 28 04:53:34 localhost ceph-mon[292954]: mon.np0005538513@3(electing) e10 handle_auth_request failed to assign global_id Nov 28 04:53:34 localhost ceph-mon[292954]: mon.np0005538513@3(electing) e10 handle_auth_request failed to assign global_id Nov 28 04:53:35 localhost nova_compute[279673]: 2025-11-28 09:53:35.038 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:53:35 localhost ceph-mon[292954]: mon.np0005538513@3(electing) e10 handle_auth_request failed to assign global_id Nov 28 04:53:35 localhost ceph-mon[292954]: mon.np0005538513@3(electing) e10 handle_auth_request failed to assign global_id Nov 28 04:53:35 localhost ceph-mon[292954]: mon.np0005538513@3(electing) e10 handle_auth_request failed to assign global_id Nov 28 04:53:35 localhost nova_compute[279673]: 2025-11-28 09:53:35.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:53:35 localhost nova_compute[279673]: 2025-11-28 09:53:35.793 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:53:35 localhost nova_compute[279673]: 2025-11-28 09:53:35.793 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:53:35 localhost nova_compute[279673]: 2025-11-28 09:53:35.794 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:53:35 localhost nova_compute[279673]: 2025-11-28 09:53:35.794 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 04:53:35 localhost nova_compute[279673]: 2025-11-28 09:53:35.795 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:53:35 localhost ceph-mon[292954]: mon.np0005538513@3(electing) e10 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 28 04:53:35 localhost ceph-mon[292954]: mon.np0005538513@3(peon) e10 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 28 04:53:35 
localhost ceph-mon[292954]: Remove daemons mon.np0005538511 Nov 28 04:53:35 localhost ceph-mon[292954]: Safe to remove mon.np0005538511: new quorum should be ['np0005538512', 'np0005538515', 'np0005538514', 'np0005538513'] (from ['np0005538512', 'np0005538515', 'np0005538514', 'np0005538513']) Nov 28 04:53:35 localhost ceph-mon[292954]: Removing monitor np0005538511 from monmap... Nov 28 04:53:35 localhost ceph-mon[292954]: Removing daemon mon.np0005538511 from np0005538511.localdomain -- ports [] Nov 28 04:53:35 localhost ceph-mon[292954]: mon.np0005538513 calling monitor election Nov 28 04:53:35 localhost ceph-mon[292954]: mon.np0005538515 calling monitor election Nov 28 04:53:35 localhost ceph-mon[292954]: mon.np0005538514 calling monitor election Nov 28 04:53:35 localhost ceph-mon[292954]: mon.np0005538512 calling monitor election Nov 28 04:53:35 localhost ceph-mon[292954]: mon.np0005538512 is new leader, mons np0005538512,np0005538515,np0005538514,np0005538513 in quorum (ranks 0,1,2,3) Nov 28 04:53:35 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 04:53:35 localhost ceph-mon[292954]: overall HEALTH_OK Nov 28 04:53:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. 
Nov 28 04:53:36 localhost podman[296531]: 2025-11-28 09:53:36.279385309 +0000 UTC m=+0.094519441 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 04:53:36 localhost nova_compute[279673]: 2025-11-28 09:53:36.307 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.513s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:53:36 localhost podman[296531]: 2025-11-28 09:53:36.316565203 +0000 UTC m=+0.131699325 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 
'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 04:53:36 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 04:53:36 localhost nova_compute[279673]: 2025-11-28 09:53:36.384 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:53:36 localhost nova_compute[279673]: 2025-11-28 09:53:36.385 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:53:36 localhost ceph-mon[292954]: mon.np0005538513@3(peon) e10 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 04:53:36 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2640763287' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 04:53:36 localhost nova_compute[279673]: 2025-11-28 09:53:36.634 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 04:53:36 localhost nova_compute[279673]: 2025-11-28 09:53:36.636 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11810MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": 
"7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 04:53:36 localhost nova_compute[279673]: 2025-11-28 09:53:36.636 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:53:36 localhost nova_compute[279673]: 2025-11-28 09:53:36.636 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:53:36 localhost nova_compute[279673]: 2025-11-28 09:53:36.746 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 04:53:36 localhost nova_compute[279673]: 2025-11-28 09:53:36.746 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 04:53:36 localhost nova_compute[279673]: 2025-11-28 09:53:36.747 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 04:53:36 localhost nova_compute[279673]: 2025-11-28 09:53:36.818 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:53:37 localhost ceph-mon[292954]: Updating np0005538511.localdomain:/etc/ceph/ceph.conf Nov 28 04:53:37 localhost ceph-mon[292954]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf Nov 28 04:53:37 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf Nov 28 04:53:37 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf Nov 28 04:53:37 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf Nov 28 04:53:37 localhost ceph-mon[292954]: Updating np0005538511.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:53:37 localhost ceph-mon[292954]: mon.np0005538513@3(peon) e10 handle_command mon_command({"prefix": "df", "format": 
"json"} v 0) Nov 28 04:53:37 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/953514327' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 04:53:37 localhost nova_compute[279673]: 2025-11-28 09:53:37.268 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:53:37 localhost nova_compute[279673]: 2025-11-28 09:53:37.276 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 04:53:37 localhost nova_compute[279673]: 2025-11-28 09:53:37.291 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 04:53:37 localhost nova_compute[279673]: 2025-11-28 09:53:37.294 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource 
/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 04:53:37 localhost nova_compute[279673]: 2025-11-28 09:53:37.295 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.659s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:53:38 localhost nova_compute[279673]: 2025-11-28 09:53:38.250 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:53:38 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:53:38 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:53:38 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:53:38 localhost ceph-mon[292954]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:53:38 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:38 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:38 localhost ceph-mon[292954]: Removed label mon from host np0005538511.localdomain Nov 28 04:53:38 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:38 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:38 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:38 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:38 localhost ceph-mon[292954]: from='mgr.17154 ' 
entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:38 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:38 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:38 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:38 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:38 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:38 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538511.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:53:38 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538511.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:53:38 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:38 localhost nova_compute[279673]: 2025-11-28 09:53:38.291 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:53:38 localhost nova_compute[279673]: 2025-11-28 09:53:38.318 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:53:38 localhost nova_compute[279673]: 2025-11-28 09:53:38.318 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache 
_heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 28 04:53:38 localhost nova_compute[279673]: 2025-11-28 09:53:38.318 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 28 04:53:39 localhost nova_compute[279673]: 2025-11-28 09:53:39.018 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 04:53:39 localhost nova_compute[279673]: 2025-11-28 09:53:39.018 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 04:53:39 localhost nova_compute[279673]: 2025-11-28 09:53:39.019 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 28 04:53:39 localhost nova_compute[279673]: 2025-11-28 09:53:39.019 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 04:53:39 localhost nova_compute[279673]: 2025-11-28 09:53:39.417 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: 
[{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 04:53:39 localhost nova_compute[279673]: 2025-11-28 09:53:39.443 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 04:53:39 localhost nova_compute[279673]: 2025-11-28 09:53:39.443 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 28 04:53:39 localhost ceph-mon[292954]: mon.np0005538513@3(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 
inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:53:39 localhost ceph-mon[292954]: Reconfiguring crash.np0005538511 (monmap changed)... Nov 28 04:53:39 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538511 on np0005538511.localdomain Nov 28 04:53:39 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:39 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:39 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538511.fvuybw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:53:39 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538511.fvuybw", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:53:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:53:39 localhost systemd[1]: tmp-crun.cMPeBd.mount: Deactivated successfully. 
Nov 28 04:53:39 localhost podman[296863]: 2025-11-28 09:53:39.851287086 +0000 UTC m=+0.085202584 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent) Nov 28 04:53:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:53:39 localhost podman[296863]: 2025-11-28 09:53:39.893544496 +0000 UTC m=+0.127459954 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:53:39 localhost systemd[1]: 
ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:53:39 localhost podman[296881]: 2025-11-28 09:53:39.954791172 +0000 UTC m=+0.082500741 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS) Nov 28 04:53:39 localhost podman[296881]: 2025-11-28 09:53:39.997861777 +0000 UTC m=+0.125571356 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, tcib_managed=true) Nov 28 04:53:40 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 04:53:40 localhost nova_compute[279673]: 2025-11-28 09:53:40.040 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:53:40 localhost podman[238687]: time="2025-11-28T09:53:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 04:53:40 localhost podman[238687]: @ - - [28/Nov/2025:09:53:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 28 04:53:40 localhost podman[238687]: @ - - [28/Nov/2025:09:53:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18720 "" "Go-http-client/1.1" Nov 28 04:53:40 localhost ceph-mon[292954]: Reconfiguring mgr.np0005538511.fvuybw (monmap changed)... Nov 28 04:53:40 localhost ceph-mon[292954]: Reconfiguring daemon mgr.np0005538511.fvuybw on np0005538511.localdomain Nov 28 04:53:40 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:40 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:40 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 28 04:53:41 localhost ceph-mon[292954]: Reconfiguring mon.np0005538512 (monmap changed)... 
Nov 28 04:53:41 localhost ceph-mon[292954]: Reconfiguring daemon mon.np0005538512 on np0005538512.localdomain Nov 28 04:53:41 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:41 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:41 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:53:41 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:53:41 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:41 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:41 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:53:41 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:53:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. 
Nov 28 04:53:41 localhost podman[296906]: 2025-11-28 09:53:41.81727773 +0000 UTC m=+0.061221555 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 04:53:41 localhost podman[296906]: 2025-11-28 09:53:41.828632109 +0000 UTC m=+0.072575924 container exec_died 
1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Nov 28 04:53:41 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. Nov 28 04:53:42 localhost ceph-mon[292954]: Reconfiguring mgr.np0005538512.zyhkxs (monmap changed)... 
Nov 28 04:53:42 localhost ceph-mon[292954]: Reconfiguring daemon mgr.np0005538512.zyhkxs on np0005538512.localdomain Nov 28 04:53:42 localhost ceph-mon[292954]: Reconfiguring crash.np0005538512 (monmap changed)... Nov 28 04:53:42 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain Nov 28 04:53:42 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:42 localhost ceph-mon[292954]: Removed label mgr from host np0005538511.localdomain Nov 28 04:53:42 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:42 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:42 localhost ceph-mon[292954]: Reconfiguring crash.np0005538513 (monmap changed)... Nov 28 04:53:42 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:53:42 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:53:42 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain Nov 28 04:53:43 localhost podman[296978]: Nov 28 04:53:43 localhost podman[296978]: 2025-11-28 09:53:43.090064578 +0000 UTC m=+0.067184659 container create d7bf7b5eec17bbbd3a16388fb0a3c80f9f241b2a2c2b0db19c401852c231658c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_curie, vendor=Red Hat, Inc., ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, architecture=x86_64, io.openshift.expose-services=, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, GIT_BRANCH=main, version=7, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , GIT_CLEAN=True) Nov 28 04:53:43 localhost systemd[1]: Started libpod-conmon-d7bf7b5eec17bbbd3a16388fb0a3c80f9f241b2a2c2b0db19c401852c231658c.scope. Nov 28 04:53:43 localhost systemd[1]: Started libcrun container. 
Nov 28 04:53:43 localhost podman[296978]: 2025-11-28 09:53:43.064827031 +0000 UTC m=+0.041947122 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:53:43 localhost podman[296978]: 2025-11-28 09:53:43.178485299 +0000 UTC m=+0.155605380 container init d7bf7b5eec17bbbd3a16388fb0a3c80f9f241b2a2c2b0db19c401852c231658c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_curie, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , RELEASE=main, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, architecture=x86_64, name=rhceph, version=7, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.openshift.expose-services=) Nov 28 04:53:43 localhost podman[296978]: 2025-11-28 09:53:43.190920692 +0000 UTC m=+0.168040773 container start d7bf7b5eec17bbbd3a16388fb0a3c80f9f241b2a2c2b0db19c401852c231658c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_curie, build-date=2025-09-24T08:57:55, release=553, com.redhat.component=rhceph-container, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, 
io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.openshift.tags=rhceph ceph, RELEASE=main, ceph=True) Nov 28 04:53:43 localhost podman[296978]: 2025-11-28 09:53:43.191197651 +0000 UTC m=+0.168317772 container attach d7bf7b5eec17bbbd3a16388fb0a3c80f9f241b2a2c2b0db19c401852c231658c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_curie, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, release=553, io.openshift.tags=rhceph ceph, version=7, name=rhceph, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , io.openshift.expose-services=, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, vendor=Red Hat, Inc., distribution-scope=public) Nov 28 
04:53:43 localhost condescending_curie[296993]: 167 167 Nov 28 04:53:43 localhost systemd[1]: libpod-d7bf7b5eec17bbbd3a16388fb0a3c80f9f241b2a2c2b0db19c401852c231658c.scope: Deactivated successfully. Nov 28 04:53:43 localhost podman[296978]: 2025-11-28 09:53:43.195497202 +0000 UTC m=+0.172617313 container died d7bf7b5eec17bbbd3a16388fb0a3c80f9f241b2a2c2b0db19c401852c231658c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_curie, ceph=True, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, build-date=2025-09-24T08:57:55) Nov 28 04:53:43 localhost nova_compute[279673]: 2025-11-28 09:53:43.285 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:53:43 localhost podman[296998]: 2025-11-28 09:53:43.327457464 +0000 UTC m=+0.120873111 container remove d7bf7b5eec17bbbd3a16388fb0a3c80f9f241b2a2c2b0db19c401852c231658c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_curie, build-date=2025-09-24T08:57:55, version=7, 
vcs-type=git, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.openshift.tags=rhceph ceph, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, name=rhceph, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, io.openshift.expose-services=) Nov 28 04:53:43 localhost systemd[1]: libpod-conmon-d7bf7b5eec17bbbd3a16388fb0a3c80f9f241b2a2c2b0db19c401852c231658c.scope: Deactivated successfully. Nov 28 04:53:43 localhost ceph-mon[292954]: Removed label _admin from host np0005538511.localdomain Nov 28 04:53:43 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:43 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:43 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:43 localhost ceph-mon[292954]: Reconfiguring osd.2 (monmap changed)... 
Nov 28 04:53:43 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Nov 28 04:53:43 localhost ceph-mon[292954]: Reconfiguring daemon osd.2 on np0005538513.localdomain Nov 28 04:53:44 localhost podman[297067]: Nov 28 04:53:44 localhost podman[297067]: 2025-11-28 09:53:44.038078268 +0000 UTC m=+0.079610341 container create ad1222987a0e7ff30aab98dfa8bd0440036d29f869983b2a6410a64ccd05aa3a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_kare, io.buildah.version=1.33.12, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, vcs-type=git, maintainer=Guillaume Abrioux , ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_CLEAN=True, name=rhceph, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, RELEASE=main, version=7, distribution-scope=public, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 28 04:53:44 localhost systemd[1]: Started libpod-conmon-ad1222987a0e7ff30aab98dfa8bd0440036d29f869983b2a6410a64ccd05aa3a.scope. Nov 28 04:53:44 localhost systemd[1]: Started libcrun container. Nov 28 04:53:44 localhost systemd[1]: var-lib-containers-storage-overlay-563384d2cbbb14240e8678f8ea08988afbfcedf1cbab013f9542294f82232c37-merged.mount: Deactivated successfully. 
Nov 28 04:53:44 localhost podman[297067]: 2025-11-28 09:53:44.097543458 +0000 UTC m=+0.139075531 container init ad1222987a0e7ff30aab98dfa8bd0440036d29f869983b2a6410a64ccd05aa3a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_kare, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, CEPH_POINT_RELEASE=, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, version=7, GIT_BRANCH=main, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , vcs-type=git, release=553, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.buildah.version=1.33.12, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64) Nov 28 04:53:44 localhost podman[297067]: 2025-11-28 09:53:44.006043142 +0000 UTC m=+0.047575255 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:53:44 localhost podman[297067]: 2025-11-28 09:53:44.111533829 +0000 UTC m=+0.153065902 container start ad1222987a0e7ff30aab98dfa8bd0440036d29f869983b2a6410a64ccd05aa3a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_kare, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, maintainer=Guillaume Abrioux , RELEASE=main, 
com.redhat.component=rhceph-container, distribution-scope=public, CEPH_POINT_RELEASE=, ceph=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph) Nov 28 04:53:44 localhost podman[297067]: 2025-11-28 09:53:44.111804207 +0000 UTC m=+0.153336330 container attach ad1222987a0e7ff30aab98dfa8bd0440036d29f869983b2a6410a64ccd05aa3a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_kare, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , release=553, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.buildah.version=1.33.12, GIT_BRANCH=main, vcs-type=git, RELEASE=main) 
Nov 28 04:53:44 localhost affectionate_kare[297082]: 167 167 Nov 28 04:53:44 localhost systemd[1]: libpod-ad1222987a0e7ff30aab98dfa8bd0440036d29f869983b2a6410a64ccd05aa3a.scope: Deactivated successfully. Nov 28 04:53:44 localhost podman[297067]: 2025-11-28 09:53:44.114627654 +0000 UTC m=+0.156159767 container died ad1222987a0e7ff30aab98dfa8bd0440036d29f869983b2a6410a64ccd05aa3a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_kare, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, release=553, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, architecture=x86_64, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., RELEASE=main, vcs-type=git, CEPH_POINT_RELEASE=, GIT_CLEAN=True, distribution-scope=public, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 28 04:53:44 localhost systemd[1]: var-lib-containers-storage-overlay-2fcb21b62f658603a8af6e6a4cdac3e228de3cab9d5d300153769dc45ec13c68-merged.mount: Deactivated successfully. 
Nov 28 04:53:44 localhost podman[297087]: 2025-11-28 09:53:44.207287017 +0000 UTC m=+0.082425679 container remove ad1222987a0e7ff30aab98dfa8bd0440036d29f869983b2a6410a64ccd05aa3a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_kare, RELEASE=main, io.openshift.tags=rhceph ceph, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, version=7, distribution-scope=public, io.buildah.version=1.33.12, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_CLEAN=True, vcs-type=git, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, ceph=True, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 28 04:53:44 localhost systemd[1]: libpod-conmon-ad1222987a0e7ff30aab98dfa8bd0440036d29f869983b2a6410a64ccd05aa3a.scope: Deactivated successfully. Nov 28 04:53:44 localhost ceph-mon[292954]: mon.np0005538513@3(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0. 
Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.522258) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19 Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323624522345, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 1080, "num_deletes": 258, "total_data_size": 1715108, "memory_usage": 1736928, "flush_reason": "Manual Compaction"} Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323624531662, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 995869, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12479, "largest_seqno": 13554, "table_properties": {"data_size": 990623, "index_size": 2589, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 13555, "raw_average_key_size": 21, "raw_value_size": 979270, "raw_average_value_size": 1559, "num_data_blocks": 108, "num_entries": 628, "num_filter_entries": 628, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323604, "oldest_key_time": 1764323604, "file_creation_time": 1764323624, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}} Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 9444 microseconds, and 3944 cpu microseconds. Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.531710) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 995869 bytes OK Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.531736) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.534509) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.534531) EVENT_LOG_v1 {"time_micros": 1764323624534525, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.534553) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 1709330, prev total WAL file size 
1709654, number of live WAL files 2. Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.535174) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033353138' seq:72057594037927935, type:22 .. '6C6F676D0033373731' seq:0, type:0; will stop at (end) Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(972KB)], [18(16MB)] Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323624535219, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 17850812, "oldest_snapshot_seqno": -1} Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 10221 keys, 17704717 bytes, temperature: kUnknown Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323624643761, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 17704717, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17644988, "index_size": 33068, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25605, "raw_key_size": 275182, "raw_average_key_size": 26, "raw_value_size": 17468822, 
"raw_average_value_size": 1709, "num_data_blocks": 1256, "num_entries": 10221, "num_filter_entries": 10221, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764323624, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}} Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.644132) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 17704717 bytes Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.658237) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 164.3 rd, 163.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 16.1 +0.0 blob) out(16.9 +0.0 blob), read-write-amplify(35.7) write-amplify(17.8) OK, records in: 10770, records dropped: 549 output_compression: NoCompression Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.658295) EVENT_LOG_v1 {"time_micros": 1764323624658273, "job": 8, "event": "compaction_finished", "compaction_time_micros": 108638, "compaction_time_cpu_micros": 43984, "output_level": 6, "num_output_files": 1, "total_output_size": 17704717, "num_input_records": 10770, "num_output_records": 10221, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323624658716, "job": 8, "event": "table_file_deletion", "file_number": 20} Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323624661522, 
"job": 8, "event": "table_file_deletion", "file_number": 18} Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.535100) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.661570) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.661582) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.661586) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.661589) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:53:44 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:53:44.661592) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:53:45 localhost podman[297163]: Nov 28 04:53:45 localhost podman[297163]: 2025-11-28 09:53:45.039955607 +0000 UTC m=+0.074262837 container create 37f0706866e6fe29256bcb02ec20b78740a4542cf23de399abd5d0def94bc632 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_herschel, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, RELEASE=main, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, 
distribution-scope=public, ceph=True, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12) Nov 28 04:53:45 localhost nova_compute[279673]: 2025-11-28 09:53:45.082 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:53:45 localhost systemd[1]: Started libpod-conmon-37f0706866e6fe29256bcb02ec20b78740a4542cf23de399abd5d0def94bc632.scope. Nov 28 04:53:45 localhost podman[297163]: 2025-11-28 09:53:45.009447838 +0000 UTC m=+0.043755128 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:53:45 localhost systemd[1]: Started libcrun container. 
Nov 28 04:53:45 localhost podman[297163]: 2025-11-28 09:53:45.125588262 +0000 UTC m=+0.159895492 container init 37f0706866e6fe29256bcb02ec20b78740a4542cf23de399abd5d0def94bc632 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_herschel, name=rhceph, architecture=x86_64, release=553, distribution-scope=public, io.openshift.tags=rhceph ceph, vcs-type=git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True) Nov 28 04:53:45 localhost reverent_herschel[297178]: 167 167 Nov 28 04:53:45 localhost podman[297163]: 2025-11-28 09:53:45.135752676 +0000 UTC m=+0.170059886 container start 37f0706866e6fe29256bcb02ec20b78740a4542cf23de399abd5d0def94bc632 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_herschel, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, name=rhceph, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, version=7, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public) Nov 28 04:53:45 localhost systemd[1]: libpod-37f0706866e6fe29256bcb02ec20b78740a4542cf23de399abd5d0def94bc632.scope: Deactivated successfully. Nov 28 04:53:45 localhost podman[297163]: 2025-11-28 09:53:45.136474678 +0000 UTC m=+0.170781948 container attach 37f0706866e6fe29256bcb02ec20b78740a4542cf23de399abd5d0def94bc632 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_herschel, vendor=Red Hat, Inc., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vcs-type=git, RELEASE=main, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, 
release=553, architecture=x86_64) Nov 28 04:53:45 localhost podman[297163]: 2025-11-28 09:53:45.13915773 +0000 UTC m=+0.173464960 container died 37f0706866e6fe29256bcb02ec20b78740a4542cf23de399abd5d0def94bc632 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_herschel, distribution-scope=public, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=) Nov 28 04:53:45 localhost podman[297184]: 2025-11-28 09:53:45.232614078 +0000 UTC m=+0.082814431 container remove 37f0706866e6fe29256bcb02ec20b78740a4542cf23de399abd5d0def94bc632 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_herschel, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, io.openshift.expose-services=, 
release=553, description=Red Hat Ceph Storage 7, ceph=True, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, com.redhat.component=rhceph-container, version=7, maintainer=Guillaume Abrioux , architecture=x86_64, vcs-type=git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 28 04:53:45 localhost systemd[1]: libpod-conmon-37f0706866e6fe29256bcb02ec20b78740a4542cf23de399abd5d0def94bc632.scope: Deactivated successfully. Nov 28 04:53:45 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:45 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:45 localhost ceph-mon[292954]: Reconfiguring osd.5 (monmap changed)... 
Nov 28 04:53:45 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Nov 28 04:53:45 localhost ceph-mon[292954]: Reconfiguring daemon osd.5 on np0005538513.localdomain Nov 28 04:53:46 localhost podman[297262]: Nov 28 04:53:46 localhost podman[297262]: 2025-11-28 09:53:46.075389268 +0000 UTC m=+0.080487528 container create fc0c45a9abc5bb0b60cfa321d2bc380caf33a24a1a713d6aba3b68529e475572 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_kowalevski, maintainer=Guillaume Abrioux , vcs-type=git, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhceph ceph, name=rhceph, GIT_CLEAN=True, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64) Nov 28 04:53:46 localhost systemd[1]: Started libpod-conmon-fc0c45a9abc5bb0b60cfa321d2bc380caf33a24a1a713d6aba3b68529e475572.scope. Nov 28 04:53:46 localhost systemd[1]: var-lib-containers-storage-overlay-0e9d2d9919f59f56c334681e7bbcd44f2d9e5beb67b9e273697c599e26f341df-merged.mount: Deactivated successfully. Nov 28 04:53:46 localhost systemd[1]: Started libcrun container. 
Nov 28 04:53:46 localhost podman[297262]: 2025-11-28 09:53:46.044285011 +0000 UTC m=+0.049383301 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:53:46 localhost podman[297262]: 2025-11-28 09:53:46.147013743 +0000 UTC m=+0.152115883 container init fc0c45a9abc5bb0b60cfa321d2bc380caf33a24a1a713d6aba3b68529e475572 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_kowalevski, CEPH_POINT_RELEASE=, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, version=7, release=553, vcs-type=git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , name=rhceph, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., com.redhat.component=rhceph-container) Nov 28 04:53:46 localhost podman[297262]: 2025-11-28 09:53:46.156102483 +0000 UTC m=+0.161200733 container start fc0c45a9abc5bb0b60cfa321d2bc380caf33a24a1a713d6aba3b68529e475572 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_kowalevski, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, 
version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, RELEASE=main, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=rhceph-container, vcs-type=git, io.buildah.version=1.33.12, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 28 04:53:46 localhost nice_kowalevski[297277]: 167 167 Nov 28 04:53:46 localhost podman[297262]: 2025-11-28 09:53:46.156317789 +0000 UTC m=+0.161416049 container attach fc0c45a9abc5bb0b60cfa321d2bc380caf33a24a1a713d6aba3b68529e475572 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_kowalevski, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_BRANCH=main, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, GIT_CLEAN=True, CEPH_POINT_RELEASE=, description=Red Hat Ceph 
Storage 7, io.buildah.version=1.33.12) Nov 28 04:53:46 localhost systemd[1]: libpod-fc0c45a9abc5bb0b60cfa321d2bc380caf33a24a1a713d6aba3b68529e475572.scope: Deactivated successfully. Nov 28 04:53:46 localhost podman[297262]: 2025-11-28 09:53:46.162660645 +0000 UTC m=+0.167758945 container died fc0c45a9abc5bb0b60cfa321d2bc380caf33a24a1a713d6aba3b68529e475572 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_kowalevski, RELEASE=main, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, GIT_BRANCH=main, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, architecture=x86_64, ceph=True, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 28 04:53:46 localhost podman[297282]: 2025-11-28 09:53:46.255974397 +0000 UTC m=+0.086506354 container remove fc0c45a9abc5bb0b60cfa321d2bc380caf33a24a1a713d6aba3b68529e475572 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_kowalevski, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, 
com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git, distribution-scope=public, name=rhceph, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , release=553, GIT_BRANCH=main, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, io.openshift.expose-services=, version=7, RELEASE=main) Nov 28 04:53:46 localhost systemd[1]: libpod-conmon-fc0c45a9abc5bb0b60cfa321d2bc380caf33a24a1a713d6aba3b68529e475572.scope: Deactivated successfully. Nov 28 04:53:46 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:46 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:46 localhost ceph-mon[292954]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)... 
Nov 28 04:53:46 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:53:46 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:53:46 localhost ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain Nov 28 04:53:46 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:46 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:46 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:53:46 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:53:46 localhost podman[297351]: Nov 28 04:53:46 localhost podman[297351]: 2025-11-28 09:53:46.990772505 +0000 UTC m=+0.078939281 container create 9a731305e1dff5ee0b9003a006d9a9e4ac90ce2fa8c40566b46cc7fac4ce4b53 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_shirley, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, 
com.redhat.component=rhceph-container, release=553, GIT_BRANCH=main, vcs-type=git, RELEASE=main, maintainer=Guillaume Abrioux , version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, GIT_CLEAN=True, name=rhceph, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.openshift.expose-services=, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 28 04:53:47 localhost systemd[1]: Started libpod-conmon-9a731305e1dff5ee0b9003a006d9a9e4ac90ce2fa8c40566b46cc7fac4ce4b53.scope. Nov 28 04:53:47 localhost systemd[1]: Started libcrun container. Nov 28 04:53:47 localhost podman[297351]: 2025-11-28 09:53:47.051853925 +0000 UTC m=+0.140020691 container init 9a731305e1dff5ee0b9003a006d9a9e4ac90ce2fa8c40566b46cc7fac4ce4b53 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_shirley, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vcs-type=git, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, 
CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, RELEASE=main, distribution-scope=public, name=rhceph) Nov 28 04:53:47 localhost podman[297351]: 2025-11-28 09:53:46.957325005 +0000 UTC m=+0.045491801 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:53:47 localhost podman[297351]: 2025-11-28 09:53:47.061420569 +0000 UTC m=+0.149587325 container start 9a731305e1dff5ee0b9003a006d9a9e4ac90ce2fa8c40566b46cc7fac4ce4b53 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_shirley, architecture=x86_64, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.buildah.version=1.33.12, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, RELEASE=main, distribution-scope=public, release=553, version=7, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, io.openshift.expose-services=, maintainer=Guillaume Abrioux , GIT_BRANCH=main, vendor=Red Hat, Inc., ceph=True) Nov 28 04:53:47 localhost podman[297351]: 2025-11-28 09:53:47.061744949 +0000 UTC m=+0.149911745 container attach 9a731305e1dff5ee0b9003a006d9a9e4ac90ce2fa8c40566b46cc7fac4ce4b53 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_shirley, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, version=7, 
io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, distribution-scope=public, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, name=rhceph, io.openshift.expose-services=, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Nov 28 04:53:47 localhost modest_shirley[297366]: 167 167 Nov 28 04:53:47 localhost systemd[1]: libpod-9a731305e1dff5ee0b9003a006d9a9e4ac90ce2fa8c40566b46cc7fac4ce4b53.scope: Deactivated successfully. 
Nov 28 04:53:47 localhost podman[297351]: 2025-11-28 09:53:47.063461902 +0000 UTC m=+0.151628678 container died 9a731305e1dff5ee0b9003a006d9a9e4ac90ce2fa8c40566b46cc7fac4ce4b53 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_shirley, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, release=553, description=Red Hat Ceph Storage 7, architecture=x86_64, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, io.buildah.version=1.33.12, version=7, GIT_CLEAN=True, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc.) Nov 28 04:53:47 localhost systemd[1]: var-lib-containers-storage-overlay-ec8e9de951f0194103ce90d35f4b8d9b04aab629902595c7e7cb42fab324f707-merged.mount: Deactivated successfully. Nov 28 04:53:47 localhost systemd[1]: var-lib-containers-storage-overlay-4fa418497f1a463fa17622249fe7f5c576572657dcbbf7a30f8bb09bc5099ce8-merged.mount: Deactivated successfully. 
Nov 28 04:53:47 localhost podman[297371]: 2025-11-28 09:53:47.163057198 +0000 UTC m=+0.086949797 container remove 9a731305e1dff5ee0b9003a006d9a9e4ac90ce2fa8c40566b46cc7fac4ce4b53 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_shirley, name=rhceph, distribution-scope=public, maintainer=Guillaume Abrioux , RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, version=7, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, release=553, io.openshift.expose-services=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, ceph=True, vendor=Red Hat, Inc., io.buildah.version=1.33.12) Nov 28 04:53:47 localhost systemd[1]: libpod-conmon-9a731305e1dff5ee0b9003a006d9a9e4ac90ce2fa8c40566b46cc7fac4ce4b53.scope: Deactivated successfully. Nov 28 04:53:47 localhost ceph-mon[292954]: Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)... 
Nov 28 04:53:47 localhost ceph-mon[292954]: Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain Nov 28 04:53:47 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:47 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:47 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 28 04:53:47 localhost podman[297444]: Nov 28 04:53:47 localhost podman[297444]: 2025-11-28 09:53:47.874694433 +0000 UTC m=+0.071397809 container create c6045f97ec9240bded586f509be21bc2a7d5a70d29537b6d7b5069e71c5fa813 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_euclid, distribution-scope=public, io.openshift.tags=rhceph ceph, name=rhceph, io.buildah.version=1.33.12, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, ceph=True, com.redhat.component=rhceph-container, release=553, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, RELEASE=main, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55) Nov 28 04:53:47 localhost systemd[1]: Started libpod-conmon-c6045f97ec9240bded586f509be21bc2a7d5a70d29537b6d7b5069e71c5fa813.scope. Nov 28 04:53:47 localhost systemd[1]: Started libcrun container. 
Nov 28 04:53:47 localhost podman[297444]: 2025-11-28 09:53:47.94249297 +0000 UTC m=+0.139196306 container init c6045f97ec9240bded586f509be21bc2a7d5a70d29537b6d7b5069e71c5fa813 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_euclid, GIT_BRANCH=main, vcs-type=git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, distribution-scope=public, GIT_CLEAN=True, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , name=rhceph, io.buildah.version=1.33.12, io.openshift.expose-services=, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64) Nov 28 04:53:47 localhost podman[297444]: 2025-11-28 09:53:47.848867588 +0000 UTC m=+0.045570924 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:53:47 localhost podman[297444]: 2025-11-28 09:53:47.951961011 +0000 UTC m=+0.148664337 container start c6045f97ec9240bded586f509be21bc2a7d5a70d29537b6d7b5069e71c5fa813 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_euclid, name=rhceph, GIT_BRANCH=main, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 
7, build-date=2025-09-24T08:57:55, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., RELEASE=main, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, version=7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph) Nov 28 04:53:47 localhost podman[297444]: 2025-11-28 09:53:47.952315282 +0000 UTC m=+0.149018648 container attach c6045f97ec9240bded586f509be21bc2a7d5a70d29537b6d7b5069e71c5fa813 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_euclid, distribution-scope=public, io.openshift.expose-services=, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, GIT_CLEAN=True, version=7, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., GIT_BRANCH=main, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, com.redhat.component=rhceph-container, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 28 04:53:47 localhost 
gifted_euclid[297459]: 167 167 Nov 28 04:53:47 localhost systemd[1]: libpod-c6045f97ec9240bded586f509be21bc2a7d5a70d29537b6d7b5069e71c5fa813.scope: Deactivated successfully. Nov 28 04:53:47 localhost podman[297444]: 2025-11-28 09:53:47.954212311 +0000 UTC m=+0.150915707 container died c6045f97ec9240bded586f509be21bc2a7d5a70d29537b6d7b5069e71c5fa813 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_euclid, vendor=Red Hat, Inc., io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , release=553, distribution-scope=public, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, ceph=True, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64) Nov 28 04:53:48 localhost podman[297464]: 2025-11-28 09:53:48.054194838 +0000 UTC m=+0.087212765 container remove c6045f97ec9240bded586f509be21bc2a7d5a70d29537b6d7b5069e71c5fa813 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_euclid, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, release=553, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, 
ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, distribution-scope=public, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.33.12) Nov 28 04:53:48 localhost systemd[1]: libpod-conmon-c6045f97ec9240bded586f509be21bc2a7d5a70d29537b6d7b5069e71c5fa813.scope: Deactivated successfully. Nov 28 04:53:48 localhost openstack_network_exporter[240658]: ERROR 09:53:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:53:48 localhost openstack_network_exporter[240658]: ERROR 09:53:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:53:48 localhost openstack_network_exporter[240658]: ERROR 09:53:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 04:53:48 localhost openstack_network_exporter[240658]: ERROR 09:53:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 04:53:48 localhost openstack_network_exporter[240658]: Nov 28 04:53:48 localhost openstack_network_exporter[240658]: ERROR 09:53:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 04:53:48 localhost openstack_network_exporter[240658]: Nov 28 04:53:48 localhost systemd[1]: var-lib-containers-storage-overlay-5b6cb8489e37fd70ce4a79a2d82b39e56b5bd34a7ba4abca4f9811f53eeb6b5b-merged.mount: 
Deactivated successfully. Nov 28 04:53:48 localhost nova_compute[279673]: 2025-11-28 09:53:48.285 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:53:48 localhost ceph-mon[292954]: Reconfiguring mon.np0005538513 (monmap changed)... Nov 28 04:53:48 localhost ceph-mon[292954]: Reconfiguring daemon mon.np0005538513 on np0005538513.localdomain Nov 28 04:53:48 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:48 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:48 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:53:48 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:53:49 localhost ceph-mon[292954]: mon.np0005538513@3(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:53:49 localhost ceph-mon[292954]: Reconfiguring crash.np0005538514 (monmap changed)... 
Nov 28 04:53:49 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain Nov 28 04:53:49 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:49 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:49 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Nov 28 04:53:50 localhost nova_compute[279673]: 2025-11-28 09:53:50.084 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:53:50 localhost ceph-mon[292954]: Reconfiguring osd.0 (monmap changed)... Nov 28 04:53:50 localhost ceph-mon[292954]: Reconfiguring daemon osd.0 on np0005538514.localdomain Nov 28 04:53:50 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:50 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:50 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Nov 28 04:53:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 04:53:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. 
Nov 28 04:53:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:53:50.832 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:53:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:53:50.833 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:53:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:53:50.834 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:53:50 localhost podman[297482]: 2025-11-28 09:53:50.86928395 +0000 UTC m=+0.095552802 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, config_id=multipathd, io.buildah.version=1.41.3) Nov 28 04:53:50 localhost podman[297482]: 2025-11-28 09:53:50.916285346 +0000 UTC m=+0.142554248 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, config_id=multipathd, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 04:53:50 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. Nov 28 04:53:50 localhost podman[297481]: 2025-11-28 09:53:50.926114139 +0000 UTC m=+0.154614150 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', 
'--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 04:53:51 localhost podman[297481]: 2025-11-28 09:53:51.010550928 +0000 UTC m=+0.239050979 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 04:53:51 localhost systemd[1]: 
49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. Nov 28 04:53:51 localhost ceph-mon[292954]: Reconfiguring osd.3 (monmap changed)... Nov 28 04:53:51 localhost ceph-mon[292954]: Reconfiguring daemon osd.3 on np0005538514.localdomain Nov 28 04:53:51 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:51 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:51 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:53:51 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:53:52 localhost ceph-mon[292954]: Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)... Nov 28 04:53:52 localhost ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain Nov 28 04:53:52 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:52 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:52 localhost ceph-mon[292954]: Reconfiguring mgr.np0005538514.djozup (monmap changed)... 
Nov 28 04:53:52 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:53:52 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:53:52 localhost ceph-mon[292954]: Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain Nov 28 04:53:53 localhost nova_compute[279673]: 2025-11-28 09:53:53.313 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:53:53 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:53 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:53 localhost ceph-mon[292954]: Reconfiguring mon.np0005538514 (monmap changed)... 
Nov 28 04:53:53 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 28 04:53:53 localhost ceph-mon[292954]: Reconfiguring daemon mon.np0005538514 on np0005538514.localdomain Nov 28 04:53:53 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:53 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:53 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:53:53 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:53:54 localhost ceph-mon[292954]: mon.np0005538513@3(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:53:55 localhost nova_compute[279673]: 2025-11-28 09:53:55.114 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:53:55 localhost ceph-mon[292954]: Reconfiguring crash.np0005538515 (monmap changed)... 
Nov 28 04:53:55 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain Nov 28 04:53:55 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:55 localhost ceph-mon[292954]: Added label _no_schedule to host np0005538511.localdomain Nov 28 04:53:55 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:55 localhost ceph-mon[292954]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005538511.localdomain Nov 28 04:53:55 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:55 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:55 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Nov 28 04:53:56 localhost ceph-mon[292954]: Reconfiguring osd.1 (monmap changed)... Nov 28 04:53:56 localhost ceph-mon[292954]: Reconfiguring daemon osd.1 on np0005538515.localdomain Nov 28 04:53:56 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:56 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:56 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Nov 28 04:53:57 localhost ceph-mon[292954]: Reconfiguring osd.4 (monmap changed)... 
Nov 28 04:53:57 localhost ceph-mon[292954]: Reconfiguring daemon osd.4 on np0005538515.localdomain Nov 28 04:53:57 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:57 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:57 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:53:57 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:53:57 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:57 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538511.localdomain"} : dispatch Nov 28 04:53:57 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538511.localdomain"} : dispatch Nov 28 04:53:57 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538511.localdomain"}]': finished Nov 28 04:53:58 localhost nova_compute[279673]: 2025-11-28 09:53:58.314 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:53:58 localhost ceph-mon[292954]: Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)... 
Nov 28 04:53:58 localhost ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain Nov 28 04:53:58 localhost ceph-mon[292954]: Removed host np0005538511.localdomain Nov 28 04:53:58 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:58 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:58 localhost ceph-mon[292954]: Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)... Nov 28 04:53:58 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:53:58 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:53:58 localhost ceph-mon[292954]: Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain Nov 28 04:53:59 localhost ceph-mon[292954]: mon.np0005538513@3(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:53:59 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:59 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:59 localhost ceph-mon[292954]: Reconfiguring mon.np0005538515 (monmap changed)... 
Nov 28 04:53:59 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 28 04:53:59 localhost ceph-mon[292954]: Reconfiguring daemon mon.np0005538515 on np0005538515.localdomain Nov 28 04:53:59 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:59 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:53:59 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 04:53:59 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:00 localhost nova_compute[279673]: 2025-11-28 09:54:00.118 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.672 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.673 12 INFO ceilometer.polling.manager [-] Polling pollster 
network.incoming.bytes in the context of pollsters Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.678 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6def1e59-3a34-44ec-aac5-1856d93a8c0d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:54:00.673833', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 
'2a385e7c-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.845732187, 'message_signature': '5fa516a8c81d06b4c7f003ec72fea157e0f15a0995c73b6d76e812effc189ccd'}]}, 'timestamp': '2025-11-28 09:54:00.679900', '_unique_id': 'a81431ef2933408e9a2248a14a2a0bed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:54:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] 
Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in 
_reraise_as_library_errors Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.681 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.683 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.683 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.700 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 51.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c6da40f3-0776-404b-aee9-65f158de2b27', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.671875, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:54:00.683359', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '2a3b8db8-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.871748518, 'message_signature': 'f9426b9bc63574f8a02a2c01f48e9a4c2f27e3c7e42314ad2dbf2014186b333d'}]}, 'timestamp': '2025-11-28 09:54:00.700750', '_unique_id': '04a4d8c030d648868a261cffc210ad71'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 
04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:54:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.702 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.703 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 28 04:54:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.713 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.714 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b33d980-2096-413e-8fff-0d6d09b806b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:54:00.703503', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 
'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2a3d9e5a-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.87540269, 'message_signature': '24bada68cc646eea5c0dab2f7ea611608ec4ffe47a45eca90712c4f2a0439b02'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:54:00.703503', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2a3db048-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.87540269, 'message_signature': '1f7312a36a9209a7a0a55b4e3454a3e3db76593f291076da4d37d43345b48fab'}]}, 'timestamp': '2025-11-28 09:54:00.714686', '_unique_id': 'e8ad6f44d99e4316ad06aeafd42bcd2e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:54:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.715 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.716 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the 
context of pollsters Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.716 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 13370000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d56f9cb-f81e-4d96-baeb-8a20fe1e09e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13370000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:54:00.716861', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '2a3e1812-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.871748518, 'message_signature': 'ea397230ae68ef1201b92a4e4f5570c2262fd52c698c1b7946afcde1b092e5dc'}]}, 'timestamp': '2025-11-28 09:54:00.717327', 
'_unique_id': '76ee69ea9afa4ebeb51d6c56b2b7e87b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 
04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging with 
self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR 
oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:54:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.718 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.719 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.719 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7fbf9e25-8e16-4c6e-80a0-da2125801ac9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:54:00.719405', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '2a3e7a64-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.845732187, 'message_signature': '77a88447ec7a172a2e7e935d908b966270600c0e87e485f3de6ea62f65597e4f'}]}, 'timestamp': '2025-11-28 09:54:00.719858', '_unique_id': '9ef658763f3746119aad286a2c137f52'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) 
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.720 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.721 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.749 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1439344424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.750 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 63315481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd2b1be69-9afb-4cf5-94cd-b4b98d9431c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1439344424, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:54:00.721903', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2a431f9c-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.893794707, 'message_signature': 'c85ad98e9758bf3cd9c2a87d079c7048cb2be27b2925a6364243f86bf80e4537'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63315481, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:54:00.721903', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2a43305e-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.893794707, 'message_signature': '42965516291f61da68e78a22c1057456f616e40437d42de1f16e0bd1a5c59591'}]}, 'timestamp': '2025-11-28 09:54:00.750732', '_unique_id': 'cc35234855d74c3e877c847a7ec0da0b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:54:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:54:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.751 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.752 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.752 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.753 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:54:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd08350dc-b26f-4454-bf61-7ed1241e92ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:54:00.753092', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '2a439e72-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.845732187, 'message_signature': '4d693f28718bb6a9f916e8b3f20c213816549db7fb7ff6ab92fe8aa4453db120'}]}, 'timestamp': '2025-11-28 09:54:00.753551', '_unique_id': '0c7778efa88a4558964c4e2ddd5d0271'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:54:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:54:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 
04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.754 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.755 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.755 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.756 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0b5d3d5a-6477-45c9-81a4-8bf5b589b5ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:54:00.755616', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2a440092-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.893794707, 'message_signature': 'c01503078a0d33d96273cd993101aeb71ca63e90f5a37ef8799852316cd2314a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:54:00.755616', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2a44119a-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.893794707, 'message_signature': '6d487218e602f2e31a066247aa3ee7eccaa270cd1115c965134ca1d6d3ce94ef'}]}, 'timestamp': '2025-11-28 09:54:00.756465', '_unique_id': '04798778d83e4d4c84f9525a7661225d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:54:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:54:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.757 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.758 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.758 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '539562b4-0c9f-432f-b2bf-09a1690f846b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:54:00.758555', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '2a447482-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.845732187, 'message_signature': 'b99883d6e1c45dfef80e25552b700e108e02773cb360fc48dad672d59f22fa08'}]}, 'timestamp': '2025-11-28 09:54:00.759054', '_unique_id': 'd14ccbd0bc694ad6b72413db6d20a4f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:54:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:54:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.759 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.760 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.761 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.761 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c8f648ed-6fe3-41bc-a57a-0274d6887094', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:54:00.761078', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2a44d602-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.893794707, 'message_signature': '7d77ef53e4129a9fded07c0f485f86e516f6248908b686e3c0dd6fad720dfea7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:54:00.761078', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2a44e5b6-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.893794707, 'message_signature': 'b7090e0f5f5450f6c21fcbdf75461b1e9e18b6f90a5d4c8d585451b2f8be595f'}]}, 'timestamp': '2025-11-28 09:54:00.761894', '_unique_id': '7ed528d6103847fa9285714ad044ef35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:54:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:54:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.762 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.763 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.764 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.764 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '76f621c2-7c13-4bce-a7b4-a0c8995e074a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:54:00.764141', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '2a454e02-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.845732187, 'message_signature': '940d261ed1d8350d02636e9564151dc2ffb5d6477eaa01d4ca8a668cdee9e51f'}]}, 'timestamp': '2025-11-28 09:54:00.764597', '_unique_id': '6bf611ae31f34232b9433dc0d08c4c46'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:54:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:54:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.765 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.766 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.766 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 1124743927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.767 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 22218411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c7f1737a-dd16-450f-b207-567a25aca1cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1124743927, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:54:00.766600', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2a45ae92-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.893794707, 'message_signature': '0e659e44b1cc470b06bf58d0f5e3425c75ddc97e6146804d7a6714946aba9d23'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22218411, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:54:00.766600', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2a45bfd6-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.893794707, 'message_signature': 'd9e8c21766815e45a6ef9ac287d1a7e6e00003fd3c0de0614a4fa6ac236473de'}]}, 'timestamp': '2025-11-28 09:54:00.767480', '_unique_id': 'ef156c5bad714e5d9f759af4c06bbf07'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:54:00.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.768 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.769 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.769 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44263ae9-fcb6-4e46-b0e7-b1bdd72265c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:54:00.769530', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '2a46202a-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.845732187, 'message_signature': 'aa4ed5f673b90b7bbb1c746fdef619af0c11d12f0ced422590734e1743f804f9'}]}, 'timestamp': '2025-11-28 09:54:00.769976', '_unique_id': '635f90760c924b3a86f65384f1ccb5b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.770 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.771 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.772 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d4a3b29-24d0-47e6-adb6-82dd20ec9dac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:54:00.772048', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '2a4682ea-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.845732187, 'message_signature': 'a8cdf5a827b8a827eea00c9c761b8d9472faadad0b04ec58426f9b61896af75e'}]}, 'timestamp': '2025-11-28 09:54:00.772504', '_unique_id': '7c38f6b18a0e4af1907ad537d9373d84'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.773 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.774 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.774 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.774 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '630715b6-4e45-434c-b674-224868582e47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:54:00.774650', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '2a46e992-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.845732187, 'message_signature': 'd7c4ad51bf9f96291c6313bdfc033fbcbbf1192a1c3dc43843f72a22d1373c98'}]}, 'timestamp': '2025-11-28 09:54:00.775159', '_unique_id': '2bd130fd9b6e4e10a0b4ba2e651c8514'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.776 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.777 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.777 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.777 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '983b3d14-770a-454e-8832-adca62eab4ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:54:00.777299', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2a474f9a-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.893794707, 'message_signature': '896b82cfd4fff681169b400d37c5c4c212e97897f7a2bfd270ea0fff5d74fded'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:54:00.777299', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2a475f62-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.893794707, 'message_signature': '7c68b4699c7024591a098138d70f02bfadaffd8166b91bc0f268ac4e199ae012'}]}, 'timestamp': '2025-11-28 09:54:00.778148', '_unique_id': 'ca7912c191d042b6b539e548a16e6ee6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:54:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:54:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.779 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.780 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.780 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.780 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:54:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f0743cb-3ff5-4c97-9efe-0423b60c33ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:54:00.780212', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2a47c146-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.893794707, 'message_signature': '70c15efe6ae0d8fb7d5aee7465278f2d3f9d7a35ead7730bc311aa0b4d921f8b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 
'timestamp': '2025-11-28T09:54:00.780212', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2a47d0d2-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.893794707, 'message_signature': 'bca0cb3be00b108a4a37b92d84090e0c614f3dc4cf4c4f2b72c594c975f6720f'}]}, 'timestamp': '2025-11-28 09:54:00.781054', '_unique_id': 'e24ca6d9dc7c40f58f2125d997fed7df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging return 
retry_over_time(
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.781 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.782 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.783 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd11d099a-586d-4d5b-a368-f615fc6856d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:54:00.783115', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '2a483306-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.845732187, 'message_signature': '86aecc758aa28ad709561a3b1fcf38d993ad88f5dc677fe8e9544f33a9d3bcf1'}]}, 'timestamp': '2025-11-28 09:54:00.783562', '_unique_id': 'dfce17da8df94e67b0462253256cd888'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.784 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.785 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.786 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5870e7ec-11c2-4da9-aafc-3d5c474275d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:54:00.786137', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '2a48a8fe-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.845732187, 'message_signature': '32d152f812d55a75a85642ae4f249d40ecc99cbbb118a71278001238c6159dbe'}]}, 'timestamp': '2025-11-28 09:54:00.786586', '_unique_id': 'e4198e37130c4afb8d38daba16160ef6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.787 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.788 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.788 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.789 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c68ae9f0-f2ca-42b4-8b2c-8182abdd8aab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:54:00.788617', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2a49097a-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.87540269, 'message_signature': '6b0d412fd1953bedc1ccf21a4b6976db5856f5d651ce68d5669c08a583896ac3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:54:00.788617', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2a491d52-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.87540269, 'message_signature': '410c8314c289eccb7c98b836f66fb3aef50d9e659f435d00cde5505bf59f17e1'}]}, 'timestamp': '2025-11-28 09:54:00.789540', '_unique_id': 'c7c7687057d84e1ea7e1ca8023634e0d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:54:00 localhost
ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:54:00.790 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:54:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:54:00.790 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.790 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.791 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.791 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.791 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:54:00.792 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b3ac194-cad1-4273-bcb2-82edde00aa56', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:54:00.791643', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2a497d74-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.87540269, 'message_signature': '858e46028ce69efaa65a44cea5406d1824b2917678deedbb7c28fd102f181c46'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:54:00.791643', 'resource_metadata': 
{'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2a4987f6-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11474.87540269, 'message_signature': '6c298a1438cbd043551ea0b84cf8a9a5e2ed7e46915162c0f2313ea659967fa4'}]}, 'timestamp': '2025-11-28 09:54:00.792188', '_unique_id': '9893a01b73264b1a8a359c82959a9870'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:54:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 
134, in _send_notification Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:54:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:54:00.792 12 ERROR oslo_messaging.notify.messaging Nov 28 04:54:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. 
Nov 28 04:54:02 localhost podman[297541]: 2025-11-28 09:54:02.848342299 +0000 UTC m=+0.086601396 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) 
Nov 28 04:54:02 localhost podman[297541]: 2025-11-28 09:54:02.864569869 +0000 UTC m=+0.102828966 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., config_id=edpm, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.expose-services=, name=ubi9-minimal) Nov 28 04:54:02 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. Nov 28 04:54:03 localhost nova_compute[279673]: 2025-11-28 09:54:03.343 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:54:04 localhost ceph-mon[292954]: mon.np0005538513@3(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:54:05 localhost nova_compute[279673]: 2025-11-28 09:54:05.120 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:54:05 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. 
Nov 28 04:54:06 localhost podman[297579]: 2025-11-28 09:54:06.841627875 +0000 UTC m=+0.078951801 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 28 04:54:06 localhost podman[297579]: 2025-11-28 09:54:06.880590765 +0000 UTC m=+0.117914691 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 04:54:06 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 04:54:07 localhost ceph-mon[292954]: Saving service mon spec with placement label:mon Nov 28 04:54:07 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:07 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 04:54:07 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:08 localhost nova_compute[279673]: 2025-11-28 09:54:08.347 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:54:09 localhost ceph-mgr[286105]: ms_deliver_dispatch: unhandled message 0x560b917b91e0 mon_map magic: 0 from mon.0 v2:172.18.0.105:3300/0 Nov 28 04:54:09 localhost ceph-mon[292954]: mon.np0005538513@3(peon) e11 my rank is now 2 (was 3) Nov 28 04:54:09 localhost ceph-mon[292954]: log_channel(cluster) log [INF] : mon.np0005538513 calling monitor election Nov 28 04:54:09 localhost ceph-mon[292954]: paxos.2).electionLogic(40) init, last seen epoch 40 Nov 28 04:54:09 localhost ceph-mon[292954]: mon.np0005538513@2(electing) e11 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 28 04:54:09 localhost ceph-mon[292954]: mon.np0005538513@2(electing) e11 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 28 04:54:09 localhost ceph-mon[292954]: mon.np0005538513@2(peon) e11 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 28 04:54:09 localhost ceph-mon[292954]: mon.np0005538513@2(peon).osd e88 
_set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:54:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:54:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:54:10 localhost podman[297815]: 2025-11-28 09:54:10.036990893 +0000 UTC m=+0.081769528 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent) Nov 28 04:54:10 localhost ceph-mon[292954]: Remove daemons mon.np0005538514 Nov 28 04:54:10 localhost ceph-mon[292954]: Safe to remove mon.np0005538514: new quorum should be ['np0005538512', 'np0005538515', 'np0005538513'] (from ['np0005538512', 'np0005538515', 'np0005538513']) Nov 28 04:54:10 localhost ceph-mon[292954]: Removing monitor np0005538514 from monmap... Nov 28 04:54:10 localhost ceph-mon[292954]: Removing daemon mon.np0005538514 from np0005538514.localdomain -- ports [] Nov 28 04:54:10 localhost ceph-mon[292954]: mon.np0005538513 calling monitor election Nov 28 04:54:10 localhost ceph-mon[292954]: mon.np0005538512 calling monitor election Nov 28 04:54:10 localhost ceph-mon[292954]: mon.np0005538515 calling monitor election Nov 28 04:54:10 localhost ceph-mon[292954]: mon.np0005538512 is new leader, mons np0005538512,np0005538515,np0005538513 in quorum (ranks 0,1,2) Nov 28 04:54:10 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 04:54:10 localhost ceph-mon[292954]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf Nov 28 04:54:10 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf Nov 28 04:54:10 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf Nov 28 04:54:10 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf Nov 28 04:54:10 localhost ceph-mon[292954]: overall HEALTH_OK Nov 28 04:54:10 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:10 localhost podman[297815]: 2025-11-28 
09:54:10.069274516 +0000 UTC m=+0.114053191 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent) Nov 28 04:54:10 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 04:54:10 localhost podman[238687]: time="2025-11-28T09:54:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 04:54:10 localhost podman[238687]: @ - - [28/Nov/2025:09:54:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 28 04:54:10 localhost nova_compute[279673]: 2025-11-28 09:54:10.123 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:54:10 localhost systemd[1]: tmp-crun.1HZGQi.mount: Deactivated successfully. Nov 28 04:54:10 localhost podman[297847]: 2025-11-28 09:54:10.190299672 +0000 UTC m=+0.146666516 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Nov 28 04:54:10 localhost podman[238687]: @ - - [28/Nov/2025:09:54:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18727 "" "Go-http-client/1.1" Nov 28 04:54:10 localhost podman[297847]: 2025-11-28 09:54:10.25260906 +0000 UTC m=+0.208975924 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:54:10 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 04:54:11 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:54:11 localhost ceph-mon[292954]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:54:11 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:54:11 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:54:11 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:11 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:11 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:11 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:11 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:11 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:11 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:11 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:11 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:11 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:54:11 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:54:12 localhost 
ceph-mon[292954]: Reconfiguring mgr.np0005538512.zyhkxs (monmap changed)... Nov 28 04:54:12 localhost ceph-mon[292954]: Reconfiguring daemon mgr.np0005538512.zyhkxs on np0005538512.localdomain Nov 28 04:54:12 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:12 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:12 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:54:12 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:54:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. 
Nov 28 04:54:12 localhost podman[297982]: 2025-11-28 09:54:12.85325537 +0000 UTC m=+0.087166194 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=edpm) Nov 28 04:54:12 localhost podman[297982]: 2025-11-28 09:54:12.86756351 +0000 UTC m=+0.101474294 container exec_died 
1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2) Nov 28 04:54:12 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 04:54:13 localhost nova_compute[279673]: 2025-11-28 09:54:13.382 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:54:13 localhost ceph-mon[292954]: Reconfiguring crash.np0005538512 (monmap changed)... Nov 28 04:54:13 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain Nov 28 04:54:13 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:13 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:13 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:54:13 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:54:13 localhost podman[298053]: Nov 28 04:54:13 localhost podman[298053]: 2025-11-28 09:54:13.767164501 +0000 UTC m=+0.079167078 container create e98336c95c269e3ca85a3eb3b2453778f1911229d0af5ab851c9dd48e21d7a73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_williams, build-date=2025-09-24T08:57:55, architecture=x86_64, distribution-scope=public, RELEASE=main, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vendor=Red Hat, Inc., name=rhceph, GIT_BRANCH=main, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7) Nov 28 04:54:13 localhost systemd[1]: Started libpod-conmon-e98336c95c269e3ca85a3eb3b2453778f1911229d0af5ab851c9dd48e21d7a73.scope. Nov 28 04:54:13 localhost systemd[1]: Started libcrun container. Nov 28 04:54:13 localhost podman[298053]: 2025-11-28 09:54:13.733775523 +0000 UTC m=+0.045778130 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:54:13 localhost podman[298053]: 2025-11-28 09:54:13.836512286 +0000 UTC m=+0.148514863 container init e98336c95c269e3ca85a3eb3b2453778f1911229d0af5ab851c9dd48e21d7a73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_williams, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.openshift.tags=rhceph ceph, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, RELEASE=main, vendor=Red Hat, Inc., io.openshift.expose-services=, CEPH_POINT_RELEASE=, architecture=x86_64, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_CLEAN=True) Nov 28 04:54:13 localhost podman[298053]: 2025-11-28 09:54:13.846429931 +0000 UTC m=+0.158432518 container start e98336c95c269e3ca85a3eb3b2453778f1911229d0af5ab851c9dd48e21d7a73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_williams, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, ceph=True, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, io.openshift.tags=rhceph ceph, distribution-scope=public, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph) Nov 28 04:54:13 localhost podman[298053]: 2025-11-28 09:54:13.846745551 +0000 UTC m=+0.158748128 container attach e98336c95c269e3ca85a3eb3b2453778f1911229d0af5ab851c9dd48e21d7a73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_williams, maintainer=Guillaume Abrioux , io.openshift.expose-services=, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, distribution-scope=public, io.k8s.display-name=Red Hat 
Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, version=7, architecture=x86_64, name=rhceph) Nov 28 04:54:13 localhost competent_williams[298068]: 167 167 Nov 28 04:54:13 localhost systemd[1]: libpod-e98336c95c269e3ca85a3eb3b2453778f1911229d0af5ab851c9dd48e21d7a73.scope: Deactivated successfully. Nov 28 04:54:13 localhost podman[298053]: 2025-11-28 09:54:13.853205119 +0000 UTC m=+0.165207706 container died e98336c95c269e3ca85a3eb3b2453778f1911229d0af5ab851c9dd48e21d7a73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_williams, io.k8s.description=Red Hat Ceph Storage 7, version=7, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, description=Red Hat Ceph Storage 7, RELEASE=main, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, build-date=2025-09-24T08:57:55, 
io.openshift.tags=rhceph ceph, ceph=True, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12) Nov 28 04:54:13 localhost systemd[1]: var-lib-containers-storage-overlay-237c3de389c3fb5b1f0547dcdf14e999c29be0c7e8234a28c5b45ddf7b885af5-merged.mount: Deactivated successfully. Nov 28 04:54:13 localhost podman[298074]: 2025-11-28 09:54:13.951781244 +0000 UTC m=+0.088419343 container remove e98336c95c269e3ca85a3eb3b2453778f1911229d0af5ab851c9dd48e21d7a73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_williams, distribution-scope=public, name=rhceph, version=7, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_BRANCH=main, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, release=553, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc.) Nov 28 04:54:13 localhost systemd[1]: libpod-conmon-e98336c95c269e3ca85a3eb3b2453778f1911229d0af5ab851c9dd48e21d7a73.scope: Deactivated successfully. Nov 28 04:54:14 localhost ceph-mon[292954]: Reconfiguring crash.np0005538513 (monmap changed)... 
Nov 28 04:54:14 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain Nov 28 04:54:14 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:14 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:14 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Nov 28 04:54:14 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:14 localhost ceph-mon[292954]: mon.np0005538513@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0. Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.564720) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22 Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323654564771, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1326, "num_deletes": 252, "total_data_size": 2258697, "memory_usage": 2295448, "flush_reason": "Manual Compaction"} Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323654575769, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 1301650, 
"file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13559, "largest_seqno": 14880, "table_properties": {"data_size": 1295703, "index_size": 3097, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 15968, "raw_average_key_size": 22, "raw_value_size": 1282660, "raw_average_value_size": 1798, "num_data_blocks": 132, "num_entries": 713, "num_filter_entries": 713, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323624, "oldest_key_time": 1764323624, "file_creation_time": 1764323654, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}} Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 11094 microseconds, and 4712 cpu microseconds. Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.575813) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 1301650 bytes OK Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.575837) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.577977) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.577998) EVENT_LOG_v1 {"time_micros": 1764323654577992, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.578039) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 2251827, prev total WAL file size 2251827, number of live WAL files 2. Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.578719) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130353432' seq:72057594037927935, type:22 .. 
'7061786F73003130373934' seq:0, type:0; will stop at (end) Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(1271KB)], [21(16MB)] Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323654578787, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 19006367, "oldest_snapshot_seqno": -1} Nov 28 04:54:14 localhost podman[298144]: Nov 28 04:54:14 localhost podman[298144]: 2025-11-28 09:54:14.702831051 +0000 UTC m=+0.103210707 container create 5332132bd1ed18ad7f6e2689e33a8421760cd5e7f00f47306f87083f5dc1be4c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_antonelli, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.tags=rhceph ceph, ceph=True, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, release=553, name=rhceph, GIT_CLEAN=True, GIT_BRANCH=main, 
io.buildah.version=1.33.12) Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 10398 keys, 15806830 bytes, temperature: kUnknown Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323654703561, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 15806830, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15747383, "index_size": 32338, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26053, "raw_key_size": 280230, "raw_average_key_size": 26, "raw_value_size": 15569547, "raw_average_value_size": 1497, "num_data_blocks": 1224, "num_entries": 10398, "num_filter_entries": 10398, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764323654, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}} Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.703906) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 15806830 bytes Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.705872) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 152.2 rd, 126.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 16.9 +0.0 blob) out(15.1 +0.0 blob), read-write-amplify(26.7) write-amplify(12.1) OK, records in: 10934, records dropped: 536 output_compression: NoCompression Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.705916) EVENT_LOG_v1 {"time_micros": 1764323654705902, "job": 10, "event": "compaction_finished", "compaction_time_micros": 124888, "compaction_time_cpu_micros": 41269, "output_level": 6, "num_output_files": 1, "total_output_size": 15806830, "num_input_records": 10934, "num_output_records": 10398, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323654706291, "job": 10, "event": "table_file_deletion", "file_number": 23} Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323654708808, 
"job": 10, "event": "table_file_deletion", "file_number": 21} Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.578658) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.708977) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.708988) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.708992) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.708996) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:54:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:54:14.708999) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:54:14 localhost systemd[1]: Started libpod-conmon-5332132bd1ed18ad7f6e2689e33a8421760cd5e7f00f47306f87083f5dc1be4c.scope. Nov 28 04:54:14 localhost podman[298144]: 2025-11-28 09:54:14.646826348 +0000 UTC m=+0.047206034 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:54:14 localhost systemd[1]: Started libcrun container. 
Nov 28 04:54:14 localhost podman[298144]: 2025-11-28 09:54:14.762755337 +0000 UTC m=+0.163134993 container init 5332132bd1ed18ad7f6e2689e33a8421760cd5e7f00f47306f87083f5dc1be4c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_antonelli, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, maintainer=Guillaume Abrioux , GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-09-24T08:57:55, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 28 04:54:14 localhost podman[298144]: 2025-11-28 09:54:14.773277891 +0000 UTC m=+0.173657537 container start 5332132bd1ed18ad7f6e2689e33a8421760cd5e7f00f47306f87083f5dc1be4c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_antonelli, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, ceph=True, RELEASE=main, build-date=2025-09-24T08:57:55, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, 
CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , distribution-scope=public, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, version=7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7) Nov 28 04:54:14 localhost podman[298144]: 2025-11-28 09:54:14.773538309 +0000 UTC m=+0.173917995 container attach 5332132bd1ed18ad7f6e2689e33a8421760cd5e7f00f47306f87083f5dc1be4c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_antonelli, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, name=rhceph, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, RELEASE=main, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vendor=Red Hat, Inc., architecture=x86_64, GIT_BRANCH=main, ceph=True, io.openshift.expose-services=) Nov 28 04:54:14 localhost strange_antonelli[298159]: 167 167 Nov 28 04:54:14 localhost systemd[1]: libpod-5332132bd1ed18ad7f6e2689e33a8421760cd5e7f00f47306f87083f5dc1be4c.scope: 
Deactivated successfully. Nov 28 04:54:14 localhost podman[298144]: 2025-11-28 09:54:14.776255842 +0000 UTC m=+0.176635548 container died 5332132bd1ed18ad7f6e2689e33a8421760cd5e7f00f47306f87083f5dc1be4c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_antonelli, name=rhceph, com.redhat.component=rhceph-container, version=7, GIT_BRANCH=main, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, release=553, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux ) Nov 28 04:54:14 localhost systemd[1]: var-lib-containers-storage-overlay-c4acb01f3f7a20eabb74f1ed2296f5b8bb3690823138adaa5f56b9ba4d6a34d0-merged.mount: Deactivated successfully. 
Nov 28 04:54:14 localhost podman[298164]: 2025-11-28 09:54:14.875139276 +0000 UTC m=+0.089461875 container remove 5332132bd1ed18ad7f6e2689e33a8421760cd5e7f00f47306f87083f5dc1be4c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_antonelli, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, ceph=True, com.redhat.component=rhceph-container, RELEASE=main, distribution-scope=public, version=7, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, io.openshift.expose-services=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 28 04:54:14 localhost systemd[1]: libpod-conmon-5332132bd1ed18ad7f6e2689e33a8421760cd5e7f00f47306f87083f5dc1be4c.scope: Deactivated successfully. Nov 28 04:54:15 localhost nova_compute[279673]: 2025-11-28 09:54:15.157 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:54:15 localhost ceph-mon[292954]: Reconfiguring osd.2 (monmap changed)... 
Nov 28 04:54:15 localhost ceph-mon[292954]: Reconfiguring daemon osd.2 on np0005538513.localdomain Nov 28 04:54:15 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:15 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:15 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Nov 28 04:54:15 localhost podman[298241]: Nov 28 04:54:15 localhost podman[298241]: 2025-11-28 09:54:15.724628574 +0000 UTC m=+0.077570069 container create 60809c4684c4edfacaec147b27e19bf26f866af98ded58f05f104f591cdf3713 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_liskov, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_BRANCH=main, release=553, RELEASE=main, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.buildah.version=1.33.12, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=) Nov 28 04:54:15 localhost systemd[1]: Started libpod-conmon-60809c4684c4edfacaec147b27e19bf26f866af98ded58f05f104f591cdf3713.scope. Nov 28 04:54:15 localhost systemd[1]: Started libcrun container. 
Nov 28 04:54:15 localhost podman[298241]: 2025-11-28 09:54:15.692384792 +0000 UTC m=+0.045326317 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:54:15 localhost podman[298241]: 2025-11-28 09:54:15.793419272 +0000 UTC m=+0.146360767 container init 60809c4684c4edfacaec147b27e19bf26f866af98ded58f05f104f591cdf3713 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_liskov, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, version=7, RELEASE=main, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, ceph=True, vendor=Red Hat, Inc.) 
Nov 28 04:54:15 localhost podman[298241]: 2025-11-28 09:54:15.802939295 +0000 UTC m=+0.155880790 container start 60809c4684c4edfacaec147b27e19bf26f866af98ded58f05f104f591cdf3713 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_liskov, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, ceph=True, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_CLEAN=True, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 28 04:54:15 localhost podman[298241]: 2025-11-28 09:54:15.803341137 +0000 UTC m=+0.156282632 container attach 60809c4684c4edfacaec147b27e19bf26f866af98ded58f05f104f591cdf3713 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_liskov, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, distribution-scope=public, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, GIT_CLEAN=True, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, com.redhat.component=rhceph-container, release=553, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_BRANCH=main) Nov 28 04:54:15 localhost happy_liskov[298256]: 167 167 Nov 28 04:54:15 localhost systemd[1]: libpod-60809c4684c4edfacaec147b27e19bf26f866af98ded58f05f104f591cdf3713.scope: Deactivated successfully. Nov 28 04:54:15 localhost podman[298241]: 2025-11-28 09:54:15.805870565 +0000 UTC m=+0.158812100 container died 60809c4684c4edfacaec147b27e19bf26f866af98ded58f05f104f591cdf3713 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_liskov, name=rhceph, version=7, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, vcs-type=git, release=553, io.buildah.version=1.33.12, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., 
com.redhat.component=rhceph-container) Nov 28 04:54:15 localhost systemd[1]: var-lib-containers-storage-overlay-50ce6c0051a6e4efdccd421a571d7e5ee3041d4ffb448cc5e6171193994b5c51-merged.mount: Deactivated successfully. Nov 28 04:54:15 localhost podman[298261]: 2025-11-28 09:54:15.903515631 +0000 UTC m=+0.089097814 container remove 60809c4684c4edfacaec147b27e19bf26f866af98ded58f05f104f591cdf3713 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_liskov, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , RELEASE=main, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, io.openshift.expose-services=, release=553, version=7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_CLEAN=True) Nov 28 04:54:15 localhost systemd[1]: libpod-conmon-60809c4684c4edfacaec147b27e19bf26f866af98ded58f05f104f591cdf3713.scope: Deactivated successfully. Nov 28 04:54:16 localhost ceph-mon[292954]: Reconfiguring osd.5 (monmap changed)... 
Nov 28 04:54:16 localhost ceph-mon[292954]: Reconfiguring daemon osd.5 on np0005538513.localdomain Nov 28 04:54:16 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:16 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:16 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:54:16 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:54:16 localhost podman[298336]: Nov 28 04:54:16 localhost podman[298336]: 2025-11-28 09:54:16.763513062 +0000 UTC m=+0.078518698 container create 254fe9ab19a6479d3eb43385db51b384d27e20e999398e02ececcf25fe885b9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_jang, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, ceph=True, vcs-type=git, release=553, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, io.buildah.version=1.33.12, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., 
GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, version=7, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=) Nov 28 04:54:16 localhost systemd[1]: Started libpod-conmon-254fe9ab19a6479d3eb43385db51b384d27e20e999398e02ececcf25fe885b9d.scope. Nov 28 04:54:16 localhost systemd[1]: Started libcrun container. Nov 28 04:54:16 localhost podman[298336]: 2025-11-28 09:54:16.822311532 +0000 UTC m=+0.137317188 container init 254fe9ab19a6479d3eb43385db51b384d27e20e999398e02ececcf25fe885b9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_jang, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , release=553, architecture=x86_64, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, GIT_CLEAN=True, ceph=True, vcs-type=git, description=Red Hat Ceph Storage 7, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main) Nov 28 04:54:16 localhost podman[298336]: 2025-11-28 09:54:16.732122456 +0000 UTC m=+0.047128162 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:54:16 localhost podman[298336]: 2025-11-28 09:54:16.831901637 +0000 UTC m=+0.146907283 container start 
254fe9ab19a6479d3eb43385db51b384d27e20e999398e02ececcf25fe885b9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_jang, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, maintainer=Guillaume Abrioux , name=rhceph, architecture=x86_64, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, com.redhat.component=rhceph-container, release=553, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, vendor=Red Hat, Inc., distribution-scope=public, version=7, io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=) Nov 28 04:54:16 localhost podman[298336]: 2025-11-28 09:54:16.832272419 +0000 UTC m=+0.147278065 container attach 254fe9ab19a6479d3eb43385db51b384d27e20e999398e02ececcf25fe885b9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_jang, GIT_CLEAN=True, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, architecture=x86_64, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, 
GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., ceph=True, vcs-type=git, io.buildah.version=1.33.12) Nov 28 04:54:16 localhost mystifying_jang[298351]: 167 167 Nov 28 04:54:16 localhost systemd[1]: libpod-254fe9ab19a6479d3eb43385db51b384d27e20e999398e02ececcf25fe885b9d.scope: Deactivated successfully. Nov 28 04:54:16 localhost podman[298336]: 2025-11-28 09:54:16.834644761 +0000 UTC m=+0.149650437 container died 254fe9ab19a6479d3eb43385db51b384d27e20e999398e02ececcf25fe885b9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_jang, architecture=x86_64, com.redhat.component=rhceph-container, version=7, GIT_CLEAN=True, vendor=Red Hat, Inc., vcs-type=git, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, description=Red Hat Ceph Storage 7, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 28 04:54:16 localhost systemd[1]: 
var-lib-containers-storage-overlay-9f58a47facb36c21493d1508eb5bb6f7be5ec7366f6ff3fc1623af885ed7ae8e-merged.mount: Deactivated successfully. Nov 28 04:54:16 localhost podman[298356]: 2025-11-28 09:54:16.9353128 +0000 UTC m=+0.088132433 container remove 254fe9ab19a6479d3eb43385db51b384d27e20e999398e02ececcf25fe885b9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_jang, RELEASE=main, ceph=True, name=rhceph, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, architecture=x86_64, version=7, vcs-type=git, release=553, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container) Nov 28 04:54:16 localhost systemd[1]: libpod-conmon-254fe9ab19a6479d3eb43385db51b384d27e20e999398e02ececcf25fe885b9d.scope: Deactivated successfully. Nov 28 04:54:17 localhost ceph-mon[292954]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)... 
Nov 28 04:54:17 localhost ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain Nov 28 04:54:17 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:17 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:17 localhost ceph-mon[292954]: Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)... Nov 28 04:54:17 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:54:17 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:54:17 localhost ceph-mon[292954]: Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain Nov 28 04:54:17 localhost podman[298426]: Nov 28 04:54:17 localhost podman[298426]: 2025-11-28 09:54:17.642367084 +0000 UTC m=+0.079895640 container create 5effd84eb50c0071ea643089d46e138e2b607a432b2cb3e80cada94fd70b2162 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_mcclintock, architecture=x86_64, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.buildah.version=1.33.12, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
release=553, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, distribution-scope=public, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 28 04:54:17 localhost systemd[1]: Started libpod-conmon-5effd84eb50c0071ea643089d46e138e2b607a432b2cb3e80cada94fd70b2162.scope. Nov 28 04:54:17 localhost systemd[1]: Started libcrun container. Nov 28 04:54:17 localhost podman[298426]: 2025-11-28 09:54:17.699615016 +0000 UTC m=+0.137143572 container init 5effd84eb50c0071ea643089d46e138e2b607a432b2cb3e80cada94fd70b2162 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_mcclintock, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, name=rhceph, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, distribution-scope=public, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Nov 28 04:54:17 localhost podman[298426]: 2025-11-28 09:54:17.708096147 +0000 UTC m=+0.145624703 container start 5effd84eb50c0071ea643089d46e138e2b607a432b2cb3e80cada94fd70b2162 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_mcclintock, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, release=553, GIT_CLEAN=True, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, distribution-scope=public, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-type=git, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Nov 28 04:54:17 localhost podman[298426]: 2025-11-28 09:54:17.708378256 +0000 UTC m=+0.145906842 container attach 5effd84eb50c0071ea643089d46e138e2b607a432b2cb3e80cada94fd70b2162 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_mcclintock, description=Red Hat Ceph Storage 7, version=7, build-date=2025-09-24T08:57:55, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., distribution-scope=public, GIT_CLEAN=True, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, ceph=True) Nov 28 04:54:17 localhost practical_mcclintock[298441]: 167 167 Nov 28 04:54:17 localhost systemd[1]: libpod-5effd84eb50c0071ea643089d46e138e2b607a432b2cb3e80cada94fd70b2162.scope: Deactivated successfully. 
Nov 28 04:54:17 localhost podman[298426]: 2025-11-28 09:54:17.710704498 +0000 UTC m=+0.148233084 container died 5effd84eb50c0071ea643089d46e138e2b607a432b2cb3e80cada94fd70b2162 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_mcclintock, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, architecture=x86_64, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, ceph=True, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements) Nov 28 04:54:17 localhost podman[298426]: 2025-11-28 09:54:17.612582607 +0000 UTC m=+0.050111223 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:54:17 localhost podman[298446]: 2025-11-28 09:54:17.800398498 +0000 UTC m=+0.081923942 container remove 5effd84eb50c0071ea643089d46e138e2b607a432b2cb3e80cada94fd70b2162 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_mcclintock, architecture=x86_64, io.openshift.tags=rhceph ceph, vcs-type=git, release=553, version=7, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_BRANCH=main, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 28 04:54:17 localhost systemd[1]: libpod-conmon-5effd84eb50c0071ea643089d46e138e2b607a432b2cb3e80cada94fd70b2162.scope: Deactivated successfully. Nov 28 04:54:17 localhost systemd[1]: var-lib-containers-storage-overlay-b0a1bf3430418ba81dfed06a8dab4ddfa964857e39d5d37e3d8d14731d168ac0-merged.mount: Deactivated successfully. 
Nov 28 04:54:18 localhost openstack_network_exporter[240658]: ERROR 09:54:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 04:54:18 localhost openstack_network_exporter[240658]: ERROR 09:54:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:54:18 localhost openstack_network_exporter[240658]: ERROR 09:54:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:54:18 localhost openstack_network_exporter[240658]: ERROR 09:54:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 04:54:18 localhost openstack_network_exporter[240658]: Nov 28 04:54:18 localhost openstack_network_exporter[240658]: ERROR 09:54:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 04:54:18 localhost openstack_network_exporter[240658]: Nov 28 04:54:18 localhost nova_compute[279673]: 2025-11-28 09:54:18.388 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:54:18 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:18 localhost ceph-mon[292954]: Reconfiguring crash.np0005538514 (monmap changed)... 
Nov 28 04:54:18 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:18 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:54:18 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:54:18 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain Nov 28 04:54:18 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:18 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:18 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Nov 28 04:54:19 localhost ceph-mon[292954]: mon.np0005538513@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:54:20 localhost nova_compute[279673]: 2025-11-28 09:54:20.160 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:54:20 localhost ceph-mon[292954]: Reconfiguring osd.0 (monmap changed)... 
Nov 28 04:54:20 localhost ceph-mon[292954]: Reconfiguring daemon osd.0 on np0005538514.localdomain Nov 28 04:54:20 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:20 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:20 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Nov 28 04:54:21 localhost ceph-mon[292954]: Reconfiguring osd.3 (monmap changed)... Nov 28 04:54:21 localhost ceph-mon[292954]: Reconfiguring daemon osd.3 on np0005538514.localdomain Nov 28 04:54:21 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:21 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:21 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:54:21 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:54:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 04:54:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. 
Nov 28 04:54:21 localhost podman[298463]: 2025-11-28 09:54:21.860247992 +0000 UTC m=+0.093591622 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 04:54:21 localhost podman[298464]: 2025-11-28 09:54:21.907967041 +0000 UTC m=+0.139377991 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125) Nov 28 04:54:21 localhost podman[298464]: 2025-11-28 09:54:21.917623408 +0000 UTC m=+0.149034288 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Nov 28 04:54:21 localhost podman[298463]: 2025-11-28 09:54:21.924986794 +0000 UTC m=+0.158330444 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 04:54:21 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. Nov 28 04:54:21 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. Nov 28 04:54:22 localhost ceph-mon[292954]: Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)... Nov 28 04:54:22 localhost ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain Nov 28 04:54:22 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:22 localhost ceph-mon[292954]: Reconfiguring mgr.np0005538514.djozup (monmap changed)... 
Nov 28 04:54:22 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:22 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:54:22 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:54:22 localhost ceph-mon[292954]: Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain Nov 28 04:54:22 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:22 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 28 04:54:22 localhost ceph-mon[292954]: Deploying daemon mon.np0005538514 on np0005538514.localdomain Nov 28 04:54:23 localhost nova_compute[279673]: 2025-11-28 09:54:23.430 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:54:23 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:23 localhost ceph-mon[292954]: Reconfiguring crash.np0005538515 (monmap changed)... 
Nov 28 04:54:23 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:54:23 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:23 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:54:23 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain Nov 28 04:54:23 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:23 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Nov 28 04:54:23 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:24 localhost ceph-mon[292954]: mon.np0005538513@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:54:24 localhost ceph-mon[292954]: mon.np0005538513@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Nov 28 04:54:24 localhost ceph-mon[292954]: mon.np0005538513@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Nov 28 04:54:25 localhost nova_compute[279673]: 2025-11-28 09:54:25.193 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:54:25 localhost ceph-mon[292954]: mon.np0005538513@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Nov 28 04:54:25 localhost ceph-mon[292954]: 
Reconfiguring osd.1 (monmap changed)... Nov 28 04:54:25 localhost ceph-mon[292954]: Reconfiguring daemon osd.1 on np0005538515.localdomain Nov 28 04:54:25 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:25 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:25 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Nov 28 04:54:25 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:25 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:26 localhost ceph-mon[292954]: Reconfiguring osd.4 (monmap changed)... Nov 28 04:54:26 localhost ceph-mon[292954]: Reconfiguring daemon osd.4 on np0005538515.localdomain Nov 28 04:54:26 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:26 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:26 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:54:26 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:54:27 localhost ceph-mon[292954]: mon.np0005538513@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Nov 28 04:54:27 localhost ceph-mon[292954]: Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)... 
Nov 28 04:54:27 localhost ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain Nov 28 04:54:27 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:27 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:27 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:54:27 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:54:27 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:27 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:28 localhost nova_compute[279673]: 2025-11-28 09:54:28.431 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:54:28 localhost ceph-mon[292954]: Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)... 
Nov 28 04:54:28 localhost ceph-mon[292954]: Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain Nov 28 04:54:29 localhost ceph-mon[292954]: mon.np0005538513@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Nov 28 04:54:29 localhost ceph-mon[292954]: mon.np0005538513@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:54:29 localhost ceph-mon[292954]: mon.np0005538513@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Nov 28 04:54:30 localhost nova_compute[279673]: 2025-11-28 09:54:30.196 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:54:30 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:30 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:30 localhost ceph-mon[292954]: from='mgr.17154 172.18.0.108:0/2342198278' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 04:54:30 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:31 localhost ceph-mon[292954]: mon.np0005538513@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Nov 28 04:54:31 localhost nova_compute[279673]: 2025-11-28 09:54:31.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:54:31 localhost nova_compute[279673]: 2025-11-28 09:54:31.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 28 04:54:32 localhost nova_compute[279673]: 2025-11-28 09:54:32.767 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:54:32 localhost nova_compute[279673]: 2025-11-28 09:54:32.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:54:32 localhost nova_compute[279673]: 2025-11-28 09:54:32.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:54:33 localhost nova_compute[279673]: 2025-11-28 09:54:33.470 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:54:33 localhost ceph-mon[292954]: mon.np0005538513@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Nov 28 04:54:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. 
Nov 28 04:54:33 localhost nova_compute[279673]: 2025-11-28 09:54:33.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:54:33 localhost nova_compute[279673]: 2025-11-28 09:54:33.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:54:33 localhost podman[298593]: 2025-11-28 09:54:33.92053112 +0000 UTC m=+0.150960717 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.openshift.expose-services=, vcs-type=git, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., 
distribution-scope=public, managed_by=edpm_ansible, version=9.6) Nov 28 04:54:33 localhost podman[298593]: 2025-11-28 09:54:33.933657844 +0000 UTC m=+0.164087401 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, config_id=edpm, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers) Nov 28 04:54:33 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 04:54:34 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:34 localhost ceph-mon[292954]: mon.np0005538513@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:54:34 localhost nova_compute[279673]: 2025-11-28 09:54:34.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:54:35 localhost nova_compute[279673]: 2025-11-28 09:54:35.230 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:54:35 localhost ceph-mon[292954]: mon.np0005538513@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Nov 28 04:54:35 localhost ceph-mon[292954]: mon.np0005538513@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Nov 28 04:54:36 localhost ceph-mon[292954]: mon.np0005538513@2(peon) e11 handle_command mon_command({"prefix": "status", "format": "json"} v 0) Nov 28 04:54:36 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.200:0/11160684' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch Nov 28 04:54:36 localhost nova_compute[279673]: 2025-11-28 09:54:36.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:54:36 localhost nova_compute[279673]: 2025-11-28 09:54:36.886 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:54:36 localhost nova_compute[279673]: 2025-11-28 09:54:36.886 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:54:36 localhost nova_compute[279673]: 2025-11-28 09:54:36.887 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:54:36 localhost nova_compute[279673]: 2025-11-28 09:54:36.887 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 04:54:36 localhost nova_compute[279673]: 2025-11-28 
09:54:36.887 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:54:37 localhost ceph-mon[292954]: mon.np0005538513@2(peon) e11 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 04:54:37 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4278207600' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 04:54:37 localhost nova_compute[279673]: 2025-11-28 09:54:37.347 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:54:37 localhost nova_compute[279673]: 2025-11-28 09:54:37.421 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:54:37 localhost nova_compute[279673]: 2025-11-28 09:54:37.422 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:54:37 localhost ceph-mon[292954]: mon.np0005538513@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Nov 28 04:54:37 localhost ceph-mgr[286105]: ms_deliver_dispatch: unhandled message 0x560b917b8f20 mon_map magic: 0 from mon.0 v2:172.18.0.105:3300/0 Nov 28 04:54:37 localhost 
ceph-mon[292954]: log_channel(cluster) log [INF] : mon.np0005538513 calling monitor election Nov 28 04:54:37 localhost ceph-mon[292954]: paxos.2).electionLogic(42) init, last seen epoch 42 Nov 28 04:54:37 localhost ceph-mon[292954]: mon.np0005538513@2(electing) e12 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 28 04:54:37 localhost nova_compute[279673]: 2025-11-28 09:54:37.641 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 04:54:37 localhost nova_compute[279673]: 2025-11-28 09:54:37.643 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11785MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, 
{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 04:54:37 localhost nova_compute[279673]: 2025-11-28 09:54:37.643 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:54:37 localhost nova_compute[279673]: 2025-11-28 09:54:37.644 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:54:37 localhost nova_compute[279673]: 2025-11-28 09:54:37.734 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has 
allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 04:54:37 localhost nova_compute[279673]: 2025-11-28 09:54:37.735 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 04:54:37 localhost nova_compute[279673]: 2025-11-28 09:54:37.736 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 04:54:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. 
Nov 28 04:54:37 localhost nova_compute[279673]: 2025-11-28 09:54:37.801 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:54:37 localhost podman[298635]: 2025-11-28 09:54:37.849008174 +0000 UTC m=+0.081268843 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 28 04:54:37 localhost podman[298635]: 2025-11-28 09:54:37.886878799 +0000 UTC m=+0.119139518 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 
'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 04:54:37 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 04:54:37 localhost ceph-mon[292954]: mon.np0005538513@2(electing) e12 handle_auth_request failed to assign global_id Nov 28 04:54:38 localhost ceph-mon[292954]: mon.np0005538513@2(electing) e12 handle_auth_request failed to assign global_id Nov 28 04:54:38 localhost nova_compute[279673]: 2025-11-28 09:54:38.503 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:54:38 localhost ceph-mon[292954]: mon.np0005538513@2(electing) e12 handle_auth_request failed to assign global_id Nov 28 04:54:39 localhost ceph-mon[292954]: mon.np0005538513@2(electing) e12 handle_auth_request failed to assign global_id Nov 28 04:54:40 localhost podman[238687]: time="2025-11-28T09:54:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 04:54:40 localhost podman[238687]: @ - - [28/Nov/2025:09:54:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 28 04:54:40 localhost podman[238687]: @ - - [28/Nov/2025:09:54:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18728 "" "Go-http-client/1.1" Nov 28 04:54:40 localhost nova_compute[279673]: 2025-11-28 09:54:40.235 279685 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:54:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:54:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:54:40 localhost podman[298667]: 2025-11-28 09:54:40.853281927 +0000 UTC m=+0.088326589 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125) Nov 28 04:54:40 localhost ceph-mon[292954]: mon.np0005538513@2(electing) e12 handle_auth_request failed to 
assign global_id Nov 28 04:54:40 localhost podman[298668]: 2025-11-28 09:54:40.941786592 +0000 UTC m=+0.174159612 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent) Nov 28 04:54:40 localhost podman[298667]: 2025-11-28 
09:54:40.946379313 +0000 UTC m=+0.181423945 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 28 04:54:40 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 04:54:40 localhost podman[298668]: 2025-11-28 09:54:40.976611753 +0000 UTC m=+0.208984813 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent) Nov 28 04:54:40 localhost systemd[1]: 
ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:54:41 localhost ceph-mon[292954]: mon.np0005538513@2(electing) e12 handle_auth_request failed to assign global_id Nov 28 04:54:41 localhost ceph-mon[292954]: mon.np0005538513@2(electing) e12 handle_auth_request failed to assign global_id Nov 28 04:54:42 localhost ceph-mon[292954]: mon.np0005538513@2(electing) e12 handle_auth_request failed to assign global_id Nov 28 04:54:42 localhost ceph-mon[292954]: paxos.2).electionLogic(43) init, last seen epoch 43, mid-election, bumping Nov 28 04:54:42 localhost ceph-mon[292954]: mon.np0005538513@2(electing) e12 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 28 04:54:42 localhost ceph-mon[292954]: mon.np0005538513@2(electing) e12 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 28 04:54:42 localhost ceph-mon[292954]: mon.np0005538513@2(peon) e12 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 28 04:54:42 localhost ceph-mon[292954]: mon.np0005538513 calling monitor election Nov 28 04:54:42 localhost ceph-mon[292954]: mon.np0005538515 calling monitor election Nov 28 04:54:42 localhost ceph-mon[292954]: mon.np0005538512 calling monitor election Nov 28 04:54:42 localhost ceph-mon[292954]: mon.np0005538514 calling monitor election Nov 28 04:54:42 localhost ceph-mon[292954]: mon.np0005538512 is new leader, mons np0005538512,np0005538515,np0005538513,np0005538514 in quorum (ranks 0,1,2,3) Nov 28 04:54:42 localhost ceph-mon[292954]: overall HEALTH_OK Nov 28 04:54:43 localhost nova_compute[279673]: 2025-11-28 09:54:43.526 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:54:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 04:54:43 localhost podman[298712]: 2025-11-28 09:54:43.847446321 +0000 UTC m=+0.081948833 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125) Nov 28 04:54:43 localhost podman[298712]: 2025-11-28 09:54:43.859876563 
+0000 UTC m=+0.094379025 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 28 04:54:43 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 04:54:43 localhost ceph-mon[292954]: Reconfig service osd.default_drive_group Nov 28 04:54:43 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:43 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:43 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:43 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:43 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:43 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:43 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:43 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:43 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:43 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:43 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:43 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:44 localhost ceph-mon[292954]: mon.np0005538513@2(peon) e12 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 04:54:44 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/3652519406' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 04:54:44 localhost nova_compute[279673]: 2025-11-28 09:54:44.330 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 6.530s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:54:44 localhost nova_compute[279673]: 2025-11-28 09:54:44.336 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 04:54:44 localhost nova_compute[279673]: 2025-11-28 09:54:44.378 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 04:54:44 localhost nova_compute[279673]: 2025-11-28 09:54:44.383 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 04:54:44 localhost nova_compute[279673]: 2025-11-28 09:54:44.383 279685 DEBUG 
oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 6.739s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:54:44 localhost ceph-mon[292954]: mon.np0005538513@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:54:44 localhost ceph-mon[292954]: mon.np0005538513@2(peon).osd e89 e89: 6 total, 6 up, 6 in Nov 28 04:54:44 localhost systemd[1]: session-68.scope: Deactivated successfully. Nov 28 04:54:44 localhost systemd[1]: session-68.scope: Consumed 18.260s CPU time. Nov 28 04:54:44 localhost systemd-logind[764]: Session 68 logged out. Waiting for processes to exit. Nov 28 04:54:44 localhost systemd-logind[764]: Removed session 68. Nov 28 04:54:44 localhost ceph-mon[292954]: from='client.? 172.18.0.200:0/1122335753' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 28 04:54:44 localhost ceph-mon[292954]: Activating manager daemon np0005538511.fvuybw Nov 28 04:54:44 localhost ceph-mon[292954]: from='client.? 
172.18.0.200:0/1122335753' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Nov 28 04:54:45 localhost ceph-mon[292954]: from='mgr.17154 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:54:45 localhost nova_compute[279673]: 2025-11-28 09:54:45.259 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:54:47 localhost nova_compute[279673]: 2025-11-28 09:54:47.385 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:54:47 localhost nova_compute[279673]: 2025-11-28 09:54:47.385 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 28 04:54:47 localhost nova_compute[279673]: 2025-11-28 09:54:47.386 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 28 04:54:47 localhost nova_compute[279673]: 2025-11-28 09:54:47.494 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 04:54:47 localhost nova_compute[279673]: 2025-11-28 09:54:47.494 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 04:54:47 localhost 
nova_compute[279673]: 2025-11-28 09:54:47.495 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 28 04:54:47 localhost nova_compute[279673]: 2025-11-28 09:54:47.495 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 04:54:47 localhost nova_compute[279673]: 2025-11-28 09:54:47.833 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 04:54:47 localhost nova_compute[279673]: 2025-11-28 09:54:47.860 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 04:54:47 localhost nova_compute[279673]: 2025-11-28 09:54:47.861 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 28 04:54:48 localhost openstack_network_exporter[240658]: ERROR 09:54:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:54:48 localhost openstack_network_exporter[240658]: ERROR 09:54:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:54:48 localhost openstack_network_exporter[240658]: ERROR 09:54:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 04:54:48 localhost openstack_network_exporter[240658]: ERROR 09:54:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 04:54:48 localhost openstack_network_exporter[240658]: Nov 28 04:54:48 localhost openstack_network_exporter[240658]: ERROR 09:54:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 04:54:48 localhost openstack_network_exporter[240658]: Nov 28 04:54:48 localhost nova_compute[279673]: 2025-11-28 09:54:48.529 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m 
Nov 28 04:54:49 localhost ceph-mon[292954]: mon.np0005538513@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:54:50 localhost nova_compute[279673]: 2025-11-28 09:54:50.262 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:54:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:54:50.833 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:54:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:54:50.834 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:54:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:54:50.835 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:54:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 04:54:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 04:54:52 localhost systemd[1]: tmp-crun.9ZEreo.mount: Deactivated successfully. 
Nov 28 04:54:52 localhost podman[298742]: 2025-11-28 09:54:52.851731403 +0000 UTC m=+0.087502304 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 28 04:54:52 localhost podman[298742]: 2025-11-28 09:54:52.857509601 +0000 UTC m=+0.093280452 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Nov 28 04:54:52 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 04:54:52 localhost podman[298741]: 2025-11-28 09:54:52.828074735 +0000 UTC m=+0.069576053 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 04:54:52 localhost podman[298741]: 2025-11-28 09:54:52.907628273 +0000 UTC m=+0.149129621 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 28 04:54:52 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. Nov 28 04:54:53 localhost nova_compute[279673]: 2025-11-28 09:54:53.574 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:54:54 localhost ceph-mon[292954]: mon.np0005538513@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:54:54 localhost systemd[1]: Stopping User Manager for UID 1002... Nov 28 04:54:54 localhost systemd[26286]: Activating special unit Exit the Session... Nov 28 04:54:54 localhost systemd[26286]: Removed slice User Background Tasks Slice. Nov 28 04:54:54 localhost systemd[26286]: Stopped target Main User Target. Nov 28 04:54:54 localhost systemd[26286]: Stopped target Basic System. 
Nov 28 04:54:54 localhost systemd[26286]: Stopped target Paths. Nov 28 04:54:54 localhost systemd[26286]: Stopped target Sockets. Nov 28 04:54:54 localhost systemd[26286]: Stopped target Timers. Nov 28 04:54:54 localhost systemd[26286]: Stopped Mark boot as successful after the user session has run 2 minutes. Nov 28 04:54:54 localhost systemd[26286]: Stopped Daily Cleanup of User's Temporary Directories. Nov 28 04:54:55 localhost systemd[26286]: Closed D-Bus User Message Bus Socket. Nov 28 04:54:55 localhost systemd[26286]: Stopped Create User's Volatile Files and Directories. Nov 28 04:54:55 localhost systemd[26286]: Removed slice User Application Slice. Nov 28 04:54:55 localhost systemd[26286]: Reached target Shutdown. Nov 28 04:54:55 localhost systemd[26286]: Finished Exit the Session. Nov 28 04:54:55 localhost systemd[26286]: Reached target Exit the Session. Nov 28 04:54:55 localhost systemd[1]: user@1002.service: Deactivated successfully. Nov 28 04:54:55 localhost systemd[1]: Stopped User Manager for UID 1002. Nov 28 04:54:55 localhost systemd[1]: user@1002.service: Consumed 14.236s CPU time, read 0B from disk, written 7.0K to disk. Nov 28 04:54:55 localhost systemd[1]: Stopping User Runtime Directory /run/user/1002... Nov 28 04:54:55 localhost systemd[1]: run-user-1002.mount: Deactivated successfully. Nov 28 04:54:55 localhost systemd[1]: user-runtime-dir@1002.service: Deactivated successfully. Nov 28 04:54:55 localhost systemd[1]: Stopped User Runtime Directory /run/user/1002. Nov 28 04:54:55 localhost systemd[1]: Removed slice User Slice of UID 1002. Nov 28 04:54:55 localhost systemd[1]: user-1002.slice: Consumed 4min 44.203s CPU time. 
Nov 28 04:54:55 localhost nova_compute[279673]: 2025-11-28 09:54:55.287 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:54:58 localhost nova_compute[279673]: 2025-11-28 09:54:58.577 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:54:59 localhost ceph-mon[292954]: mon.np0005538513@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:55:00 localhost nova_compute[279673]: 2025-11-28 09:55:00.290 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:55:03 localhost nova_compute[279673]: 2025-11-28 09:55:03.616 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:55:04 localhost ceph-mon[292954]: mon.np0005538513@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:55:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 04:55:04 localhost podman[298784]: 2025-11-28 09:55:04.852434667 +0000 UTC m=+0.086331108 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_id=edpm, release=1755695350, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible) Nov 28 04:55:04 localhost podman[298784]: 2025-11-28 09:55:04.866325165 +0000 UTC m=+0.100221586 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, config_data={'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=edpm, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Nov 28 04:55:04 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 04:55:05 localhost nova_compute[279673]: 2025-11-28 09:55:05.329 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:55:08 localhost nova_compute[279673]: 2025-11-28 09:55:08.618 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:55:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 04:55:08 localhost podman[298804]: 2025-11-28 09:55:08.848718058 +0000 UTC m=+0.086908327 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Nov 28 04:55:08 localhost podman[298804]: 2025-11-28 09:55:08.862388687 +0000 UTC m=+0.100578956 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Nov 28 04:55:08 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 04:55:09 localhost ceph-mon[292954]: mon.np0005538513@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 04:55:10 localhost podman[238687]: time="2025-11-28T09:55:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 04:55:10 localhost podman[238687]: @ - - [28/Nov/2025:09:55:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 28 04:55:10 localhost podman[238687]: @ - - [28/Nov/2025:09:55:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18727 "" "Go-http-client/1.1"
Nov 28 04:55:10 localhost nova_compute[279673]: 2025-11-28 09:55:10.332 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:55:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 04:55:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 04:55:11 localhost podman[298828]: 2025-11-28 09:55:11.853879448 +0000 UTC m=+0.087228006 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 04:55:11 localhost podman[298828]: 2025-11-28 09:55:11.862296287 +0000 UTC m=+0.095644835 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 04:55:11 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 04:55:11 localhost podman[298827]: 2025-11-28 09:55:11.951327738 +0000 UTC m=+0.188318577 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 04:55:11 localhost podman[298827]: 2025-11-28 09:55:11.990416531 +0000 UTC m=+0.227407350 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 28 04:55:12 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 04:55:13 localhost ceph-mon[292954]: mon.np0005538513@2(peon) e12 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 04:55:13 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4217127523' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 04:55:13 localhost ceph-mon[292954]: mon.np0005538513@2(peon) e12 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 04:55:13 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4217127523' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 04:55:13 localhost nova_compute[279673]: 2025-11-28 09:55:13.661 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:55:14 localhost ceph-mon[292954]: mon.np0005538513@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 04:55:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 04:55:14 localhost podman[298868]: 2025-11-28 09:55:14.852652954 +0000 UTC m=+0.088525636 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Nov 28 04:55:14 localhost podman[298868]: 2025-11-28 09:55:14.865466308 +0000 UTC m=+0.101338990 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 04:55:14 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 04:55:15 localhost nova_compute[279673]: 2025-11-28 09:55:15.372 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:55:18 localhost openstack_network_exporter[240658]: ERROR 09:55:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 04:55:18 localhost openstack_network_exporter[240658]: ERROR 09:55:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 04:55:18 localhost openstack_network_exporter[240658]: ERROR 09:55:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 04:55:18 localhost openstack_network_exporter[240658]: ERROR 09:55:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 04:55:18 localhost openstack_network_exporter[240658]:
Nov 28 04:55:18 localhost openstack_network_exporter[240658]: ERROR 09:55:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 04:55:18 localhost openstack_network_exporter[240658]:
Nov 28 04:55:18 localhost nova_compute[279673]: 2025-11-28 09:55:18.666 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:55:19 localhost ceph-mon[292954]: mon.np0005538513@2(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 04:55:19 localhost ceph-mon[292954]: mon.np0005538513@2(peon).osd e90 e90: 6 total, 6 up, 6 in
Nov 28 04:55:19 localhost ceph-mgr[286105]: mgr handle_mgr_map Activating!
Nov 28 04:55:19 localhost ceph-mgr[286105]: mgr handle_mgr_map I am now activating
Nov 28 04:55:19 localhost ceph-mon[292954]: Activating manager daemon np0005538513.dsfdlx
Nov 28 04:55:19 localhost ceph-mon[292954]: Manager daemon np0005538511.fvuybw is unresponsive, replacing it with standby daemon np0005538513.dsfdlx
Nov 28 04:55:19 localhost ceph-mgr[286105]: [balancer DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 04:55:19 localhost ceph-mgr[286105]: mgr load Constructed class from module: balancer
Nov 28 04:55:19 localhost ceph-mgr[286105]: [cephadm DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 04:55:19 localhost ceph-mgr[286105]: [balancer INFO root] Starting
Nov 28 04:55:19 localhost ceph-mgr[286105]: [balancer INFO root] Optimize plan auto_2025-11-28_09:55:19
Nov 28 04:55:19 localhost ceph-mgr[286105]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Nov 28 04:55:19 localhost ceph-mgr[286105]: [balancer INFO root] Some PGs (1.000000) are unknown; try again later
Nov 28 04:55:19 localhost ceph-mgr[286105]: [cephadm WARNING root] removing stray HostCache host record np0005538511.localdomain.devices.0
Nov 28 04:55:19 localhost ceph-mgr[286105]: log_channel(cephadm) log [WRN] : removing stray HostCache host record np0005538511.localdomain.devices.0
Nov 28 04:55:19 localhost ceph-mgr[286105]: mgr load Constructed class from module: cephadm
Nov 28 04:55:19 localhost ceph-mgr[286105]: [crash DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 04:55:19 localhost ceph-mgr[286105]: mgr load Constructed class from module: crash
Nov 28 04:55:19 localhost ceph-mgr[286105]: [devicehealth DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 04:55:19 localhost ceph-mgr[286105]: mgr load Constructed class from module: devicehealth
Nov 28 04:55:19 localhost ceph-mgr[286105]: [iostat DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 04:55:19 localhost ceph-mgr[286105]: mgr load Constructed class from module: iostat
Nov 28 04:55:19 localhost ceph-mgr[286105]: [devicehealth INFO root] Starting
Nov 28 04:55:19 localhost ceph-mgr[286105]: [nfs DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 04:55:19 localhost ceph-mgr[286105]: mgr load Constructed class from module: nfs
Nov 28 04:55:19 localhost ceph-mgr[286105]: [orchestrator DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 04:55:19 localhost ceph-mgr[286105]: mgr load Constructed class from module: orchestrator
Nov 28 04:55:19 localhost ceph-mgr[286105]: [pg_autoscaler DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 04:55:19 localhost ceph-mgr[286105]: mgr load Constructed class from module: pg_autoscaler
Nov 28 04:55:19 localhost ceph-mgr[286105]: [progress DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 04:55:19 localhost ceph-mgr[286105]: mgr load Constructed class from module: progress
Nov 28 04:55:19 localhost ceph-mgr[286105]: [pg_autoscaler INFO root] _maybe_adjust
Nov 28 04:55:19 localhost ceph-mgr[286105]: [rbd_support DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 04:55:19 localhost ceph-mgr[286105]: [progress INFO root] Loading...
Nov 28 04:55:19 localhost ceph-mgr[286105]: [progress INFO root] Loaded [, , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , ] historic events
Nov 28 04:55:19 localhost ceph-mgr[286105]: [progress INFO root] Loaded OSDMap, ready.
Nov 28 04:55:19 localhost ceph-mgr[286105]: [rbd_support INFO root] recovery thread starting
Nov 28 04:55:19 localhost ceph-mgr[286105]: [rbd_support INFO root] starting setup
Nov 28 04:55:19 localhost ceph-mgr[286105]: mgr load Constructed class from module: rbd_support
Nov 28 04:55:19 localhost ceph-mgr[286105]: [restful DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 04:55:19 localhost ceph-mgr[286105]: mgr load Constructed class from module: restful
Nov 28 04:55:19 localhost ceph-mgr[286105]: [status DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 04:55:19 localhost ceph-mgr[286105]: mgr load Constructed class from module: status
Nov 28 04:55:19 localhost ceph-mgr[286105]: [restful INFO root] server_addr: :: server_port: 8003
Nov 28 04:55:19 localhost ceph-mgr[286105]: [telemetry DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 04:55:19 localhost ceph-mgr[286105]: mgr load Constructed class from module: telemetry
Nov 28 04:55:19 localhost ceph-mgr[286105]: [volumes DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 28 04:55:19 localhost ceph-mgr[286105]: [restful WARNING root] server not running: no certificate configured
Nov 28 04:55:19 localhost ceph-mgr[286105]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Nov 28 04:55:19 localhost ceph-mgr[286105]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 04:55:19 localhost ceph-mgr[286105]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 04:55:19 localhost ceph-mgr[286105]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Nov 28 04:55:19 localhost ceph-mgr[286105]: mgr load Constructed class from module: volumes
Nov 28 04:55:19 localhost ceph-mgr[286105]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 04:55:19 localhost ceph-mgr[286105]: client.0 error registering admin socket command: (17) File exists
Nov 28 04:55:19 localhost ceph-mgr[286105]: client.0 error registering admin socket command: (17) File exists
Nov 28 04:55:19 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:55:19.862+0000 7fc4b36d9640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 04:55:19 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:55:19.862+0000 7fc4b36d9640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 04:55:19 localhost ceph-mgr[286105]: client.0 error registering admin socket command: (17) File exists
Nov 28 04:55:19 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:55:19.862+0000 7fc4b36d9640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 04:55:19 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:55:19.862+0000 7fc4b36d9640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 04:55:19 localhost ceph-mgr[286105]: client.0 error registering admin socket command: (17) File exists
Nov 28 04:55:19 localhost ceph-mgr[286105]: client.0 error registering admin socket command: (17) File exists
Nov 28 04:55:19 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:55:19.862+0000 7fc4b36d9640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 04:55:19 localhost ceph-mgr[286105]: client.0 error registering admin socket command: (17) File exists
Nov 28 04:55:19 localhost ceph-mgr[286105]: client.0 error registering admin socket command: (17) File exists
Nov 28 04:55:19 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:55:19.871+0000 7fc4b66df640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 04:55:19 localhost ceph-mgr[286105]: client.0 error registering admin socket command: (17) File exists
Nov 28 04:55:19 localhost ceph-mgr[286105]: client.0 error registering admin socket command: (17) File exists
Nov 28 04:55:19 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:55:19.871+0000 7fc4b66df640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 04:55:19 localhost ceph-mgr[286105]: client.0 error registering admin socket command: (17) File exists
Nov 28 04:55:19 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:55:19.871+0000 7fc4b66df640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 04:55:19 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:55:19.871+0000 7fc4b66df640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 04:55:19 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:55:19.871+0000 7fc4b66df640 -1 client.0 error registering admin socket command: (17) File exists
Nov 28 04:55:19 localhost ceph-mgr[286105]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 04:55:19 localhost ceph-mgr[286105]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 04:55:19 localhost ceph-mgr[286105]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: starting
Nov 28 04:55:19 localhost ceph-mgr[286105]: [rbd_support INFO root] PerfHandler: starting
Nov 28 04:55:19 localhost ceph-mgr[286105]: [rbd_support INFO root] load_task_task: vms, start_after=
Nov 28 04:55:19 localhost ceph-mgr[286105]: [rbd_support INFO root] load_task_task: volumes, start_after=
Nov 28 04:55:19 localhost ceph-mgr[286105]: [rbd_support INFO root] load_task_task: images, start_after=
Nov 28 04:55:19 localhost ceph-mgr[286105]: [rbd_support INFO root] load_task_task: backups, start_after=
Nov 28 04:55:19 localhost ceph-mgr[286105]: [rbd_support INFO root] TaskHandler: starting
Nov 28 04:55:19 localhost ceph-mgr[286105]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Nov 28 04:55:19 localhost ceph-mgr[286105]: [rbd_support INFO root] load_schedules: vms, start_after=
Nov 28 04:55:19 localhost ceph-mgr[286105]: [rbd_support INFO root] load_schedules: volumes, start_after=
Nov 28 04:55:19 localhost ceph-mgr[286105]: [rbd_support INFO root] load_schedules: images, start_after=
Nov 28 04:55:19 localhost ceph-mgr[286105]: [rbd_support INFO root] load_schedules: backups, start_after=
Nov 28 04:55:19 localhost ceph-mgr[286105]: [rbd_support INFO root] TrashPurgeScheduleHandler: starting
Nov 28 04:55:19 localhost ceph-mgr[286105]: [rbd_support INFO root] setup complete
Nov 28 04:55:19 localhost sshd[299027]: main: sshd: ssh-rsa algorithm is disabled
Nov 28 04:55:20 localhost systemd[1]: Created slice User Slice of UID 1002.
Nov 28 04:55:20 localhost systemd[1]: Starting User Runtime Directory /run/user/1002...
Nov 28 04:55:20 localhost systemd-logind[764]: New session 69 of user ceph-admin.
Nov 28 04:55:20 localhost systemd[1]: Finished User Runtime Directory /run/user/1002.
Nov 28 04:55:20 localhost systemd[1]: Starting User Manager for UID 1002...
Nov 28 04:55:20 localhost systemd[299031]: Queued start job for default target Main User Target.
Nov 28 04:55:20 localhost systemd[299031]: Created slice User Application Slice.
Nov 28 04:55:20 localhost systemd[299031]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 28 04:55:20 localhost systemd[299031]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 04:55:20 localhost systemd[299031]: Reached target Paths.
Nov 28 04:55:20 localhost systemd[299031]: Reached target Timers.
Nov 28 04:55:20 localhost systemd[299031]: Starting D-Bus User Message Bus Socket...
Nov 28 04:55:20 localhost systemd[299031]: Starting Create User's Volatile Files and Directories...
Nov 28 04:55:20 localhost systemd[299031]: Listening on D-Bus User Message Bus Socket.
Nov 28 04:55:20 localhost systemd[299031]: Reached target Sockets.
Nov 28 04:55:20 localhost systemd[299031]: Finished Create User's Volatile Files and Directories.
Nov 28 04:55:20 localhost systemd[299031]: Reached target Basic System.
Nov 28 04:55:20 localhost systemd[299031]: Reached target Main User Target.
Nov 28 04:55:20 localhost systemd[299031]: Startup finished in 159ms.
Nov 28 04:55:20 localhost systemd[1]: Started User Manager for UID 1002.
Nov 28 04:55:20 localhost systemd[1]: Started Session 69 of User ceph-admin.
Nov 28 04:55:20 localhost nova_compute[279673]: 2025-11-28 09:55:20.375 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:55:20 localhost ceph-mon[292954]: Manager daemon np0005538513.dsfdlx is now available
Nov 28 04:55:20 localhost ceph-mon[292954]: removing stray HostCache host record np0005538511.localdomain.devices.0
Nov 28 04:55:20 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538511.localdomain.devices.0"} : dispatch
Nov 28 04:55:20 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538511.localdomain.devices.0"}]': finished
Nov 28 04:55:20 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538511.localdomain.devices.0"} : dispatch
Nov 28 04:55:20 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538511.localdomain.devices.0"}]': finished
Nov 28 04:55:20 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538513.dsfdlx/mirror_snapshot_schedule"} : dispatch
Nov 28 04:55:20 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538513.dsfdlx/trash_purge_schedule"} : dispatch
Nov 28 04:55:20 localhost ceph-mgr[286105]: log_channel(audit) log [DBG] : from='client.44366 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 04:55:20 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v3: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 04:55:21 localhost ceph-mgr[286105]: [cephadm INFO cherrypy.error] [28/Nov/2025:09:55:21] ENGINE Bus STARTING
Nov 28 04:55:21 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : [28/Nov/2025:09:55:21] ENGINE Bus STARTING
Nov 28 04:55:21 localhost ceph-mgr[286105]: [cephadm INFO cherrypy.error] [28/Nov/2025:09:55:21] ENGINE Serving on http://172.18.0.106:8765
Nov 28 04:55:21 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : [28/Nov/2025:09:55:21] ENGINE Serving on http://172.18.0.106:8765
Nov 28 04:55:21 localhost ceph-mgr[286105]: [cephadm INFO cherrypy.error] [28/Nov/2025:09:55:21] ENGINE Serving on https://172.18.0.106:7150
Nov 28 04:55:21 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : [28/Nov/2025:09:55:21] ENGINE Serving on https://172.18.0.106:7150
Nov 28 04:55:21 localhost ceph-mgr[286105]: [cephadm INFO cherrypy.error] [28/Nov/2025:09:55:21] ENGINE Bus STARTED
Nov 28 04:55:21 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : [28/Nov/2025:09:55:21] ENGINE Bus STARTED
Nov 28 04:55:21 localhost ceph-mgr[286105]: [cephadm INFO cherrypy.error] [28/Nov/2025:09:55:21] ENGINE Client ('172.18.0.106', 60952) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 28 04:55:21 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : [28/Nov/2025:09:55:21] ENGINE Client ('172.18.0.106', 60952) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 28 04:55:21 localhost podman[299175]: 2025-11-28 09:55:21.388947969 +0000 UTC m=+0.085193773 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vcs-type=git, architecture=x86_64, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, RELEASE=main, io.openshift.expose-services=, release=553, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 04:55:21 localhost podman[299175]: 2025-11-28 09:55:21.519537059 +0000 UTC m=+0.215782923 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, architecture=x86_64, ceph=True, CEPH_POINT_RELEASE=, vcs-type=git, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., name=rhceph, description=Red Hat Ceph Storage 7)
Nov 28 04:55:21 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 04:55:21 localhost ceph-mon[292954]: [28/Nov/2025:09:55:21] ENGINE Bus STARTING
Nov 28 04:55:21 localhost ceph-mon[292954]: [28/Nov/2025:09:55:21] ENGINE Serving on http://172.18.0.106:8765
Nov 28 04:55:21 localhost ceph-mgr[286105]: [devicehealth INFO root] Check health
Nov 28 04:55:22 localhost ceph-mon[292954]: [28/Nov/2025:09:55:21] ENGINE Serving on https://172.18.0.106:7150
Nov 28 04:55:22 localhost ceph-mon[292954]: [28/Nov/2025:09:55:21] ENGINE Bus STARTED
Nov 28 04:55:22 localhost ceph-mon[292954]: [28/Nov/2025:09:55:21] ENGINE Client ('172.18.0.106', 60952) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 28 04:55:22 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:55:22 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:55:22 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:55:22 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:55:22 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:55:22 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:55:22 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:55:22 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:55:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 04:55:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 04:55:23 localhost podman[299389]: 2025-11-28 09:55:23.173581992 +0000 UTC m=+0.070841702 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=multipathd) Nov 28 04:55:23 localhost podman[299389]: 2025-11-28 09:55:23.189328397 +0000 UTC m=+0.086588047 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Nov 28 04:55:23 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. Nov 28 04:55:23 localhost systemd[1]: tmp-crun.TUDxKu.mount: Deactivated successfully. 
Nov 28 04:55:23 localhost podman[299388]: 2025-11-28 09:55:23.288292663 +0000 UTC m=+0.182101327 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 04:55:23 localhost podman[299388]: 2025-11-28 09:55:23.297910088 +0000 UTC m=+0.191718752 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 04:55:23 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 04:55:23 localhost ceph-mgr[286105]: log_channel(audit) log [DBG] : from='client.44426 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch Nov 28 04:55:23 localhost ceph-mgr[286105]: [cephadm INFO root] Saving service mon spec with placement label:mon Nov 28 04:55:23 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Saving service mon spec with placement label:mon Nov 28 04:55:23 localhost ceph-mgr[286105]: [cephadm INFO root] Adjusting osd_memory_target on np0005538513.localdomain to 836.6M Nov 28 04:55:23 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005538513.localdomain to 836.6M Nov 28 04:55:23 localhost ceph-mgr[286105]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 28 04:55:23 localhost ceph-mgr[286105]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 28 04:55:23 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail Nov 28 04:55:23 localhost nova_compute[279673]: 2025-11-28 09:55:23.704 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:55:23 localhost ceph-mgr[286105]: [cephadm INFO root] Adjusting osd_memory_target on np0005538514.localdomain to 836.6M Nov 28 04:55:23 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005538514.localdomain to 836.6M Nov 28 04:55:23 localhost ceph-mgr[286105]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 
28 04:55:23 localhost ceph-mgr[286105]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 28 04:55:23 localhost ceph-mgr[286105]: [cephadm INFO root] Adjusting osd_memory_target on np0005538515.localdomain to 836.6M Nov 28 04:55:23 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005538515.localdomain to 836.6M Nov 28 04:55:23 localhost ceph-mgr[286105]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 28 04:55:23 localhost ceph-mgr[286105]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 28 04:55:23 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538512.localdomain:/etc/ceph/ceph.conf Nov 28 04:55:23 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538512.localdomain:/etc/ceph/ceph.conf Nov 28 04:55:23 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538513.localdomain:/etc/ceph/ceph.conf Nov 28 04:55:23 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/etc/ceph/ceph.conf Nov 28 04:55:23 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/etc/ceph/ceph.conf Nov 28 04:55:23 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538513.localdomain:/etc/ceph/ceph.conf Nov 28 04:55:23 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/etc/ceph/ceph.conf Nov 28 04:55:23 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/etc/ceph/ceph.conf Nov 28 04:55:24 localhost ceph-mon[292954]: 
from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:24 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:24 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config rm", "who": "osd/host:np0005538512", "name": "osd_memory_target"} : dispatch Nov 28 04:55:24 localhost ceph-mon[292954]: Saving service mon spec with placement label:mon Nov 28 04:55:24 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:24 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:24 localhost ceph-mon[292954]: Adjusting osd_memory_target on np0005538513.localdomain to 836.6M Nov 28 04:55:24 localhost ceph-mon[292954]: Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 28 04:55:24 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:24 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Nov 28 04:55:24 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Nov 28 04:55:24 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:24 localhost ceph-mon[292954]: Adjusting osd_memory_target on np0005538514.localdomain to 836.6M Nov 28 04:55:24 localhost ceph-mon[292954]: Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 28 
04:55:24 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:24 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Nov 28 04:55:24 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Nov 28 04:55:24 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:24 localhost ceph-mon[292954]: Adjusting osd_memory_target on np0005538515.localdomain to 836.6M Nov 28 04:55:24 localhost ceph-mon[292954]: Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 28 04:55:24 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:24 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Nov 28 04:55:24 localhost ceph-mon[292954]: Updating np0005538512.localdomain:/etc/ceph/ceph.conf Nov 28 04:55:24 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Nov 28 04:55:24 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf Nov 28 04:55:24 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf Nov 28 04:55:24 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf Nov 28 04:55:24 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth 
get", "entity": "client.admin"} : dispatch Nov 28 04:55:24 localhost ceph-mon[292954]: mon.np0005538513@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:55:24 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:55:24 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:55:24 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:55:24 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:55:24 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:55:24 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:55:24 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:55:24 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:55:25 localhost ceph-mgr[286105]: log_channel(audit) log [DBG] : from='client.44434 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538514", "target": ["mon-mgr", ""], "format": "json"}]: dispatch Nov 28 04:55:25 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating 
np0005538512.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:55:25 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538512.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:55:25 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:55:25 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:55:25 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:55:25 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538513.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:55:25 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:55:25 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538513.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:55:25 localhost nova_compute[279673]: 2025-11-28 09:55:25.425 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:55:25 localhost ceph-mon[292954]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:55:25 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:55:25 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:55:25 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:55:25 localhost ceph-mon[292954]: Updating np0005538512.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 
04:55:25 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:55:25 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:55:25 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:55:25 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail Nov 28 04:55:25 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:55:25 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:55:25 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:55:25 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:55:25 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:55:25 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:55:25 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:55:25 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating 
np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:55:26 localhost ceph-mon[292954]: Updating np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:55:26 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:55:26 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:55:26 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:55:26 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:26 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:26 localhost ceph-mgr[286105]: [progress INFO root] update: starting ev e15a826f-99af-4834-8906-4631a44eebd9 (Updating node-proxy deployment (+4 -> 4)) Nov 28 04:55:26 localhost ceph-mgr[286105]: [progress INFO root] complete: finished ev e15a826f-99af-4834-8906-4631a44eebd9 (Updating node-proxy deployment (+4 -> 4)) Nov 28 04:55:26 localhost ceph-mgr[286105]: [progress INFO root] Completed event e15a826f-99af-4834-8906-4631a44eebd9 (Updating node-proxy deployment (+4 -> 4)) in 0 seconds Nov 28 04:55:26 localhost nova_compute[279673]: 2025-11-28 09:55:26.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:55:26 localhost nova_compute[279673]: 2025-11-28 09:55:26.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] 
Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Nov 28 04:55:27 localhost nova_compute[279673]: 2025-11-28 09:55:27.030 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Nov 28 04:55:27 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005538512 (monmap changed)... Nov 28 04:55:27 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005538512 (monmap changed)... Nov 28 04:55:27 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005538512 on np0005538512.localdomain Nov 28 04:55:27 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005538512 on np0005538512.localdomain Nov 28 04:55:27 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s Nov 28 04:55:27 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:27 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:27 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:27 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:27 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:27 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:27 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:27 
localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:27 localhost ceph-mon[292954]: Reconfiguring mon.np0005538512 (monmap changed)... Nov 28 04:55:27 localhost ceph-mon[292954]: Reconfiguring daemon mon.np0005538512 on np0005538512.localdomain Nov 28 04:55:27 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 28 04:55:28 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005538512.zyhkxs (monmap changed)... Nov 28 04:55:28 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005538512.zyhkxs (monmap changed)... Nov 28 04:55:28 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005538512.zyhkxs on np0005538512.localdomain Nov 28 04:55:28 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005538512.zyhkxs on np0005538512.localdomain Nov 28 04:55:28 localhost nova_compute[279673]: 2025-11-28 09:55:28.711 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:55:28 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005538512 (monmap changed)... Nov 28 04:55:28 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005538512 (monmap changed)... 
Nov 28 04:55:28 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 04:55:28 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 04:55:29 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:55:29 localhost ceph-mon[292954]: Reconfiguring mgr.np0005538512.zyhkxs (monmap changed)...
Nov 28 04:55:29 localhost ceph-mon[292954]: Reconfiguring daemon mgr.np0005538512.zyhkxs on np0005538512.localdomain
Nov 28 04:55:29 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:55:29 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538512.zyhkxs", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 04:55:29 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:55:29 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:55:29 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 04:55:29 localhost ceph-mon[292954]: mon.np0005538513@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 04:55:29 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 13 op/s
Nov 28 04:55:29 localhost ceph-mgr[286105]: [progress INFO root] Writing back 50 completed events
Nov 28 04:55:29 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 04:55:29 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 04:55:29 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 04:55:29 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 04:55:30 localhost ceph-mon[292954]: Reconfiguring crash.np0005538512 (monmap changed)...
Nov 28 04:55:30 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain
Nov 28 04:55:30 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:55:30 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:55:30 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:55:30 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 04:55:30 localhost nova_compute[279673]: 2025-11-28 09:55:30.427 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:55:30 localhost podman[300176]:
Nov 28 04:55:30 localhost podman[300176]: 2025-11-28 09:55:30.535054625 +0000 UTC m=+0.087072001 container create dff4c09b97b3f88e9f9bd96051d5f66acd596454139e56c8c5c13cbc614a57a2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_khorana, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , RELEASE=main, io.buildah.version=1.33.12, version=7, name=rhceph, build-date=2025-09-24T08:57:55, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, release=553, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., architecture=x86_64)
Nov 28 04:55:30 localhost systemd[1]: Started libpod-conmon-dff4c09b97b3f88e9f9bd96051d5f66acd596454139e56c8c5c13cbc614a57a2.scope.
Nov 28 04:55:30 localhost systemd[1]: Started libcrun container.
Nov 28 04:55:30 localhost podman[300176]: 2025-11-28 09:55:30.498366236 +0000 UTC m=+0.050383662 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 04:55:30 localhost podman[300176]: 2025-11-28 09:55:30.610379454 +0000 UTC m=+0.162396840 container init dff4c09b97b3f88e9f9bd96051d5f66acd596454139e56c8c5c13cbc614a57a2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_khorana, version=7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, release=553, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, vcs-type=git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, RELEASE=main, name=rhceph, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements)
Nov 28 04:55:30 localhost podman[300176]: 2025-11-28 09:55:30.621726863 +0000 UTC m=+0.173744289 container start dff4c09b97b3f88e9f9bd96051d5f66acd596454139e56c8c5c13cbc614a57a2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_khorana, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, version=7, architecture=x86_64, release=553, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , distribution-scope=public)
Nov 28 04:55:30 localhost podman[300176]: 2025-11-28 09:55:30.622039792 +0000 UTC m=+0.174057178 container attach dff4c09b97b3f88e9f9bd96051d5f66acd596454139e56c8c5c13cbc614a57a2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_khorana, build-date=2025-09-24T08:57:55, architecture=x86_64, ceph=True, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, vcs-type=git, GIT_CLEAN=True, io.openshift.expose-services=, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., RELEASE=main, CEPH_POINT_RELEASE=, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 04:55:30 localhost systemd[1]: libpod-dff4c09b97b3f88e9f9bd96051d5f66acd596454139e56c8c5c13cbc614a57a2.scope: Deactivated successfully.
Nov 28 04:55:30 localhost sweet_khorana[300192]: 167 167
Nov 28 04:55:30 localhost podman[300176]: 2025-11-28 09:55:30.629695758 +0000 UTC m=+0.181713174 container died dff4c09b97b3f88e9f9bd96051d5f66acd596454139e56c8c5c13cbc614a57a2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_khorana, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, release=553, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , GIT_BRANCH=main, vendor=Red Hat, Inc., ceph=True, io.buildah.version=1.33.12, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, version=7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Nov 28 04:55:30 localhost podman[300197]: 2025-11-28 09:55:30.736382232 +0000 UTC m=+0.091885559 container remove dff4c09b97b3f88e9f9bd96051d5f66acd596454139e56c8c5c13cbc614a57a2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_khorana, io.openshift.expose-services=, ceph=True, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-type=git, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_CLEAN=True, version=7, architecture=x86_64, name=rhceph, GIT_BRANCH=main)
Nov 28 04:55:30 localhost systemd[1]: libpod-conmon-dff4c09b97b3f88e9f9bd96051d5f66acd596454139e56c8c5c13cbc614a57a2.scope: Deactivated successfully.
Nov 28 04:55:30 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)...
Nov 28 04:55:30 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)...
Nov 28 04:55:30 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 04:55:30 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 04:55:31 localhost ceph-mon[292954]: Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 04:55:31 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 04:55:31 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:55:31 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:55:31 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 04:55:31 localhost podman[300267]:
Nov 28 04:55:31 localhost podman[300267]: 2025-11-28 09:55:31.438758441 +0000 UTC m=+0.059455891 container create 73d0522ac7c60a132398fcc3eea12929cfb11b1710fdaa221047c7f55ddc3926 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ecstatic_joliot, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, version=7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, RELEASE=main, GIT_CLEAN=True, name=rhceph)
Nov 28 04:55:31 localhost systemd[1]: Started libpod-conmon-73d0522ac7c60a132398fcc3eea12929cfb11b1710fdaa221047c7f55ddc3926.scope.
Nov 28 04:55:31 localhost systemd[1]: Started libcrun container.
Nov 28 04:55:31 localhost podman[300267]: 2025-11-28 09:55:31.493100125 +0000 UTC m=+0.113797585 container init 73d0522ac7c60a132398fcc3eea12929cfb11b1710fdaa221047c7f55ddc3926 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ecstatic_joliot, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, version=7, RELEASE=main, com.redhat.component=rhceph-container, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, name=rhceph, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, ceph=True, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux )
Nov 28 04:55:31 localhost podman[300267]: 2025-11-28 09:55:31.502006829 +0000 UTC m=+0.122704249 container start 73d0522ac7c60a132398fcc3eea12929cfb11b1710fdaa221047c7f55ddc3926 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ecstatic_joliot, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, RELEASE=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , distribution-scope=public, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, com.redhat.component=rhceph-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, release=553, GIT_CLEAN=True)
Nov 28 04:55:31 localhost podman[300267]: 2025-11-28 09:55:31.502158613 +0000 UTC m=+0.122856103 container attach 73d0522ac7c60a132398fcc3eea12929cfb11b1710fdaa221047c7f55ddc3926 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ecstatic_joliot, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., release=553, GIT_BRANCH=main, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph)
Nov 28 04:55:31 localhost ecstatic_joliot[300283]: 167 167
Nov 28 04:55:31 localhost systemd[1]: libpod-73d0522ac7c60a132398fcc3eea12929cfb11b1710fdaa221047c7f55ddc3926.scope: Deactivated successfully.
Nov 28 04:55:31 localhost podman[300267]: 2025-11-28 09:55:31.406177049 +0000 UTC m=+0.026874569 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 04:55:31 localhost podman[300267]: 2025-11-28 09:55:31.505740313 +0000 UTC m=+0.126437813 container died 73d0522ac7c60a132398fcc3eea12929cfb11b1710fdaa221047c7f55ddc3926 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ecstatic_joliot, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, vcs-type=git, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, RELEASE=main, io.openshift.tags=rhceph ceph, distribution-scope=public, CEPH_POINT_RELEASE=, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, release=553)
Nov 28 04:55:31 localhost systemd[1]: var-lib-containers-storage-overlay-f27fc19861619d301d8d9da9bc4725260477003f261db5aa6e45b0c10a4a570c-merged.mount: Deactivated successfully.
Nov 28 04:55:31 localhost systemd[1]: var-lib-containers-storage-overlay-0a5416d70a866570e4d156259e882aa2c63c88ec04bd8570ee301b56df1e63a5-merged.mount: Deactivated successfully.
Nov 28 04:55:31 localhost podman[300288]: 2025-11-28 09:55:31.59822978 +0000 UTC m=+0.084920245 container remove 73d0522ac7c60a132398fcc3eea12929cfb11b1710fdaa221047c7f55ddc3926 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ecstatic_joliot, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, version=7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, release=553, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.buildah.version=1.33.12)
Nov 28 04:55:31 localhost systemd[1]: libpod-conmon-73d0522ac7c60a132398fcc3eea12929cfb11b1710fdaa221047c7f55ddc3926.scope: Deactivated successfully.
Nov 28 04:55:31 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Nov 28 04:55:31 localhost ceph-mon[292954]: mon.np0005538513@2(peon) e12 handle_command mon_command({"prefix": "mgr stat", "format": "json"} v 0)
Nov 28 04:55:31 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/577138193' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch
Nov 28 04:55:31 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)...
Nov 28 04:55:31 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)...
Nov 28 04:55:31 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 04:55:31 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 04:55:32 localhost ceph-mon[292954]: Reconfiguring osd.2 (monmap changed)...
Nov 28 04:55:32 localhost ceph-mon[292954]: Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 04:55:32 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:55:32 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:55:32 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:55:32 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:55:32 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 04:55:32 localhost podman[300364]:
Nov 28 04:55:32 localhost podman[300364]: 2025-11-28 09:55:32.464706782 +0000 UTC m=+0.080467298 container create 8c6151cfc4a6ffddfdbc4bbabe0d183732cba6d349032189520a8b8fcb77fa96 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_hamilton, description=Red Hat Ceph Storage 7, name=rhceph, architecture=x86_64, RELEASE=main, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., vcs-type=git, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , ceph=True, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55)
Nov 28 04:55:32 localhost systemd[1]: Started libpod-conmon-8c6151cfc4a6ffddfdbc4bbabe0d183732cba6d349032189520a8b8fcb77fa96.scope.
Nov 28 04:55:32 localhost systemd[1]: Started libcrun container.
Nov 28 04:55:32 localhost podman[300364]: 2025-11-28 09:55:32.529558548 +0000 UTC m=+0.145319064 container init 8c6151cfc4a6ffddfdbc4bbabe0d183732cba6d349032189520a8b8fcb77fa96 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_hamilton, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., RELEASE=main, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, maintainer=Guillaume Abrioux , ceph=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, name=rhceph, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, description=Red Hat Ceph Storage 7)
Nov 28 04:55:32 localhost podman[300364]: 2025-11-28 09:55:32.432364806 +0000 UTC m=+0.048125362 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 04:55:32 localhost interesting_hamilton[300379]: 167 167
Nov 28 04:55:32 localhost systemd[1]: libpod-8c6151cfc4a6ffddfdbc4bbabe0d183732cba6d349032189520a8b8fcb77fa96.scope: Deactivated successfully.
Nov 28 04:55:32 localhost podman[300364]: 2025-11-28 09:55:32.541772473 +0000 UTC m=+0.157532989 container start 8c6151cfc4a6ffddfdbc4bbabe0d183732cba6d349032189520a8b8fcb77fa96 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_hamilton, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , name=rhceph, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, ceph=True, GIT_CLEAN=True, com.redhat.component=rhceph-container, distribution-scope=public, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, RELEASE=main, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=)
Nov 28 04:55:32 localhost podman[300364]: 2025-11-28 09:55:32.542229777 +0000 UTC m=+0.157990293 container attach 8c6151cfc4a6ffddfdbc4bbabe0d183732cba6d349032189520a8b8fcb77fa96 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_hamilton, version=7, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, architecture=x86_64, maintainer=Guillaume Abrioux , ceph=True, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., vcs-type=git, release=553, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True)
Nov 28 04:55:32 localhost podman[300364]: 2025-11-28 09:55:32.544199508 +0000 UTC m=+0.159960094 container died 8c6151cfc4a6ffddfdbc4bbabe0d183732cba6d349032189520a8b8fcb77fa96 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_hamilton, ceph=True, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, vcs-type=git, GIT_CLEAN=True, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_BRANCH=main, release=553, build-date=2025-09-24T08:57:55, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, RELEASE=main, name=rhceph)
Nov 28 04:55:32 localhost systemd[1]: var-lib-containers-storage-overlay-1c0fdc40bd1ebef23511b5cfa1cf80437a0ec2a4dcb9baa3d53189d5181a247d-merged.mount: Deactivated successfully.
Nov 28 04:55:32 localhost podman[300384]: 2025-11-28 09:55:32.649427908 +0000 UTC m=+0.097703449 container remove 8c6151cfc4a6ffddfdbc4bbabe0d183732cba6d349032189520a8b8fcb77fa96 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_hamilton, io.k8s.description=Red Hat Ceph Storage 7, release=553, distribution-scope=public, CEPH_POINT_RELEASE=, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.12, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, vcs-type=git, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, build-date=2025-09-24T08:57:55)
Nov 28 04:55:32 localhost systemd[1]: libpod-conmon-8c6151cfc4a6ffddfdbc4bbabe0d183732cba6d349032189520a8b8fcb77fa96.scope: Deactivated successfully.
Nov 28 04:55:32 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 04:55:32 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 04:55:32 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 04:55:32 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 04:55:33 localhost nova_compute[279673]: 2025-11-28 09:55:33.026 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 04:55:33 localhost nova_compute[279673]: 2025-11-28 09:55:33.028 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 04:55:33 localhost nova_compute[279673]: 2025-11-28 09:55:33.028 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 04:55:33 localhost nova_compute[279673]: 2025-11-28 09:55:33.028 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 04:55:33 localhost ceph-mon[292954]: Reconfiguring osd.5 (monmap changed)...
Nov 28 04:55:33 localhost ceph-mon[292954]: Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 04:55:33 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:55:33 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:55:33 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:55:33 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:55:33 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 04:55:33 localhost ceph-mgr[286105]: log_channel(audit) log [DBG] : from='client.27172 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538512", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 04:55:33 localhost podman[300462]:
Nov 28 04:55:33 localhost podman[300462]: 2025-11-28 09:55:33.470537042 +0000 UTC m=+0.074672990 container create f5b751945cf6f78c8ce46a40ae366f8a2422a1acc6551a34a7a9490c6b63d4d4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_diffie, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, ceph=True, io.buildah.version=1.33.12, RELEASE=main, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, name=rhceph, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7)
Nov 28 04:55:33 localhost systemd[1]: Started libpod-conmon-f5b751945cf6f78c8ce46a40ae366f8a2422a1acc6551a34a7a9490c6b63d4d4.scope.
Nov 28 04:55:33 localhost systemd[1]: Started libcrun container.
Nov 28 04:55:33 localhost podman[300462]: 2025-11-28 09:55:33.442046045 +0000 UTC m=+0.046182013 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 04:55:33 localhost podman[300462]: 2025-11-28 09:55:33.549923915 +0000 UTC m=+0.154059883 container init f5b751945cf6f78c8ce46a40ae366f8a2422a1acc6551a34a7a9490c6b63d4d4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_diffie, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, RELEASE=main, vcs-type=git, io.openshift.tags=rhceph ceph, ceph=True, distribution-scope=public, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7,
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, name=rhceph, build-date=2025-09-24T08:57:55, architecture=x86_64, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.openshift.expose-services=) Nov 28 04:55:33 localhost podman[300462]: 2025-11-28 09:55:33.560826441 +0000 UTC m=+0.164962409 container start f5b751945cf6f78c8ce46a40ae366f8a2422a1acc6551a34a7a9490c6b63d4d4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_diffie, io.openshift.tags=rhceph ceph, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, distribution-scope=public, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, RELEASE=main, maintainer=Guillaume Abrioux , name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, GIT_BRANCH=main, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 28 04:55:33 localhost podman[300462]: 2025-11-28 09:55:33.561212553 +0000 UTC m=+0.165348521 container attach f5b751945cf6f78c8ce46a40ae366f8a2422a1acc6551a34a7a9490c6b63d4d4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_diffie, ceph=True, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, description=Red Hat Ceph Storage 7, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, GIT_BRANCH=main) Nov 28 04:55:33 localhost condescending_diffie[300477]: 167 167 Nov 28 04:55:33 localhost systemd[1]: libpod-f5b751945cf6f78c8ce46a40ae366f8a2422a1acc6551a34a7a9490c6b63d4d4.scope: Deactivated successfully. 
Nov 28 04:55:33 localhost podman[300462]: 2025-11-28 09:55:33.563843414 +0000 UTC m=+0.167979412 container died f5b751945cf6f78c8ce46a40ae366f8a2422a1acc6551a34a7a9490c6b63d4d4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_diffie, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph, vendor=Red Hat, Inc., release=553, architecture=x86_64, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.openshift.tags=rhceph ceph) Nov 28 04:55:33 localhost systemd[1]: var-lib-containers-storage-overlay-a10b28bbfa262a8941d71b00637f3759952fa6422790e7aa06ad2b2fba7e3b1f-merged.mount: Deactivated successfully. 
Nov 28 04:55:33 localhost podman[300482]: 2025-11-28 09:55:33.664167162 +0000 UTC m=+0.091334893 container remove f5b751945cf6f78c8ce46a40ae366f8a2422a1acc6551a34a7a9490c6b63d4d4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_diffie, CEPH_POINT_RELEASE=, io.openshift.expose-services=, RELEASE=main, com.redhat.component=rhceph-container, distribution-scope=public, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, release=553, io.openshift.tags=rhceph ceph, version=7, GIT_BRANCH=main, io.buildah.version=1.33.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, vendor=Red Hat, Inc., name=rhceph) Nov 28 04:55:33 localhost systemd[1]: libpod-conmon-f5b751945cf6f78c8ce46a40ae366f8a2422a1acc6551a34a7a9490c6b63d4d4.scope: Deactivated successfully. 
Nov 28 04:55:33 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s Nov 28 04:55:33 localhost nova_compute[279673]: 2025-11-28 09:55:33.746 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:55:33 localhost nova_compute[279673]: 2025-11-28 09:55:33.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:55:33 localhost nova_compute[279673]: 2025-11-28 09:55:33.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:55:33 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)... Nov 28 04:55:33 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)... Nov 28 04:55:33 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain Nov 28 04:55:33 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain Nov 28 04:55:34 localhost ceph-mon[292954]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)... 
Nov 28 04:55:34 localhost ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain Nov 28 04:55:34 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:34 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:34 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:55:34 localhost ceph-mon[292954]: mon.np0005538513@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:55:34 localhost podman[300552]: Nov 28 04:55:34 localhost podman[300552]: 2025-11-28 09:55:34.552900428 +0000 UTC m=+0.085203693 container create 7249413b257ac454017e9d06d17907e71b411373bba34b6464d07f4fdb08925b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_clarke, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.openshift.tags=rhceph ceph, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, 
com.redhat.component=rhceph-container, io.openshift.expose-services=, distribution-scope=public, ceph=True, version=7, release=553) Nov 28 04:55:34 localhost systemd[1]: Started libpod-conmon-7249413b257ac454017e9d06d17907e71b411373bba34b6464d07f4fdb08925b.scope. Nov 28 04:55:34 localhost systemd[1]: Started libcrun container. Nov 28 04:55:34 localhost podman[300552]: 2025-11-28 09:55:34.518111628 +0000 UTC m=+0.050414933 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:55:34 localhost podman[300552]: 2025-11-28 09:55:34.623839831 +0000 UTC m=+0.156143086 container init 7249413b257ac454017e9d06d17907e71b411373bba34b6464d07f4fdb08925b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_clarke, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, GIT_CLEAN=True, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, distribution-scope=public, name=rhceph, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, io.openshift.tags=rhceph ceph, vcs-type=git) Nov 28 04:55:34 localhost systemd[1]: tmp-crun.XW9Ffo.mount: Deactivated successfully. 
Nov 28 04:55:34 localhost podman[300552]: 2025-11-28 09:55:34.640122393 +0000 UTC m=+0.172425648 container start 7249413b257ac454017e9d06d17907e71b411373bba34b6464d07f4fdb08925b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_clarke, vcs-type=git, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, GIT_CLEAN=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, description=Red Hat Ceph Storage 7, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., GIT_BRANCH=main) Nov 28 04:55:34 localhost podman[300552]: 2025-11-28 09:55:34.640345839 +0000 UTC m=+0.172649084 container attach 7249413b257ac454017e9d06d17907e71b411373bba34b6464d07f4fdb08925b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_clarke, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., name=rhceph, summary=Provides the latest Red Hat Ceph 
Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, description=Red Hat Ceph Storage 7, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, GIT_CLEAN=True, CEPH_POINT_RELEASE=, RELEASE=main) Nov 28 04:55:34 localhost relaxed_clarke[300566]: 167 167 Nov 28 04:55:34 localhost systemd[1]: libpod-7249413b257ac454017e9d06d17907e71b411373bba34b6464d07f4fdb08925b.scope: Deactivated successfully. Nov 28 04:55:34 localhost podman[300552]: 2025-11-28 09:55:34.644093615 +0000 UTC m=+0.176396910 container died 7249413b257ac454017e9d06d17907e71b411373bba34b6464d07f4fdb08925b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_clarke, GIT_CLEAN=True, com.redhat.component=rhceph-container, distribution-scope=public, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, ceph=True, release=553, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., RELEASE=main, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, build-date=2025-09-24T08:57:55, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, 
io.buildah.version=1.33.12, version=7) Nov 28 04:55:34 localhost podman[300571]: 2025-11-28 09:55:34.75473126 +0000 UTC m=+0.098379869 container remove 7249413b257ac454017e9d06d17907e71b411373bba34b6464d07f4fdb08925b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_clarke, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.expose-services=) Nov 28 04:55:34 localhost systemd[1]: libpod-conmon-7249413b257ac454017e9d06d17907e71b411373bba34b6464d07f4fdb08925b.scope: Deactivated successfully. 
Nov 28 04:55:34 localhost nova_compute[279673]: 2025-11-28 09:55:34.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:55:34 localhost nova_compute[279673]: 2025-11-28 09:55:34.773 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:55:34 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005538513 (monmap changed)... Nov 28 04:55:34 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005538513 (monmap changed)... Nov 28 04:55:34 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005538513 on np0005538513.localdomain Nov 28 04:55:34 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005538513 on np0005538513.localdomain Nov 28 04:55:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. 
Nov 28 04:55:34 localhost ceph-mgr[286105]: log_channel(audit) log [DBG] : from='client.27180 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005538512"], "force": true, "target": ["mon-mgr", ""]}]: dispatch Nov 28 04:55:34 localhost ceph-mgr[286105]: [cephadm INFO root] Remove daemons mon.np0005538512 Nov 28 04:55:34 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Remove daemons mon.np0005538512 Nov 28 04:55:34 localhost ceph-mgr[286105]: [cephadm INFO cephadm.services.cephadmservice] Safe to remove mon.np0005538512: new quorum should be ['np0005538515', 'np0005538513', 'np0005538514'] (from ['np0005538515', 'np0005538513', 'np0005538514']) Nov 28 04:55:34 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Safe to remove mon.np0005538512: new quorum should be ['np0005538515', 'np0005538513', 'np0005538514'] (from ['np0005538515', 'np0005538513', 'np0005538514']) Nov 28 04:55:34 localhost ceph-mgr[286105]: [cephadm INFO cephadm.services.cephadmservice] Removing monitor np0005538512 from monmap... Nov 28 04:55:34 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Removing monitor np0005538512 from monmap... 
Nov 28 04:55:34 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Removing daemon mon.np0005538512 from np0005538512.localdomain -- ports [] Nov 28 04:55:34 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Removing daemon mon.np0005538512 from np0005538512.localdomain -- ports [] Nov 28 04:55:35 localhost ceph-mon[292954]: mon.np0005538513@2(peon) e13 my rank is now 1 (was 2) Nov 28 04:55:35 localhost ceph-mgr[286105]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0 Nov 28 04:55:35 localhost ceph-mgr[286105]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0 Nov 28 04:55:35 localhost ceph-mgr[286105]: client.44410 ms_handle_reset on v2:172.18.0.103:3300/0 Nov 28 04:55:35 localhost ceph-mgr[286105]: client.27136 ms_handle_reset on v2:172.18.0.103:3300/0 Nov 28 04:55:35 localhost podman[300605]: 2025-11-28 09:55:35.109335006 +0000 UTC m=+0.141702403 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, architecture=x86_64, container_name=openstack_network_exporter, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, config_id=edpm, name=ubi9-minimal) Nov 28 04:55:35 localhost podman[300605]: 2025-11-28 09:55:35.151411231 +0000 UTC m=+0.183778608 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, architecture=x86_64, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Nov 28 04:55:35 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 04:55:35 localhost ceph-mon[292954]: log_channel(cluster) log [INF] : mon.np0005538513 calling monitor election Nov 28 04:55:35 localhost ceph-mon[292954]: paxos.1).electionLogic(46) init, last seen epoch 46 Nov 28 04:55:35 localhost ceph-mon[292954]: mon.np0005538513@1(electing) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 28 04:55:35 localhost ceph-mon[292954]: mon.np0005538513@1(electing) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 28 04:55:35 localhost ceph-mon[292954]: mon.np0005538513@1(peon) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 28 04:55:35 localhost ceph-mon[292954]: Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)... Nov 28 04:55:35 localhost ceph-mon[292954]: Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain Nov 28 04:55:35 localhost ceph-mon[292954]: Reconfiguring mon.np0005538513 (monmap changed)... Nov 28 04:55:35 localhost ceph-mon[292954]: Reconfiguring daemon mon.np0005538513 on np0005538513.localdomain Nov 28 04:55:35 localhost ceph-mon[292954]: Remove daemons mon.np0005538512 Nov 28 04:55:35 localhost ceph-mon[292954]: Safe to remove mon.np0005538512: new quorum should be ['np0005538515', 'np0005538513', 'np0005538514'] (from ['np0005538515', 'np0005538513', 'np0005538514']) Nov 28 04:55:35 localhost ceph-mon[292954]: Removing monitor np0005538512 from monmap... 
Nov 28 04:55:35 localhost ceph-mon[292954]: Removing daemon mon.np0005538512 from np0005538512.localdomain -- ports [] Nov 28 04:55:35 localhost ceph-mon[292954]: mon.np0005538515 calling monitor election Nov 28 04:55:35 localhost ceph-mon[292954]: mon.np0005538514 calling monitor election Nov 28 04:55:35 localhost ceph-mon[292954]: mon.np0005538513 calling monitor election Nov 28 04:55:35 localhost ceph-mon[292954]: mon.np0005538515 is new leader, mons np0005538515,np0005538513,np0005538514 in quorum (ranks 0,1,2) Nov 28 04:55:35 localhost ceph-mon[292954]: overall HEALTH_OK Nov 28 04:55:35 localhost nova_compute[279673]: 2025-11-28 09:55:35.469 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:55:35 localhost podman[300661]: Nov 28 04:55:35 localhost podman[300661]: 2025-11-28 09:55:35.530788828 +0000 UTC m=+0.099626287 container create 3f86fa0776cae2989140e8c20aff3b0356013035e41a566a7c06a28ddead2e32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_proskuriakova, version=7, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, vendor=Red Hat, Inc., ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, RELEASE=main, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, io.openshift.tags=rhceph ceph) Nov 28 04:55:35 localhost systemd[1]: var-lib-containers-storage-overlay-ad0314c617d127c457a5ff6a4b0215dfdbc82f2da6c7ad2e4a7fde20133ee188-merged.mount: Deactivated successfully. Nov 28 04:55:35 localhost systemd[1]: Started libpod-conmon-3f86fa0776cae2989140e8c20aff3b0356013035e41a566a7c06a28ddead2e32.scope. Nov 28 04:55:35 localhost podman[300661]: 2025-11-28 09:55:35.49314735 +0000 UTC m=+0.061984869 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:55:35 localhost systemd[1]: Started libcrun container. Nov 28 04:55:35 localhost podman[300661]: 2025-11-28 09:55:35.619565351 +0000 UTC m=+0.188402810 container init 3f86fa0776cae2989140e8c20aff3b0356013035e41a566a7c06a28ddead2e32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_proskuriakova, GIT_CLEAN=True, vcs-type=git, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., release=553, version=7, CEPH_POINT_RELEASE=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, ceph=True, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 28 04:55:35 localhost podman[300661]: 
2025-11-28 09:55:35.6292726 +0000 UTC m=+0.198110039 container start 3f86fa0776cae2989140e8c20aff3b0356013035e41a566a7c06a28ddead2e32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_proskuriakova, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, version=7, distribution-scope=public, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, name=rhceph, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True) Nov 28 04:55:35 localhost podman[300661]: 2025-11-28 09:55:35.629539908 +0000 UTC m=+0.198377427 container attach 3f86fa0776cae2989140e8c20aff3b0356013035e41a566a7c06a28ddead2e32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_proskuriakova, build-date=2025-09-24T08:57:55, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, version=7, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, GIT_BRANCH=main, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, 
io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, vcs-type=git, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph) Nov 28 04:55:35 localhost objective_proskuriakova[300676]: 167 167 Nov 28 04:55:35 localhost systemd[1]: libpod-3f86fa0776cae2989140e8c20aff3b0356013035e41a566a7c06a28ddead2e32.scope: Deactivated successfully. Nov 28 04:55:35 localhost podman[300661]: 2025-11-28 09:55:35.633932823 +0000 UTC m=+0.202770312 container died 3f86fa0776cae2989140e8c20aff3b0356013035e41a566a7c06a28ddead2e32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_proskuriakova, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., ceph=True, RELEASE=main, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, distribution-scope=public, version=7, CEPH_POINT_RELEASE=, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, 
name=rhceph, GIT_BRANCH=main) Nov 28 04:55:35 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s Nov 28 04:55:35 localhost podman[300681]: 2025-11-28 09:55:35.747959444 +0000 UTC m=+0.099078152 container remove 3f86fa0776cae2989140e8c20aff3b0356013035e41a566a7c06a28ddead2e32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_proskuriakova, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , GIT_CLEAN=True, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, release=553, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, vcs-type=git, name=rhceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55) Nov 28 04:55:35 localhost systemd[1]: libpod-conmon-3f86fa0776cae2989140e8c20aff3b0356013035e41a566a7c06a28ddead2e32.scope: Deactivated successfully. 
Nov 28 04:55:35 localhost nova_compute[279673]: 2025-11-28 09:55:35.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:55:35 localhost nova_compute[279673]: 2025-11-28 09:55:35.772 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Nov 28 04:55:35 localhost nova_compute[279673]: 2025-11-28 09:55:35.794 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:55:35 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005538514 (monmap changed)... Nov 28 04:55:35 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005538514 (monmap changed)... Nov 28 04:55:35 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain Nov 28 04:55:35 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain Nov 28 04:55:36 localhost systemd[1]: var-lib-containers-storage-overlay-b06601690913dcc22fcbac68e9cdb32bab2dc906ba0d9841942ade18446df865-merged.mount: Deactivated successfully. 
Nov 28 04:55:36 localhost nova_compute[279673]: 2025-11-28 09:55:36.807 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:55:36 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.0 (monmap changed)... Nov 28 04:55:36 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.0 (monmap changed)... Nov 28 04:55:36 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005538514.localdomain Nov 28 04:55:36 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005538514.localdomain Nov 28 04:55:36 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:36 localhost ceph-mon[292954]: Reconfiguring crash.np0005538514 (monmap changed)... 
Nov 28 04:55:36 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:36 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain Nov 28 04:55:36 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:55:36 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:36 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:36 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Nov 28 04:55:36 localhost nova_compute[279673]: 2025-11-28 09:55:36.867 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:55:36 localhost nova_compute[279673]: 2025-11-28 09:55:36.868 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:55:36 localhost nova_compute[279673]: 2025-11-28 09:55:36.868 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:55:36 localhost nova_compute[279673]: 2025-11-28 09:55:36.869 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 04:55:36 localhost nova_compute[279673]: 2025-11-28 09:55:36.869 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:55:37 localhost ceph-mon[292954]: mon.np0005538513@1(peon) e13 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 04:55:37 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/705674516' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 04:55:37 localhost nova_compute[279673]: 2025-11-28 09:55:37.337 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:55:37 localhost nova_compute[279673]: 2025-11-28 09:55:37.416 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:55:37 localhost nova_compute[279673]: 2025-11-28 09:55:37.417 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:55:37 localhost nova_compute[279673]: 2025-11-28 09:55:37.641 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 04:55:37 localhost nova_compute[279673]: 2025-11-28 09:55:37.643 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11749MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": 
"7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 04:55:37 localhost nova_compute[279673]: 2025-11-28 09:55:37.644 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:55:37 localhost nova_compute[279673]: 2025-11-28 09:55:37.644 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:55:37 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s Nov 28 04:55:37 localhost nova_compute[279673]: 2025-11-28 09:55:37.813 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 04:55:37 localhost nova_compute[279673]: 2025-11-28 09:55:37.813 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 04:55:37 localhost nova_compute[279673]: 2025-11-28 09:55:37.814 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 04:55:37 localhost ceph-mon[292954]: Reconfiguring osd.0 (monmap changed)... Nov 28 04:55:37 localhost ceph-mon[292954]: Reconfiguring daemon osd.0 on np0005538514.localdomain Nov 28 04:55:37 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:37 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:37 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.3 (monmap changed)... Nov 28 04:55:37 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.3 (monmap changed)... 
Nov 28 04:55:37 localhost nova_compute[279673]: 2025-11-28 09:55:37.872 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing inventories for resource provider 35fead26-0bad-4950-b646-987079d58a17 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Nov 28 04:55:37 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005538514.localdomain Nov 28 04:55:37 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005538514.localdomain Nov 28 04:55:37 localhost nova_compute[279673]: 2025-11-28 09:55:37.918 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Updating ProviderTree inventory for provider 35fead26-0bad-4950-b646-987079d58a17 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Nov 28 04:55:37 localhost nova_compute[279673]: 2025-11-28 09:55:37.918 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Updating inventory in ProviderTree for provider 35fead26-0bad-4950-b646-987079d58a17 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} 
update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 28 04:55:37 localhost nova_compute[279673]: 2025-11-28 09:55:37.933 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing aggregate associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Nov 28 04:55:37 localhost nova_compute[279673]: 2025-11-28 09:55:37.958 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing trait associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, traits: COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_FMA3,HW_CPU_X86_BMI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_ABM,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AMD_SVM,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX,COMPUTE_STORAGE_
BUS_SCSI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_CLMUL,HW_CPU_X86_BMI2,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Nov 28 04:55:37 localhost nova_compute[279673]: 2025-11-28 09:55:37.993 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:55:38 localhost ceph-mon[292954]: mon.np0005538513@1(peon) e13 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 04:55:38 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4058543900' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 04:55:38 localhost nova_compute[279673]: 2025-11-28 09:55:38.426 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:55:38 localhost nova_compute[279673]: 2025-11-28 09:55:38.433 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 04:55:38 localhost ceph-mgr[286105]: log_channel(audit) log [DBG] : from='client.44470 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538512.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch Nov 28 04:55:38 localhost nova_compute[279673]: 2025-11-28 09:55:38.457 279685 DEBUG 
nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 04:55:38 localhost nova_compute[279673]: 2025-11-28 09:55:38.460 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 04:55:38 localhost nova_compute[279673]: 2025-11-28 09:55:38.460 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.816s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:55:38 localhost ceph-mgr[286105]: [cephadm INFO root] Removed label mon from host np0005538512.localdomain Nov 28 04:55:38 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Removed label mon from host np0005538512.localdomain Nov 28 04:55:38 localhost nova_compute[279673]: 2025-11-28 09:55:38.750 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:55:38 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:38 localhost 
ceph-mon[292954]: Reconfiguring osd.3 (monmap changed)... Nov 28 04:55:38 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:38 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Nov 28 04:55:38 localhost ceph-mon[292954]: Reconfiguring daemon osd.3 on np0005538514.localdomain Nov 28 04:55:38 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:38 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)... Nov 28 04:55:38 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)... Nov 28 04:55:38 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain Nov 28 04:55:38 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain Nov 28 04:55:39 localhost ceph-mon[292954]: mon.np0005538513@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:55:39 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail Nov 28 04:55:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. 
Nov 28 04:55:39 localhost podman[300741]: 2025-11-28 09:55:39.86275089 +0000 UTC m=+0.089131003 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 04:55:39 localhost podman[300741]: 2025-11-28 09:55:39.876680409 +0000 UTC m=+0.103060582 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 28 04:55:39 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 04:55:39 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005538514.djozup (monmap changed)... Nov 28 04:55:39 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005538514.djozup (monmap changed)... Nov 28 04:55:39 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain Nov 28 04:55:39 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain Nov 28 04:55:39 localhost ceph-mon[292954]: Removed label mon from host np0005538512.localdomain Nov 28 04:55:39 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:39 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:39 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:39 localhost ceph-mon[292954]: Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)... 
Nov 28 04:55:39 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:39 localhost ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain Nov 28 04:55:39 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:55:39 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:39 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:39 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:55:40 localhost podman[238687]: time="2025-11-28T09:55:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 04:55:40 localhost podman[238687]: @ - - [28/Nov/2025:09:55:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 28 04:55:40 localhost podman[238687]: @ - - [28/Nov/2025:09:55:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18734 "" "Go-http-client/1.1" Nov 28 04:55:40 localhost nova_compute[279673]: 2025-11-28 09:55:40.471 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:55:40 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005538514 (monmap changed)... 
Nov 28 04:55:40 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005538514 (monmap changed)... Nov 28 04:55:40 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005538514 on np0005538514.localdomain Nov 28 04:55:40 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005538514 on np0005538514.localdomain Nov 28 04:55:41 localhost ceph-mon[292954]: Reconfiguring mgr.np0005538514.djozup (monmap changed)... Nov 28 04:55:41 localhost ceph-mon[292954]: Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain Nov 28 04:55:41 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:41 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:41 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 28 04:55:41 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail Nov 28 04:55:41 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005538515 (monmap changed)... Nov 28 04:55:41 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005538515 (monmap changed)... Nov 28 04:55:41 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain Nov 28 04:55:41 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain Nov 28 04:55:42 localhost ceph-mon[292954]: Reconfiguring mon.np0005538514 (monmap changed)... 
Nov 28 04:55:42 localhost ceph-mon[292954]: Reconfiguring daemon mon.np0005538514 on np0005538514.localdomain Nov 28 04:55:42 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:42 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:42 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:55:42 localhost nova_compute[279673]: 2025-11-28 09:55:42.425 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:55:42 localhost nova_compute[279673]: 2025-11-28 09:55:42.507 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:55:42 localhost nova_compute[279673]: 2025-11-28 09:55:42.507 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 28 04:55:42 localhost nova_compute[279673]: 2025-11-28 09:55:42.508 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 28 04:55:42 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.1 (monmap 
changed)... Nov 28 04:55:42 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.1 (monmap changed)... Nov 28 04:55:42 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005538515.localdomain Nov 28 04:55:42 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005538515.localdomain Nov 28 04:55:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:55:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:55:42 localhost podman[300764]: 2025-11-28 09:55:42.85609974 +0000 UTC m=+0.091426576 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, 
org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:55:42 localhost podman[300765]: 2025-11-28 09:55:42.934456482 +0000 UTC m=+0.167030803 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent) Nov 28 04:55:42 localhost podman[300764]: 2025-11-28 09:55:42.941523439 +0000 UTC m=+0.176850275 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true) Nov 28 04:55:42 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 04:55:42 localhost podman[300765]: 2025-11-28 09:55:42.964832807 +0000 UTC m=+0.197407178 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 04:55:42 localhost systemd[1]: 
ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:55:43 localhost ceph-mon[292954]: Reconfiguring crash.np0005538515 (monmap changed)... Nov 28 04:55:43 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain Nov 28 04:55:43 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:43 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:43 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Nov 28 04:55:43 localhost ceph-mgr[286105]: log_channel(audit) log [DBG] : from='client.44473 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538512.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch Nov 28 04:55:43 localhost ceph-mgr[286105]: [cephadm INFO root] Removed label mgr from host np0005538512.localdomain Nov 28 04:55:43 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Removed label mgr from host np0005538512.localdomain Nov 28 04:55:43 localhost nova_compute[279673]: 2025-11-28 09:55:43.486 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 04:55:43 localhost nova_compute[279673]: 2025-11-28 09:55:43.486 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 04:55:43 localhost nova_compute[279673]: 2025-11-28 09:55:43.486 279685 DEBUG nova.network.neutron [None 
req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 28 04:55:43 localhost nova_compute[279673]: 2025-11-28 09:55:43.487 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 04:55:43 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.4 (monmap changed)... Nov 28 04:55:43 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.4 (monmap changed)... Nov 28 04:55:43 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005538515.localdomain Nov 28 04:55:43 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005538515.localdomain Nov 28 04:55:43 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail Nov 28 04:55:43 localhost nova_compute[279673]: 2025-11-28 09:55:43.753 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:55:44 localhost ceph-mon[292954]: Reconfiguring osd.1 (monmap changed)... 
Nov 28 04:55:44 localhost ceph-mon[292954]: Reconfiguring daemon osd.1 on np0005538515.localdomain Nov 28 04:55:44 localhost ceph-mon[292954]: Removed label mgr from host np0005538512.localdomain Nov 28 04:55:44 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:44 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:44 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:44 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:44 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:44 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Nov 28 04:55:44 localhost ceph-mon[292954]: mon.np0005538513@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:55:44 localhost ceph-mgr[286105]: log_channel(audit) log [DBG] : from='client.44476 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005538512.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch Nov 28 04:55:44 localhost ceph-mgr[286105]: [cephadm INFO root] Removed label _admin from host np0005538512.localdomain Nov 28 04:55:44 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Removed label _admin from host np0005538512.localdomain Nov 28 04:55:44 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)... Nov 28 04:55:44 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)... 
Nov 28 04:55:44 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain Nov 28 04:55:44 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain Nov 28 04:55:45 localhost ceph-mon[292954]: Reconfiguring osd.4 (monmap changed)... Nov 28 04:55:45 localhost ceph-mon[292954]: Reconfiguring daemon osd.4 on np0005538515.localdomain Nov 28 04:55:45 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:45 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:45 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:45 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:45 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:45 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:55:45 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)... Nov 28 04:55:45 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)... 
Nov 28 04:55:45 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain Nov 28 04:55:45 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain Nov 28 04:55:45 localhost nova_compute[279673]: 2025-11-28 09:55:45.517 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:55:45 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail Nov 28 04:55:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 04:55:45 localhost podman[300809]: 2025-11-28 09:55:45.846994483 +0000 UTC m=+0.086681669 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=edpm, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Nov 28 04:55:45 localhost podman[300809]: 2025-11-28 09:55:45.86150018 +0000 UTC m=+0.101187376 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:55:45 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. Nov 28 04:55:46 localhost nova_compute[279673]: 2025-11-28 09:55:46.178 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, 
"preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 04:55:46 localhost nova_compute[279673]: 2025-11-28 09:55:46.198 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 04:55:46 localhost nova_compute[279673]: 2025-11-28 09:55:46.198 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 28 04:55:46 localhost ceph-mon[292954]: Removed label _admin from host np0005538512.localdomain Nov 28 04:55:46 localhost ceph-mon[292954]: Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)... Nov 28 04:55:46 localhost ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain Nov 28 04:55:46 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:46 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:46 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:55:46 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005538515 (monmap changed)... Nov 28 04:55:46 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005538515 (monmap changed)... 
Nov 28 04:55:46 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005538515 on np0005538515.localdomain Nov 28 04:55:46 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005538515 on np0005538515.localdomain Nov 28 04:55:47 localhost ceph-mon[292954]: Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)... Nov 28 04:55:47 localhost ceph-mon[292954]: Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain Nov 28 04:55:47 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:47 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:47 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 28 04:55:47 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:47 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:47 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail Nov 28 04:55:48 localhost openstack_network_exporter[240658]: ERROR 09:55:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 04:55:48 localhost openstack_network_exporter[240658]: ERROR 09:55:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:55:48 localhost openstack_network_exporter[240658]: ERROR 09:55:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:55:48 localhost openstack_network_exporter[240658]: ERROR 09:55:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing 
datapath Nov 28 04:55:48 localhost openstack_network_exporter[240658]: Nov 28 04:55:48 localhost openstack_network_exporter[240658]: ERROR 09:55:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 04:55:48 localhost openstack_network_exporter[240658]: Nov 28 04:55:48 localhost ceph-mon[292954]: Reconfiguring mon.np0005538515 (monmap changed)... Nov 28 04:55:48 localhost ceph-mon[292954]: Reconfiguring daemon mon.np0005538515 on np0005538515.localdomain Nov 28 04:55:48 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Removing np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:55:48 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Removing np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:55:48 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538513.localdomain:/etc/ceph/ceph.conf Nov 28 04:55:48 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538513.localdomain:/etc/ceph/ceph.conf Nov 28 04:55:48 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/etc/ceph/ceph.conf Nov 28 04:55:48 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/etc/ceph/ceph.conf Nov 28 04:55:48 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/etc/ceph/ceph.conf Nov 28 04:55:48 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/etc/ceph/ceph.conf Nov 28 04:55:48 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Removing np0005538512.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:55:48 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Removing np0005538512.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:55:48 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Removing 
np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:55:48 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Removing np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:55:48 localhost nova_compute[279673]: 2025-11-28 09:55:48.755 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:55:49 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:55:49 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:55:49 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:49 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:49 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 04:55:49 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:49 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:49 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:55:49 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:55:49 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating 
np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:55:49 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:55:49 localhost ceph-mon[292954]: mon.np0005538513@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:55:49 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail Nov 28 04:55:49 localhost ceph-mgr[286105]: [volumes INFO mgr_util] scanning for idle connections.. Nov 28 04:55:49 localhost ceph-mgr[286105]: [volumes INFO mgr_util] cleaning up connections: [] Nov 28 04:55:49 localhost ceph-mgr[286105]: [volumes INFO mgr_util] scanning for idle connections.. Nov 28 04:55:49 localhost ceph-mgr[286105]: [volumes INFO mgr_util] cleaning up connections: [] Nov 28 04:55:49 localhost ceph-mgr[286105]: [volumes INFO mgr_util] scanning for idle connections.. Nov 28 04:55:49 localhost ceph-mgr[286105]: [volumes INFO mgr_util] cleaning up connections: [] Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0. 
Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.025943) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25 Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750025997, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 2811, "num_deletes": 255, "total_data_size": 8363170, "memory_usage": 8837936, "flush_reason": "Manual Compaction"} Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750055821, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 5010009, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14885, "largest_seqno": 17691, "table_properties": {"data_size": 4998653, "index_size": 7029, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3269, "raw_key_size": 28983, "raw_average_key_size": 22, "raw_value_size": 4973869, "raw_average_value_size": 3864, "num_data_blocks": 305, "num_entries": 1287, "num_filter_entries": 1287, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323655, "oldest_key_time": 1764323655, "file_creation_time": 1764323750, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}} Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 30031 microseconds, and 11462 cpu microseconds. Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.055972) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 5010009 bytes OK Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.056057) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.058171) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.058196) EVENT_LOG_v1 {"time_micros": 1764323750058187, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.058223) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 8349622, prev total WAL file 
size 8397823, number of live WAL files 2. Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.060849) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130373933' seq:72057594037927935, type:22 .. '7061786F73003131303435' seq:0, type:0; will stop at (end) Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(4892KB)], [24(15MB)] Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750060932, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 20816839, "oldest_snapshot_seqno": -1} Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 11135 keys, 17522050 bytes, temperature: kUnknown Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750191828, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 17522050, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17457469, "index_size": 35680, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27845, "raw_key_size": 297871, "raw_average_key_size": 26, "raw_value_size": 
17266507, "raw_average_value_size": 1550, "num_data_blocks": 1369, "num_entries": 11135, "num_filter_entries": 11135, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764323750, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}} Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.192269) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 17522050 bytes Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.194046) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 158.9 rd, 133.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.8, 15.1 +0.0 blob) out(16.7 +0.0 blob), read-write-amplify(7.7) write-amplify(3.5) OK, records in: 11685, records dropped: 550 output_compression: NoCompression Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.194078) EVENT_LOG_v1 {"time_micros": 1764323750194063, "job": 12, "event": "compaction_finished", "compaction_time_micros": 131028, "compaction_time_cpu_micros": 48343, "output_level": 6, "num_output_files": 1, "total_output_size": 17522050, "num_input_records": 11685, "num_output_records": 11135, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750195320, "job": 12, "event": "table_file_deletion", "file_number": 26} Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750197896, 
"job": 12, "event": "table_file_deletion", "file_number": 24} Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.060713) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.198056) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.198062) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.198065) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.198068) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.198071) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:55:50 localhost ceph-mon[292954]: Removing np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:55:50 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf Nov 28 04:55:50 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf Nov 28 04:55:50 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf Nov 28 04:55:50 localhost ceph-mon[292954]: Removing np0005538512.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:55:50 localhost ceph-mon[292954]: Removing np0005538512.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:55:50 localhost ceph-mon[292954]: Updating 
np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:55:50 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:50 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:50 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:50 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:50 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:50 localhost ceph-mgr[286105]: [progress INFO root] update: starting ev bc8e7913-9bca-40e0-98a8-75add603a1f0 (Updating mgr deployment (-1 -> 3)) Nov 28 04:55:50 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Removing daemon mgr.np0005538512.zyhkxs from np0005538512.localdomain -- ports [8765] Nov 28 04:55:50 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Removing daemon mgr.np0005538512.zyhkxs from np0005538512.localdomain -- ports [8765] Nov 28 04:55:50 localhost nova_compute[279673]: 2025-11-28 09:55:50.520 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0. 
Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.569205) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28 Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750569514, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 324, "num_deletes": 253, "total_data_size": 170427, "memory_usage": 178568, "flush_reason": "Manual Compaction"} Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750573221, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 112846, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17696, "largest_seqno": 18015, "table_properties": {"data_size": 110711, "index_size": 310, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 4968, "raw_average_key_size": 16, "raw_value_size": 106348, "raw_average_value_size": 359, "num_data_blocks": 11, "num_entries": 296, "num_filter_entries": 296, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; 
zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323750, "oldest_key_time": 1764323750, "file_creation_time": 1764323750, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}} Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 4066 microseconds, and 1340 cpu microseconds. Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.573266) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 112846 bytes OK Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.573291) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.575429) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.575453) EVENT_LOG_v1 {"time_micros": 1764323750575446, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.575476) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 168086, prev total WAL file size 168086, number of 
live WAL files 2. Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.576337) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031323935' seq:72057594037927935, type:22 .. '6B760031353439' seq:0, type:0; will stop at (end) Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(110KB)], [27(16MB)] Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750576484, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 17634896, "oldest_snapshot_seqno": -1} Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 10907 keys, 16607555 bytes, temperature: kUnknown Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750670656, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 16607555, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16545728, "index_size": 33438, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27333, "raw_key_size": 294644, "raw_average_key_size": 27, "raw_value_size": 16359919, "raw_average_value_size": 1499, 
"num_data_blocks": 1257, "num_entries": 10907, "num_filter_entries": 10907, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764323750, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}} Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.671089) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 16607555 bytes Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.673332) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 187.1 rd, 176.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 16.7 +0.0 blob) out(15.8 +0.0 blob), read-write-amplify(303.4) write-amplify(147.2) OK, records in: 11431, records dropped: 524 output_compression: NoCompression Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.673401) EVENT_LOG_v1 {"time_micros": 1764323750673375, "job": 14, "event": "compaction_finished", "compaction_time_micros": 94274, "compaction_time_cpu_micros": 51841, "output_level": 6, "num_output_files": 1, "total_output_size": 16607555, "num_input_records": 11431, "num_output_records": 10907, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750673760, "job": 14, "event": "table_file_deletion", "file_number": 29} Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323750677957, 
"job": 14, "event": "table_file_deletion", "file_number": 27} Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.576241) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.678116) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.678125) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.678129) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.678132) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:55:50 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:55:50.678135) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:55:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:55:50.835 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:55:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:55:50.835 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:55:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:55:50.835 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by 
"neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:55:51 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:55:51 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:55:51 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:51 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:51 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail Nov 28 04:55:52 localhost ceph-mon[292954]: Removing daemon mgr.np0005538512.zyhkxs from np0005538512.localdomain -- ports [8765] Nov 28 04:55:52 localhost ceph-mgr[286105]: [cephadm INFO cephadm.services.cephadmservice] Removing key for mgr.np0005538512.zyhkxs Nov 28 04:55:52 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Removing key for mgr.np0005538512.zyhkxs Nov 28 04:55:52 localhost ceph-mgr[286105]: [progress INFO root] complete: finished ev bc8e7913-9bca-40e0-98a8-75add603a1f0 (Updating mgr deployment (-1 -> 3)) Nov 28 04:55:52 localhost ceph-mgr[286105]: [progress INFO root] Completed event bc8e7913-9bca-40e0-98a8-75add603a1f0 (Updating mgr deployment (-1 -> 3)) in 2 seconds Nov 28 04:55:52 localhost ceph-mgr[286105]: [progress INFO root] update: starting ev 30a28b45-2802-4295-ba57-d14320d63229 (Updating node-proxy deployment (+4 -> 4)) Nov 28 04:55:52 localhost ceph-mgr[286105]: [progress INFO root] complete: finished ev 30a28b45-2802-4295-ba57-d14320d63229 (Updating node-proxy deployment (+4 -> 4)) Nov 28 04:55:52 localhost ceph-mgr[286105]: [progress INFO root] Completed event 
30a28b45-2802-4295-ba57-d14320d63229 (Updating node-proxy deployment (+4 -> 4)) in 0 seconds Nov 28 04:55:53 localhost ceph-mon[292954]: Removing key for mgr.np0005538512.zyhkxs Nov 28 04:55:53 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth rm", "entity": "mgr.np0005538512.zyhkxs"} : dispatch Nov 28 04:55:53 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005538512.zyhkxs"}]': finished Nov 28 04:55:53 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:53 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:53 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail Nov 28 04:55:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 04:55:53 localhost nova_compute[279673]: 2025-11-28 09:55:53.782 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:55:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 04:55:53 localhost systemd[1]: tmp-crun.wWQj3p.mount: Deactivated successfully. 
Nov 28 04:55:53 localhost podman[301166]: 2025-11-28 09:55:53.891172074 +0000 UTC m=+0.097315376 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 04:55:53 localhost podman[301166]: 2025-11-28 09:55:53.899870632 +0000 UTC m=+0.106013954 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 04:55:53 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 04:55:53 localhost podman[301167]: 2025-11-28 09:55:53.99825941 +0000 UTC m=+0.200483741 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd) Nov 28 04:55:54 localhost podman[301167]: 2025-11-28 09:55:54.013348035 +0000 UTC m=+0.215572386 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:55:54 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 04:55:54 localhost ceph-mgr[286105]: [progress INFO root] update: starting ev ead8fbf6-91dc-4ccd-979c-c7502b02c89e (Updating node-proxy deployment (+4 -> 4)) Nov 28 04:55:54 localhost ceph-mgr[286105]: [progress INFO root] complete: finished ev ead8fbf6-91dc-4ccd-979c-c7502b02c89e (Updating node-proxy deployment (+4 -> 4)) Nov 28 04:55:54 localhost ceph-mgr[286105]: [progress INFO root] Completed event ead8fbf6-91dc-4ccd-979c-c7502b02c89e (Updating node-proxy deployment (+4 -> 4)) in 0 seconds Nov 28 04:55:54 localhost ceph-mon[292954]: mon.np0005538513@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:55:54 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005538512 (monmap changed)... Nov 28 04:55:54 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005538512 (monmap changed)... Nov 28 04:55:54 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain Nov 28 04:55:54 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain Nov 28 04:55:54 localhost ceph-mgr[286105]: [progress INFO root] Writing back 50 completed events Nov 28 04:55:55 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:55 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:55 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 04:55:55 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:55 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' 
cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538512.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:55:55 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:55 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005538513 (monmap changed)... Nov 28 04:55:55 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005538513 (monmap changed)... Nov 28 04:55:55 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain Nov 28 04:55:55 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain Nov 28 04:55:55 localhost nova_compute[279673]: 2025-11-28 09:55:55.541 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:55:55 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail Nov 28 04:55:56 localhost podman[301278]: Nov 28 04:55:56 localhost podman[301278]: 2025-11-28 09:55:56.114240042 +0000 UTC m=+0.082646145 container create fea14a6e92e77b8cee73ad98bf3b1c33df348d90b04168702cead51bb5f8ba61 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_chebyshev, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, RELEASE=main, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, 
io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.tags=rhceph ceph, name=rhceph, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux ) Nov 28 04:55:56 localhost systemd[1]: Started libpod-conmon-fea14a6e92e77b8cee73ad98bf3b1c33df348d90b04168702cead51bb5f8ba61.scope. Nov 28 04:55:56 localhost podman[301278]: 2025-11-28 09:55:56.081964058 +0000 UTC m=+0.050370221 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:55:56 localhost systemd[1]: Started libcrun container. Nov 28 04:55:56 localhost podman[301278]: 2025-11-28 09:55:56.203383746 +0000 UTC m=+0.171789839 container init fea14a6e92e77b8cee73ad98bf3b1c33df348d90b04168702cead51bb5f8ba61 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_chebyshev, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, RELEASE=main, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, architecture=x86_64, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., 
url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, ceph=True, io.openshift.tags=rhceph ceph, name=rhceph, GIT_CLEAN=True) Nov 28 04:55:56 localhost ceph-mon[292954]: Reconfiguring crash.np0005538512 (monmap changed)... Nov 28 04:55:56 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538512 on np0005538512.localdomain Nov 28 04:55:56 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:56 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:56 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:55:56 localhost podman[301278]: 2025-11-28 09:55:56.217256482 +0000 UTC m=+0.185662576 container start fea14a6e92e77b8cee73ad98bf3b1c33df348d90b04168702cead51bb5f8ba61 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_chebyshev, vcs-type=git, distribution-scope=public, architecture=x86_64, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, 
io.openshift.tags=rhceph ceph, release=553, io.buildah.version=1.33.12, ceph=True, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 28 04:55:56 localhost podman[301278]: 2025-11-28 09:55:56.217541692 +0000 UTC m=+0.185947835 container attach fea14a6e92e77b8cee73ad98bf3b1c33df348d90b04168702cead51bb5f8ba61 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_chebyshev, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , ceph=True, RELEASE=main, GIT_BRANCH=main, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, version=7, architecture=x86_64, vcs-type=git, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12) Nov 28 04:55:56 localhost brave_chebyshev[301293]: 167 167 Nov 28 04:55:56 localhost systemd[1]: libpod-fea14a6e92e77b8cee73ad98bf3b1c33df348d90b04168702cead51bb5f8ba61.scope: Deactivated successfully. 
Nov 28 04:55:56 localhost podman[301278]: 2025-11-28 09:55:56.223209977 +0000 UTC m=+0.191616120 container died fea14a6e92e77b8cee73ad98bf3b1c33df348d90b04168702cead51bb5f8ba61 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_chebyshev, vcs-type=git, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, ceph=True, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, architecture=x86_64, com.redhat.component=rhceph-container, RELEASE=main, name=rhceph, release=553, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 28 04:55:56 localhost ceph-mgr[286105]: log_channel(audit) log [DBG] : from='client.34545 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005538512.localdomain", "target": ["mon-mgr", ""]}]: dispatch Nov 28 04:55:56 localhost ceph-mgr[286105]: [cephadm INFO root] Added label _no_schedule to host np0005538512.localdomain Nov 28 04:55:56 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Added label _no_schedule to host np0005538512.localdomain Nov 28 04:55:56 localhost ceph-mgr[286105]: [cephadm INFO root] Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005538512.localdomain Nov 28 04:55:56 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Added label 
SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005538512.localdomain Nov 28 04:55:56 localhost podman[301298]: 2025-11-28 09:55:56.339203047 +0000 UTC m=+0.101542527 container remove fea14a6e92e77b8cee73ad98bf3b1c33df348d90b04168702cead51bb5f8ba61 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_chebyshev, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_CLEAN=True, release=553, name=rhceph, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main) Nov 28 04:55:56 localhost systemd[1]: libpod-conmon-fea14a6e92e77b8cee73ad98bf3b1c33df348d90b04168702cead51bb5f8ba61.scope: Deactivated successfully. Nov 28 04:55:56 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)... Nov 28 04:55:56 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)... 
Nov 28 04:55:56 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005538513.localdomain Nov 28 04:55:56 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005538513.localdomain Nov 28 04:55:57 localhost podman[301368]: Nov 28 04:55:57 localhost podman[301368]: 2025-11-28 09:55:57.106173145 +0000 UTC m=+0.081938113 container create 73845108f7a435264fbee658551baa3ae2ae1c5256f8041ade21810a50d828e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_proskuriakova, RELEASE=main, GIT_CLEAN=True, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.buildah.version=1.33.12, distribution-scope=public, version=7, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_BRANCH=main) Nov 28 04:55:57 localhost systemd[1]: var-lib-containers-storage-overlay-fc5b0ac95bb28b6f007cdcdd9a002e54754d51b240592b901437450ee73c8796-merged.mount: Deactivated successfully. Nov 28 04:55:57 localhost systemd[1]: Started libpod-conmon-73845108f7a435264fbee658551baa3ae2ae1c5256f8041ade21810a50d828e4.scope. Nov 28 04:55:57 localhost systemd[1]: Started libcrun container. 
Nov 28 04:55:57 localhost podman[301368]: 2025-11-28 09:55:57.073109027 +0000 UTC m=+0.048874045 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:55:57 localhost podman[301368]: 2025-11-28 09:55:57.172515586 +0000 UTC m=+0.148280554 container init 73845108f7a435264fbee658551baa3ae2ae1c5256f8041ade21810a50d828e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_proskuriakova, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, ceph=True, architecture=x86_64, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., release=553, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, GIT_CLEAN=True, io.openshift.expose-services=, build-date=2025-09-24T08:57:55) Nov 28 04:55:57 localhost podman[301368]: 2025-11-28 09:55:57.182905956 +0000 UTC m=+0.158670934 container start 73845108f7a435264fbee658551baa3ae2ae1c5256f8041ade21810a50d828e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_proskuriakova, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, ceph=True, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux , version=7, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 28 04:55:57 localhost podman[301368]: 2025-11-28 09:55:57.183255777 +0000 UTC m=+0.159020795 container attach 73845108f7a435264fbee658551baa3ae2ae1c5256f8041ade21810a50d828e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_proskuriakova, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, ceph=True, vcs-type=git, distribution-scope=public, GIT_CLEAN=True, version=7, build-date=2025-09-24T08:57:55, architecture=x86_64, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 28 
04:55:57 localhost busy_proskuriakova[301383]: 167 167 Nov 28 04:55:57 localhost systemd[1]: libpod-73845108f7a435264fbee658551baa3ae2ae1c5256f8041ade21810a50d828e4.scope: Deactivated successfully. Nov 28 04:55:57 localhost podman[301368]: 2025-11-28 09:55:57.186220468 +0000 UTC m=+0.161985456 container died 73845108f7a435264fbee658551baa3ae2ae1c5256f8041ade21810a50d828e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_proskuriakova, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, architecture=x86_64, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., GIT_BRANCH=main, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.expose-services=, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, ceph=True) Nov 28 04:55:57 localhost ceph-mon[292954]: Reconfiguring crash.np0005538513 (monmap changed)... 
Nov 28 04:55:57 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain Nov 28 04:55:57 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:57 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:57 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:57 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:57 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Nov 28 04:55:57 localhost podman[301388]: 2025-11-28 09:55:57.294477231 +0000 UTC m=+0.096069108 container remove 73845108f7a435264fbee658551baa3ae2ae1c5256f8041ade21810a50d828e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_proskuriakova, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, version=7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , ceph=True, name=rhceph, 
GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_CLEAN=True) Nov 28 04:55:57 localhost systemd[1]: libpod-conmon-73845108f7a435264fbee658551baa3ae2ae1c5256f8041ade21810a50d828e4.scope: Deactivated successfully. Nov 28 04:55:57 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)... Nov 28 04:55:57 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)... Nov 28 04:55:57 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005538513.localdomain Nov 28 04:55:57 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005538513.localdomain Nov 28 04:55:57 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail Nov 28 04:55:57 localhost ceph-mgr[286105]: log_channel(audit) log [DBG] : from='client.44482 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005538512.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch Nov 28 04:55:58 localhost systemd[1]: var-lib-containers-storage-overlay-7595fdbdc5a323040c88f7fd2be76a1f62201296c430db9c745db9b784df7622-merged.mount: Deactivated successfully. 
Nov 28 04:55:58 localhost podman[301464]: Nov 28 04:55:58 localhost podman[301464]: 2025-11-28 09:55:58.224421306 +0000 UTC m=+0.084648427 container create 96333aafbd3ac47ba87c1227d9a9850e17e3be62686ddfaac615e8e6bc25cd7a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_gauss, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.buildah.version=1.33.12, GIT_CLEAN=True, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.openshift.tags=rhceph ceph, name=rhceph, maintainer=Guillaume Abrioux , version=7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, vcs-type=git) Nov 28 04:55:58 localhost ceph-mon[292954]: Added label _no_schedule to host np0005538512.localdomain Nov 28 04:55:58 localhost ceph-mon[292954]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005538512.localdomain Nov 28 04:55:58 localhost ceph-mon[292954]: Reconfiguring osd.2 (monmap changed)... 
Nov 28 04:55:58 localhost ceph-mon[292954]: Reconfiguring daemon osd.2 on np0005538513.localdomain Nov 28 04:55:58 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:58 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:58 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Nov 28 04:55:58 localhost systemd[1]: Started libpod-conmon-96333aafbd3ac47ba87c1227d9a9850e17e3be62686ddfaac615e8e6bc25cd7a.scope. Nov 28 04:55:58 localhost systemd[1]: Started libcrun container. Nov 28 04:55:58 localhost podman[301464]: 2025-11-28 09:55:58.188553891 +0000 UTC m=+0.048781042 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:55:58 localhost podman[301464]: 2025-11-28 09:55:58.297828975 +0000 UTC m=+0.158056086 container init 96333aafbd3ac47ba87c1227d9a9850e17e3be62686ddfaac615e8e6bc25cd7a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_gauss, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vcs-type=git, GIT_CLEAN=True, RELEASE=main, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, 
com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553) Nov 28 04:55:58 localhost podman[301464]: 2025-11-28 09:55:58.306548183 +0000 UTC m=+0.166775304 container start 96333aafbd3ac47ba87c1227d9a9850e17e3be62686ddfaac615e8e6bc25cd7a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_gauss, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., RELEASE=main, io.buildah.version=1.33.12, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, release=553, io.openshift.expose-services=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True) Nov 28 04:55:58 localhost podman[301464]: 2025-11-28 09:55:58.308913196 +0000 UTC m=+0.169140317 container attach 96333aafbd3ac47ba87c1227d9a9850e17e3be62686ddfaac615e8e6bc25cd7a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_gauss, GIT_BRANCH=main, GIT_CLEAN=True, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, version=7, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, release=553, io.openshift.expose-services=, com.redhat.component=rhceph-container, architecture=x86_64, RELEASE=main, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, ceph=True, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 28 04:55:58 localhost gracious_gauss[301479]: 167 167 Nov 28 04:55:58 localhost systemd[1]: libpod-96333aafbd3ac47ba87c1227d9a9850e17e3be62686ddfaac615e8e6bc25cd7a.scope: Deactivated successfully. Nov 28 04:55:58 localhost podman[301464]: 2025-11-28 09:55:58.315173409 +0000 UTC m=+0.175400550 container died 96333aafbd3ac47ba87c1227d9a9850e17e3be62686ddfaac615e8e6bc25cd7a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_gauss, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, RELEASE=main, io.openshift.tags=rhceph ceph, name=rhceph, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, ceph=True, vcs-type=git, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , 
com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, com.redhat.component=rhceph-container) Nov 28 04:55:58 localhost podman[301484]: 2025-11-28 09:55:58.415951191 +0000 UTC m=+0.086968468 container remove 96333aafbd3ac47ba87c1227d9a9850e17e3be62686ddfaac615e8e6bc25cd7a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_gauss, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, CEPH_POINT_RELEASE=, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, architecture=x86_64, name=rhceph, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , distribution-scope=public, io.buildah.version=1.33.12, version=7, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True) Nov 28 04:55:58 localhost systemd[1]: libpod-conmon-96333aafbd3ac47ba87c1227d9a9850e17e3be62686ddfaac615e8e6bc25cd7a.scope: Deactivated successfully. Nov 28 04:55:58 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)... Nov 28 04:55:58 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)... 
Nov 28 04:55:58 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain Nov 28 04:55:58 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain Nov 28 04:55:58 localhost nova_compute[279673]: 2025-11-28 09:55:58.785 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:55:59 localhost systemd[1]: var-lib-containers-storage-overlay-231a4da403ed17711b8b8d29e538ebab04bb8a7a203305a4da0c06f7368039e2-merged.mount: Deactivated successfully. Nov 28 04:55:59 localhost ceph-mgr[286105]: log_channel(audit) log [DBG] : from='client.44488 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005538512.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch Nov 28 04:55:59 localhost ceph-mgr[286105]: [cephadm INFO root] Removed host np0005538512.localdomain Nov 28 04:55:59 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Removed host np0005538512.localdomain Nov 28 04:55:59 localhost ceph-mon[292954]: Reconfiguring osd.5 (monmap changed)... 
Nov 28 04:55:59 localhost ceph-mon[292954]: Reconfiguring daemon osd.5 on np0005538513.localdomain Nov 28 04:55:59 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:59 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:59 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:55:59 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:55:59 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain"} : dispatch Nov 28 04:55:59 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain"}]': finished Nov 28 04:55:59 localhost podman[301560]: Nov 28 04:55:59 localhost podman[301560]: 2025-11-28 09:55:59.351770586 +0000 UTC m=+0.082601473 container create 393965a305174173322cb6c83e4accb04ae48857ac867fd4e40053893edb59ca (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_gould, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, description=Red Hat Ceph Storage 7, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, GIT_CLEAN=True, 
io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, ceph=True, architecture=x86_64, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.openshift.expose-services=) Nov 28 04:55:59 localhost systemd[1]: Started libpod-conmon-393965a305174173322cb6c83e4accb04ae48857ac867fd4e40053893edb59ca.scope. Nov 28 04:55:59 localhost systemd[1]: tmp-crun.2t8p8J.mount: Deactivated successfully. Nov 28 04:55:59 localhost podman[301560]: 2025-11-28 09:55:59.317540083 +0000 UTC m=+0.048371010 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:55:59 localhost systemd[1]: Started libcrun container. Nov 28 04:55:59 localhost podman[301560]: 2025-11-28 09:55:59.447637588 +0000 UTC m=+0.178468485 container init 393965a305174173322cb6c83e4accb04ae48857ac867fd4e40053893edb59ca (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_gould, GIT_CLEAN=True, architecture=x86_64, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, version=7, com.redhat.component=rhceph-container, RELEASE=main, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, vendor=Red Hat, Inc., 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.openshift.expose-services=, release=553, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 28 04:55:59 localhost keen_gould[301576]: 167 167 Nov 28 04:55:59 localhost podman[301560]: 2025-11-28 09:55:59.462111783 +0000 UTC m=+0.192942680 container start 393965a305174173322cb6c83e4accb04ae48857ac867fd4e40053893edb59ca (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_gould, ceph=True, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, GIT_CLEAN=True, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, build-date=2025-09-24T08:57:55, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, release=553, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, com.redhat.component=rhceph-container) Nov 28 04:55:59 localhost podman[301560]: 2025-11-28 09:55:59.462546916 +0000 UTC m=+0.193377833 container attach 393965a305174173322cb6c83e4accb04ae48857ac867fd4e40053893edb59ca (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_gould, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, 
io.openshift.tags=rhceph ceph, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, distribution-scope=public, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, architecture=x86_64, vcs-type=git, GIT_BRANCH=main, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_CLEAN=True, maintainer=Guillaume Abrioux ) Nov 28 04:55:59 localhost systemd[1]: libpod-393965a305174173322cb6c83e4accb04ae48857ac867fd4e40053893edb59ca.scope: Deactivated successfully. Nov 28 04:55:59 localhost podman[301560]: 2025-11-28 09:55:59.465556559 +0000 UTC m=+0.196387496 container died 393965a305174173322cb6c83e4accb04ae48857ac867fd4e40053893edb59ca (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_gould, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, maintainer=Guillaume Abrioux , RELEASE=main, build-date=2025-09-24T08:57:55, 
CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, description=Red Hat Ceph Storage 7, architecture=x86_64) Nov 28 04:55:59 localhost ceph-mon[292954]: mon.np0005538513@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:55:59 localhost podman[301581]: 2025-11-28 09:55:59.564939688 +0000 UTC m=+0.091514228 container remove 393965a305174173322cb6c83e4accb04ae48857ac867fd4e40053893edb59ca (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_gould, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, version=7, architecture=x86_64, io.buildah.version=1.33.12, GIT_BRANCH=main, maintainer=Guillaume Abrioux , GIT_CLEAN=True, io.openshift.tags=rhceph ceph, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.openshift.expose-services=, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 28 04:55:59 localhost systemd[1]: libpod-conmon-393965a305174173322cb6c83e4accb04ae48857ac867fd4e40053893edb59ca.scope: Deactivated successfully. Nov 28 04:55:59 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)... 
Nov 28 04:55:59 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)... Nov 28 04:55:59 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain Nov 28 04:55:59 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain Nov 28 04:55:59 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail Nov 28 04:56:00 localhost systemd[1]: var-lib-containers-storage-overlay-f6da1730271c3af938e9920c5df8939128f7e8ba8af02d0cf3c55ccaa74fd8a5-merged.mount: Deactivated successfully. Nov 28 04:56:00 localhost ceph-mon[292954]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)... Nov 28 04:56:00 localhost ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain Nov 28 04:56:00 localhost ceph-mon[292954]: Removed host np0005538512.localdomain Nov 28 04:56:00 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:00 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:00 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:56:00 localhost podman[301653]: Nov 28 04:56:00 localhost podman[301653]: 2025-11-28 09:56:00.290941945 +0000 UTC m=+0.082654305 container create 35f03cdf8fc9abebe7dc6819d9c8c0cd5e0cc8d4749bd9f39714ff00353e110c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_keldysh, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, ceph=True, 
description=Red Hat Ceph Storage 7, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, name=rhceph, io.openshift.tags=rhceph ceph, RELEASE=main, maintainer=Guillaume Abrioux , io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, version=7, GIT_CLEAN=True, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container) Nov 28 04:56:00 localhost systemd[1]: Started libpod-conmon-35f03cdf8fc9abebe7dc6819d9c8c0cd5e0cc8d4749bd9f39714ff00353e110c.scope. Nov 28 04:56:00 localhost systemd[1]: Started libcrun container. 
Nov 28 04:56:00 localhost podman[301653]: 2025-11-28 09:56:00.355913485 +0000 UTC m=+0.147625835 container init 35f03cdf8fc9abebe7dc6819d9c8c0cd5e0cc8d4749bd9f39714ff00353e110c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_keldysh, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, description=Red Hat Ceph Storage 7, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 28 04:56:00 localhost podman[301653]: 2025-11-28 09:56:00.258372103 +0000 UTC m=+0.050084493 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:56:00 localhost podman[301653]: 2025-11-28 09:56:00.367116 +0000 UTC m=+0.158828350 container start 35f03cdf8fc9abebe7dc6819d9c8c0cd5e0cc8d4749bd9f39714ff00353e110c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_keldysh, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, vcs-type=git, 
io.buildah.version=1.33.12, version=7, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, maintainer=Guillaume Abrioux ) Nov 28 04:56:00 localhost podman[301653]: 2025-11-28 09:56:00.36745741 +0000 UTC m=+0.159169820 container attach 35f03cdf8fc9abebe7dc6819d9c8c0cd5e0cc8d4749bd9f39714ff00353e110c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_keldysh, com.redhat.license_terms=https://www.redhat.com/agreements, maintainer=Guillaume Abrioux , architecture=x86_64, io.openshift.expose-services=, name=rhceph, vendor=Red Hat, Inc., version=7, CEPH_POINT_RELEASE=, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, vcs-type=git, RELEASE=main, ceph=True, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, GIT_BRANCH=main, com.redhat.component=rhceph-container, GIT_CLEAN=True) Nov 28 04:56:00 
localhost reverent_keldysh[301668]: 167 167 Nov 28 04:56:00 localhost systemd[1]: libpod-35f03cdf8fc9abebe7dc6819d9c8c0cd5e0cc8d4749bd9f39714ff00353e110c.scope: Deactivated successfully. Nov 28 04:56:00 localhost podman[301653]: 2025-11-28 09:56:00.370389801 +0000 UTC m=+0.162102151 container died 35f03cdf8fc9abebe7dc6819d9c8c0cd5e0cc8d4749bd9f39714ff00353e110c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_keldysh, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, vcs-type=git, version=7, architecture=x86_64, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, ceph=True, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vendor=Red Hat, Inc., distribution-scope=public) Nov 28 04:56:00 localhost podman[301673]: 2025-11-28 09:56:00.470623946 +0000 UTC m=+0.087801444 container remove 35f03cdf8fc9abebe7dc6819d9c8c0cd5e0cc8d4749bd9f39714ff00353e110c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_keldysh, vcs-type=git, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , 
distribution-scope=public, GIT_CLEAN=True, name=rhceph, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, ceph=True, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, version=7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, architecture=x86_64) Nov 28 04:56:00 localhost systemd[1]: libpod-conmon-35f03cdf8fc9abebe7dc6819d9c8c0cd5e0cc8d4749bd9f39714ff00353e110c.scope: Deactivated successfully. Nov 28 04:56:00 localhost nova_compute[279673]: 2025-11-28 09:56:00.545 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:56:00 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005538513 (monmap changed)... Nov 28 04:56:00 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005538513 (monmap changed)... 
Nov 28 04:56:00 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005538513 on np0005538513.localdomain Nov 28 04:56:00 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005538513 on np0005538513.localdomain Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.674 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.675 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.683 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c8672ca8-2bfe-4198-9b89-b82400139bfa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:56:00.675771', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '71bfa2aa-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.847690972, 'message_signature': '1381f9ee9f180cd518866b21ccaa0d264db0a2692c9b27d1322bd221b1f77923'}]}, 'timestamp': '2025-11-28 09:56:00.684765', '_unique_id': 'c5a52d0288ab4bb3bcb37a5671f66689'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:56:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging Nov 28 04:56:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.686 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.689 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.689 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '29d69ffb-fa2e-4af3-be4a-21d4c790122e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:56:00.689248', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '71c06db6-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.847690972, 'message_signature': 'd877a18a41cacbe22023d32ff1bb72cd24812deafb604f217c46245dcdf4cbb6'}]}, 'timestamp': '2025-11-28 09:56:00.689777', '_unique_id': '2c8bdea2785b4c399e8c503846e12e13'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.690 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.691 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.692 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.692 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '079acd19-ad03-4e54-921e-f51594fa81c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:56:00.692265', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '71c0e7be-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.847690972, 'message_signature': 'bb4fc6737fc1453c80827ae7341f3995c4ea246ce72816ff8c4dbf3c011639f0'}]}, 'timestamp': '2025-11-28 09:56:00.693005', '_unique_id': '9789b1e25af94b1bb0dbe6db75bcc569'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.694 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.696 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.726 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.727 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '417b97e0-0b20-4e0e-86f9-f7e3e6336127', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:56:00.696359', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '71c6212a-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.868292396, 'message_signature': '719c0ac34d0afc92609cc2cbd84945bcbe27d9515a5ea20bf3d7d2f40d1be22b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:56:00.696359', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '71c636ba-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.868292396, 'message_signature': '04b91b41dfd6bdf8bdd3b742a2dd9242cf6df20cdba3cbb10f346bcbe65ea8f5'}]}, 'timestamp': '2025-11-28 09:56:00.727640', '_unique_id': 'ed07a52a8f2646db89a86e17e6a7b089'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]:
2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:56:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.729 12 ERROR oslo_messaging.notify.messaging Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.730 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.730 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.730 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd889f451-bac0-450c-89e0-6f545fb2be7d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:56:00.730579', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '71c6bb8a-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.847690972, 'message_signature': 'b8c04f156447b41295c26343a9d6583672163fbe5e31f92b53c942813b74c64b'}]}, 'timestamp': '2025-11-28 09:56:00.731114', '_unique_id': 'b944c1ea84984c31b64c24d1ee9dd86a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:56:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging Nov 28 04:56:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:56:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.732 12 ERROR oslo_messaging.notify.messaging Nov 28 04:56:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.733 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.733 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.733 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b73822c3-7f2a-4922-82e9-6ce4c7947fb8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:56:00.733332', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '71c727a0-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.868292396, 'message_signature': '0c2599de0a48f639bce76d2f57d0d000993c3cc307a1424b5ce32e2a3666284a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:56:00.733332', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '71c73a06-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.868292396, 'message_signature': '90bc81385ed49376ef3d62f2367f7fb61a6c05a54bb0f744b0d46ce153e4ae55'}]}, 'timestamp': '2025-11-28 09:56:00.734278', '_unique_id': '8e5113de4edd4c91b588721dc801d44e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:56:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.735 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.736 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.736 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5db8afb2-bce6-4c5b-b750-92b37e331113', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:56:00.736519', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '71c7a374-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.847690972, 'message_signature': 'bf5c974cd34344924ed5e530b6efa4cc2c6007a8d9cc85b634d0e5bc9c34d5c4'}]}, 'timestamp': '2025-11-28 09:56:00.736989', '_unique_id': '2707bb6ff99342dbbce68374ecdc3941'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.738 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.739 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.740 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '62690b0e-7d47-4602-a073-dd20c49003c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:56:00.740054', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '71c83136-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.847690972, 'message_signature': 'bb20c579a59ac6a6ffd3c54f9f47f8ff5a70e87358b236791686139bd0f7d0f6'}]}, 'timestamp': '2025-11-28 09:56:00.740699', '_unique_id': '08d4bfb399b144f7a6b4a7f28365dee4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.741 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.742 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.742 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.743 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec9249ce-6ed4-4cb5-a49d-3b9f0ad816fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:56:00.742955', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '71c8a0d0-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.868292396, 'message_signature': 'b9188261aef4f7dd577ddd89c9d9eb62a16eac2a2f88dba7f540f9cd2acf7e55'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:56:00.742955', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '71c8b19c-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.868292396, 'message_signature': '4e54b3469ae75115f7e953105eb6bc7e0f72f264e609fb1c9cb1b4a7a9093e3a'}]}, 'timestamp': '2025-11-28 09:56:00.743877', '_unique_id': 'becb27e6c51e470dbaf7c444f2eeaad9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]:
2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:56:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.744 12 ERROR oslo_messaging.notify.messaging Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.746 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.757 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.757 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0ecae124-d048-44e5-882e-2415693e5b92', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:56:00.746257', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '71cac69e-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.918185332, 'message_signature': 'd49987c2714f60e0930352bd31ce754bbbce5c56721c2ef017ec58eb4f6e84be'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:56:00.746257', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 
'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '71cadbc0-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.918185332, 'message_signature': '4a36a4ce94a7ad8f121ce1fb38357e3a76e88a2200ecc8ce04cee1d4ea183d5d'}]}, 'timestamp': '2025-11-28 09:56:00.758165', '_unique_id': '72b10f638c1a45e09a9a822f3247f7db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:56:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:56:00.759 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:56:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:56:00.759 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.759 12 ERROR oslo_messaging.notify.messaging Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.760 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.760 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.761 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '51af6fdc-5ac6-4881-b28b-4b5b048fe786', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:56:00.760642', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '71cb5172-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.918185332, 'message_signature': '4ce3f194f48669f1a5a68183c3ace784d3498f93841b6007be7a695289136163'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:56:00.760642', 
'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '71cb62d4-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.918185332, 'message_signature': '6a32f62f17747bb01df235329e3ce777a5593990cec3d0675bc30478996f6d8f'}]}, 'timestamp': '2025-11-28 09:56:00.761520', '_unique_id': '1e6a25cefda94bc4b59a478cf721c06d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:56:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.762 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.763 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.764 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e3c51377-1839-4abb-b441-28093050ea6b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:56:00.763973', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '71cbd642-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.847690972, 'message_signature': '14ffe9ef11f139315dcdf8b59cc85386e8d6132cd6ac638fe64e8c22efc2f9ea'}]}, 'timestamp': '2025-11-28 09:56:00.764509', '_unique_id': 'd912833165714c62a41f3e98b4f8f685'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.765 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.766 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.767 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6296e3b3-a986-45a1-8f07-1339e63ca2eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:56:00.767077', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '71cc4d02-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.847690972, 'message_signature': 'e917e7838491bc7e98c3bbd88d1feb36fdbd001a1a404ecf748341caba777e5d'}]}, 'timestamp': '2025-11-28 09:56:00.767542', '_unique_id': '20a445f1c6ba4f75a453f441b34824ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.768 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.769 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.785 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 13960000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'faf90230-45c5-45a2-a5c7-2823656838de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13960000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:56:00.769731', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '71cf2626-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.957483872, 'message_signature': '6db3c35543e5031b2b5a030bc7e5907132a17a5b2824cf2a335b2868b6b3bff8'}]}, 'timestamp': '2025-11-28 09:56:00.786239', '_unique_id': '283f352c3e7440b09ad933fcc347c3f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]:
2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 
04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:56:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.787 12 ERROR oslo_messaging.notify.messaging Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.788 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 28 04:56:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.788 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1439344424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.788 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 63315481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50f9bf05-2f28-4669-b916-f9aa6ba2f0b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1439344424, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:56:00.788474', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '71cf9066-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.868292396, 'message_signature': 'ffc7d3f51095009cfa33acede44b014126490cf3945a353191ea033d12a793ad'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63315481, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:56:00.788474', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '71cfa1aa-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.868292396, 'message_signature': '1f622b0f048745fad78b18fe5a6d4f4d1d3d8931cd016bb8c38aa72eafbe2b89'}]}, 'timestamp': '2025-11-28 09:56:00.789342', '_unique_id': '36615011e02b47309400d682c242a77c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in 
establish_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:56:00.790 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.790 12 ERROR oslo_messaging.notify.messaging Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.791 12 INFO 
ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.791 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '43a82553-8ff4-4308-aaf0-e4c161faf420', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:56:00.791504', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 
'tap09612b07-51'}, 'message_id': '71d006fe-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.847690972, 'message_signature': 'eb42d5ffe3952a9ee5665c9e0865e65323c16315392a176a23b20510da09f738'}]}, 'timestamp': '2025-11-28 09:56:00.791964', '_unique_id': '76699a81515a430d8c163dba9492af35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.792 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.794 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.794 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '38a076a1-e3e4-498c-92b2-f0130f60f19e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:56:00.794179', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '71d06f72-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.847690972, 'message_signature': '63c2657b9ab1e2a13dc2e3a3026638de721fb4c9978934cb3b56f74cab8a809f'}]}, 'timestamp': '2025-11-28 09:56:00.794637', '_unique_id': '645898cde0c84ee8b9fd4d304b33fb60'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.795 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.796 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.796 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.796 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.797 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.797 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '22d9c47e-d5f4-4208-b28b-bde2597ffd6a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:56:00.796983', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '71d0df98-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.868292396, 'message_signature': '431cabc1d3fd772916ce6a81660f9ea2b36e9b5d97f5eaffb1d8d421696353db'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:56:00.796983', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '71d0f104-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.868292396, 'message_signature': '166e816bf70904594a8e7699048691dcb3537081a44bf5a860f7a04beb921389'}]}, 'timestamp': '2025-11-28 09:56:00.797928', '_unique_id': '348c65bce4dc4192bfe66886fd49981f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.798 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.800 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.800 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 51.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3335da16-7e15-45f5-b812-030d3b9eac5e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.671875, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:56:00.800363', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '71d160ee-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.957483872, 'message_signature': 'ddc195e1da2d4fa07bb8b005113bcf6b069578c09e992ed8a94b53fc964eb774'}]}, 'timestamp': '2025-11-28 09:56:00.800802', '_unique_id': '9ecad9bcdbb1495983b8733c3d60996b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]:
2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 
04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:56:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.801 12 ERROR oslo_messaging.notify.messaging Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.802 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 28 04:56:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.802 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.803 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b792b580-fce9-4139-a21d-779f7f719d14', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:56:00.802899', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 
'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '71d1c55c-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.918185332, 'message_signature': 'dc1cf2736cae55cee781181150ecfa36221b8bb47e56304a956b0abfe63f6f76'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:56:00.802899', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '71d1d8f8-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.918185332, 'message_signature': '30b7315e936d100a6e962ad6d4a2bb6dc2823aa830c7b20d106cb19f15f2bb57'}]}, 'timestamp': '2025-11-28 09:56:00.803871', '_unique_id': '3e67fb70af3645abaa36ac38b98d9aa6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:56:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.804 12 ERROR oslo_messaging.notify.messaging Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.805 12 INFO ceilometer.polling.manager [-] Polling pollster 
disk.device.write.latency in the context of pollsters Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.806 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 1124743927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.806 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 22218411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a5d36ac1-5cd9-482b-be3e-1f523eac02d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1124743927, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:56:00.806111', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 
'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '71d24162-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.868292396, 'message_signature': '3e09a921ae54408ce936355e7a70f2fd17191780d9f56b52928b41fa8964b32d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22218411, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:56:00.806111', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '71d25166-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11594.868292396, 'message_signature': '3e915869eb93bbde8a31e301c83b22406fdc514cc0c7360ece9abf4351fb4e78'}]}, 'timestamp': '2025-11-28 09:56:00.806945', '_unique_id': '07dcdbac0b1f40fc8b1a7833c411f128'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:56:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging Nov 28 04:56:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:56:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:56:00.807 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:56:01 localhost systemd[1]:
var-lib-containers-storage-overlay-bdd1a8ec4da179b12eb78cb1acf0e6f146d77d795937468857b0b018f40afe66-merged.mount: Deactivated successfully. Nov 28 04:56:01 localhost ceph-mon[292954]: Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)... Nov 28 04:56:01 localhost ceph-mon[292954]: Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain Nov 28 04:56:01 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:01 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:01 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 28 04:56:01 localhost podman[301744]: Nov 28 04:56:01 localhost podman[301744]: 2025-11-28 09:56:01.304000078 +0000 UTC m=+0.084892254 container create d8da87e4903d5f8010d002c64182990ca3d02637d11562898bfbbc3af66513b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_diffie, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, version=7, ceph=True, GIT_BRANCH=main, 
com.redhat.component=rhceph-container, architecture=x86_64, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55) Nov 28 04:56:01 localhost systemd[1]: Started libpod-conmon-d8da87e4903d5f8010d002c64182990ca3d02637d11562898bfbbc3af66513b1.scope. Nov 28 04:56:01 localhost systemd[1]: Started libcrun container. Nov 28 04:56:01 localhost podman[301744]: 2025-11-28 09:56:01.271755256 +0000 UTC m=+0.052647472 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:56:01 localhost podman[301744]: 2025-11-28 09:56:01.376713166 +0000 UTC m=+0.157605362 container init d8da87e4903d5f8010d002c64182990ca3d02637d11562898bfbbc3af66513b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_diffie, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, release=553, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, RELEASE=main, build-date=2025-09-24T08:57:55) Nov 28 04:56:01 localhost podman[301744]: 2025-11-28 09:56:01.384915298 +0000 UTC m=+0.165807494 container start d8da87e4903d5f8010d002c64182990ca3d02637d11562898bfbbc3af66513b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_diffie, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-09-24T08:57:55, release=553, CEPH_POINT_RELEASE=, architecture=x86_64, GIT_BRANCH=main, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, name=rhceph, vendor=Red Hat, Inc., io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , GIT_CLEAN=True, version=7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 28 04:56:01 localhost podman[301744]: 2025-11-28 09:56:01.385445775 +0000 UTC m=+0.166337981 container attach d8da87e4903d5f8010d002c64182990ca3d02637d11562898bfbbc3af66513b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_diffie, architecture=x86_64, build-date=2025-09-24T08:57:55, release=553, io.openshift.expose-services=, GIT_BRANCH=main, name=rhceph, GIT_CLEAN=True, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Guillaume Abrioux , 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 28 04:56:01 localhost goofy_diffie[301760]: 167 167 Nov 28 04:56:01 localhost systemd[1]: libpod-d8da87e4903d5f8010d002c64182990ca3d02637d11562898bfbbc3af66513b1.scope: Deactivated successfully. Nov 28 04:56:01 localhost podman[301744]: 2025-11-28 09:56:01.396397122 +0000 UTC m=+0.177289338 container died d8da87e4903d5f8010d002c64182990ca3d02637d11562898bfbbc3af66513b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_diffie, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , GIT_CLEAN=True, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, name=rhceph, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, RELEASE=main, release=553, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git) Nov 28 04:56:01 localhost podman[301765]: 2025-11-28 09:56:01.509724631 +0000 UTC m=+0.108426119 container remove d8da87e4903d5f8010d002c64182990ca3d02637d11562898bfbbc3af66513b1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, 
name=goofy_diffie, CEPH_POINT_RELEASE=, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, distribution-scope=public, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.component=rhceph-container, release=553, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, vcs-type=git) Nov 28 04:56:01 localhost systemd[1]: libpod-conmon-d8da87e4903d5f8010d002c64182990ca3d02637d11562898bfbbc3af66513b1.scope: Deactivated successfully. 
Nov 28 04:56:01 localhost ceph-mgr[286105]: [progress INFO root] update: starting ev c06c889e-5b25-4013-b3a0-99a9e621a0d9 (Updating node-proxy deployment (+3 -> 3))
Nov 28 04:56:01 localhost ceph-mgr[286105]: [progress INFO root] complete: finished ev c06c889e-5b25-4013-b3a0-99a9e621a0d9 (Updating node-proxy deployment (+3 -> 3))
Nov 28 04:56:01 localhost ceph-mgr[286105]: [progress INFO root] Completed event c06c889e-5b25-4013-b3a0-99a9e621a0d9 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 28 04:56:01 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 04:56:02 localhost systemd[1]: var-lib-containers-storage-overlay-ffd3a3fc7af74570d5ddb6c58d75a1dcc063971da9c291d0dfb9cd81e848f7aa-merged.mount: Deactivated successfully.
Nov 28 04:56:02 localhost ceph-mon[292954]: Reconfiguring mon.np0005538513 (monmap changed)...
Nov 28 04:56:02 localhost ceph-mon[292954]: Reconfiguring daemon mon.np0005538513 on np0005538513.localdomain
Nov 28 04:56:02 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:02 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:02 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 04:56:02 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:03 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 04:56:03 localhost nova_compute[279673]: 2025-11-28 09:56:03.824 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:56:04 localhost
ceph-mon[292954]: mon.np0005538513@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:56:04 localhost ceph-mgr[286105]: [progress INFO root] Writing back 50 completed events Nov 28 04:56:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 04:56:05 localhost podman[301799]: 2025-11-28 09:56:05.489572654 +0000 UTC m=+0.083915874 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.openshift.tags=minimal rhel9, config_id=edpm, io.buildah.version=1.33.7, release=1755695350, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.) 
Nov 28 04:56:05 localhost podman[301799]: 2025-11-28 09:56:05.505458213 +0000 UTC m=+0.099801433 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=edpm, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible, maintainer=Red Hat, Inc.) Nov 28 04:56:05 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 04:56:05 localhost nova_compute[279673]: 2025-11-28 09:56:05.576 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:56:05 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 04:56:06 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:07 localhost ceph-mgr[286105]: log_channel(audit) log [DBG] : from='client.44494 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 04:56:07 localhost ceph-mgr[286105]: [cephadm INFO root] Saving service mon spec with placement label:mon
Nov 28 04:56:07 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Saving service mon spec with placement label:mon
Nov 28 04:56:07 localhost ceph-mgr[286105]: [progress INFO root] update: starting ev 18aba204-dae4-489c-a558-bf8f9a9db726 (Updating node-proxy deployment (+3 -> 3))
Nov 28 04:56:07 localhost ceph-mgr[286105]: [progress INFO root] complete: finished ev 18aba204-dae4-489c-a558-bf8f9a9db726 (Updating node-proxy deployment (+3 -> 3))
Nov 28 04:56:07 localhost ceph-mgr[286105]: [progress INFO root] Completed event 18aba204-dae4-489c-a558-bf8f9a9db726 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 28 04:56:07 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 04:56:08 localhost ceph-mon[292954]: Saving service mon spec with placement label:mon
Nov 28 04:56:08 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:08 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 04:56:08 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:08 localhost nova_compute[279673]: 2025-11-28 09:56:08.826 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:56:09 localhost ceph-mgr[286105]: log_channel(audit) log [DBG] : from='client.44500 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005538515", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Nov 28 04:56:09 localhost ceph-mon[292954]: mon.np0005538513@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 04:56:09 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 04:56:10 localhost podman[238687]: time="2025-11-28T09:56:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 04:56:10 localhost podman[238687]: @ - - [28/Nov/2025:09:56:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 28 04:56:10 localhost podman[238687]: @ - - [28/Nov/2025:09:56:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18735 "" "Go-http-client/1.1"
Nov 28 04:56:10 localhost ceph-mgr[286105]: [progress INFO root] Writing back 50 completed events
Nov 28 04:56:10 localhost nova_compute[279673]: 2025-11-28 09:56:10.578 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:56:10 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:10 localhost ceph-mgr[286105]: log_channel(audit) log [DBG] :
from='client.44506 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005538515"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Nov 28 04:56:10 localhost ceph-mgr[286105]: [cephadm INFO root] Remove daemons mon.np0005538515
Nov 28 04:56:10 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Remove daemons mon.np0005538515
Nov 28 04:56:10 localhost ceph-mgr[286105]: [cephadm INFO cephadm.services.cephadmservice] Safe to remove mon.np0005538515: new quorum should be ['np0005538513', 'np0005538514'] (from ['np0005538513', 'np0005538514'])
Nov 28 04:56:10 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Safe to remove mon.np0005538515: new quorum should be ['np0005538513', 'np0005538514'] (from ['np0005538513', 'np0005538514'])
Nov 28 04:56:10 localhost ceph-mgr[286105]: [cephadm INFO cephadm.services.cephadmservice] Removing monitor np0005538515 from monmap...
Nov 28 04:56:10 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Removing monitor np0005538515 from monmap...
Nov 28 04:56:10 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Removing daemon mon.np0005538515 from np0005538515.localdomain -- ports []
Nov 28 04:56:10 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Removing daemon mon.np0005538515 from np0005538515.localdomain -- ports []
Nov 28 04:56:10 localhost ceph-mon[292954]: mon.np0005538513@1(peon) e14 my rank is now 0 (was 1)
Nov 28 04:56:10 localhost ceph-mgr[286105]: client.44410 ms_handle_reset on v2:172.18.0.103:3300/0
Nov 28 04:56:10 localhost ceph-mon[292954]: log_channel(cluster) log [INF] : mon.np0005538513 calling monitor election
Nov 28 04:56:10 localhost ceph-mon[292954]: paxos.0).electionLogic(48) init, last seen epoch 48
Nov 28 04:56:10 localhost ceph-mon[292954]: mon.np0005538513@0(electing) e14 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 28 04:56:10 localhost ceph-mon[292954]: log_channel(cluster) log [INF] : mon.np0005538513 is new leader, mons np0005538513,np0005538514 in quorum (ranks 0,1)
Nov 28 04:56:10 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : monmap epoch 14
Nov 28 04:56:10 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 04:56:10 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : last_changed 2025-11-28T09:56:10.676143+0000
Nov 28 04:56:10 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : created 2025-11-28T07:45:36.120469+0000
Nov 28 04:56:10 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef)
Nov 28 04:56:10 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : election_strategy: 1
Nov 28 04:56:10 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005538513
Nov 28 04:56:10 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538514
Nov 28 04:56:10
localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 28 04:56:10 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby Nov 28 04:56:10 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e90: 6 total, 6 up, 6 in Nov 28 04:56:10 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e34: np0005538513.dsfdlx(active, since 51s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538512.zyhkxs Nov 28 04:56:10 localhost ceph-mon[292954]: log_channel(cluster) log [INF] : overall HEALTH_OK Nov 28 04:56:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 04:56:10 localhost ceph-mon[292954]: mon.np0005538513 calling monitor election Nov 28 04:56:10 localhost ceph-mon[292954]: mon.np0005538514 calling monitor election Nov 28 04:56:10 localhost ceph-mon[292954]: mon.np0005538513 is new leader, mons np0005538513,np0005538514 in quorum (ranks 0,1) Nov 28 04:56:10 localhost ceph-mon[292954]: overall HEALTH_OK Nov 28 04:56:10 localhost podman[301838]: 2025-11-28 09:56:10.85406511 +0000 UTC m=+0.086086130 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 
'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 04:56:10 localhost ceph-mds[282744]: --2- [v2:172.18.0.106:6808/2782735008,v1:172.18.0.106:6809/2782735008] >> [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] conn(0x55dac314f800 0x55dac3506000 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 Nov 28 04:56:10 localhost ceph-mgr[286105]: --2- 172.18.0.106:0/2775473572 >> [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] conn(0x560b9b6e2400 0x560b9ae07700 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 Nov 28 04:56:10 localhost ceph-mgr[286105]: --2- 172.18.0.106:0/3621695456 >> [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] conn(0x560b9b6e3800 0x560b9b75f080 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 Nov 28 04:56:10 localhost ceph-mgr[286105]: --2- 172.18.0.106:0/3109361859 >> [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] conn(0x560b9b920400 0x560b9b918000 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 Nov 28 04:56:10 localhost ceph-mgr[286105]: --2- 172.18.0.106:0/4290692976 >> [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] conn(0x560b9c7edc00 0x560b9ae04b00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 Nov 28 04:56:10 localhost ceph-mgr[286105]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0 Nov 28 04:56:10 localhost ceph-mgr[286105]: client.0 ms_handle_reset on 
v2:172.18.0.104:3300/0 Nov 28 04:56:10 localhost ceph-mgr[286105]: client.27136 ms_handle_reset on v2:172.18.0.104:3300/0 Nov 28 04:56:10 localhost podman[301838]: 2025-11-28 09:56:10.89108926 +0000 UTC m=+0.123110340 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 04:56:10 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538513"} v 0) Nov 28 04:56:10 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch Nov 28 04:56:10 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538514"} v 0) Nov 28 04:56:10 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch Nov 28 04:56:10 localhost systemd[1]: 
3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 04:56:10 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 28 04:56:10 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 28 04:56:10 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) Nov 28 04:56:10 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 04:56:10 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538513.localdomain:/etc/ceph/ceph.conf Nov 28 04:56:10 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538513.localdomain:/etc/ceph/ceph.conf Nov 28 04:56:10 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/etc/ceph/ceph.conf Nov 28 04:56:10 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/etc/ceph/ceph.conf Nov 28 04:56:10 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/etc/ceph/ceph.conf Nov 28 04:56:10 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/etc/ceph/ceph.conf Nov 28 04:56:11 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:56:11 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:56:11 localhost 
ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:56:11 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:56:11 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:56:11 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:56:11 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail Nov 28 04:56:11 localhost ceph-mon[292954]: Remove daemons mon.np0005538515 Nov 28 04:56:11 localhost ceph-mon[292954]: Safe to remove mon.np0005538515: new quorum should be ['np0005538513', 'np0005538514'] (from ['np0005538513', 'np0005538514']) Nov 28 04:56:11 localhost ceph-mon[292954]: Removing monitor np0005538515 from monmap... 
Nov 28 04:56:11 localhost ceph-mon[292954]: Removing daemon mon.np0005538515 from np0005538515.localdomain -- ports [] Nov 28 04:56:11 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 04:56:11 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf Nov 28 04:56:11 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf Nov 28 04:56:11 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf Nov 28 04:56:12 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0) Nov 28 04:56:12 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:12 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0) Nov 28 04:56:12 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0) Nov 28 04:56:12 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:12 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0) Nov 28 04:56:12 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:12 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:12 localhost ceph-mon[292954]: 
mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0) Nov 28 04:56:12 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:12 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0) Nov 28 04:56:12 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:12 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Nov 28 04:56:12 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:12 localhost ceph-mgr[286105]: [progress INFO root] update: starting ev 53cd24d9-055a-4195-99ca-ff907803a144 (Updating node-proxy deployment (+3 -> 3)) Nov 28 04:56:12 localhost ceph-mgr[286105]: [progress INFO root] complete: finished ev 53cd24d9-055a-4195-99ca-ff907803a144 (Updating node-proxy deployment (+3 -> 3)) Nov 28 04:56:12 localhost ceph-mgr[286105]: [progress INFO root] Completed event 53cd24d9-055a-4195-99ca-ff907803a144 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds Nov 28 04:56:12 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) Nov 28 04:56:12 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch Nov 28 04:56:12 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005538513 (monmap changed)... 
Nov 28 04:56:12 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005538513 (monmap changed)... Nov 28 04:56:12 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Nov 28 04:56:12 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:56:12 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 28 04:56:12 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 28 04:56:12 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain Nov 28 04:56:12 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain Nov 28 04:56:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. 
Nov 28 04:56:13 localhost podman[302237]: 2025-11-28 09:56:13.053207171 +0000 UTC m=+0.064024582 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller) Nov 28 04:56:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. 
Nov 28 04:56:13 localhost podman[302237]: 2025-11-28 09:56:13.125386823 +0000 UTC m=+0.136204204 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 28 04:56:13 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 04:56:13 localhost podman[302256]: 2025-11-28 09:56:13.178851349 +0000 UTC m=+0.106743357 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 28 04:56:13 localhost podman[302256]: 2025-11-28 09:56:13.216537999 +0000 UTC 
m=+0.144429987 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent) Nov 28 04:56:13 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 04:56:13 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:56:13 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:56:13 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:56:13 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:13 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:13 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:13 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:13 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:13 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:13 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:13 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:56:13 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 28 04:56:13 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/3571339704' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 28 04:56:13 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 28 04:56:13 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3571339704' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 28 04:56:13 localhost podman[302296]: Nov 28 04:56:13 localhost podman[302296]: 2025-11-28 09:56:13.500323714 +0000 UTC m=+0.082902823 container create 66f70c5ca36463b5108ee763191a47ca79bdc1155632debe1533c211c1b2526e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_neumann, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, build-date=2025-09-24T08:57:55, RELEASE=main, name=rhceph, architecture=x86_64, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, version=7, io.openshift.expose-services=) Nov 28 04:56:13 localhost systemd[1]: Started libpod-conmon-66f70c5ca36463b5108ee763191a47ca79bdc1155632debe1533c211c1b2526e.scope. 
Nov 28 04:56:13 localhost podman[302296]: 2025-11-28 09:56:13.463627475 +0000 UTC m=+0.046206644 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:56:13 localhost systemd[1]: Started libcrun container. Nov 28 04:56:13 localhost podman[302296]: 2025-11-28 09:56:13.578939674 +0000 UTC m=+0.161518793 container init 66f70c5ca36463b5108ee763191a47ca79bdc1155632debe1533c211c1b2526e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_neumann, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, RELEASE=main, name=rhceph, io.buildah.version=1.33.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, version=7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, vcs-type=git, GIT_BRANCH=main, GIT_CLEAN=True, release=553, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc.) 
Nov 28 04:56:13 localhost podman[302296]: 2025-11-28 09:56:13.59049057 +0000 UTC m=+0.173069689 container start 66f70c5ca36463b5108ee763191a47ca79bdc1155632debe1533c211c1b2526e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_neumann, version=7, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_CLEAN=True, ceph=True, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, release=553, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, CEPH_POINT_RELEASE=, name=rhceph, vcs-type=git) Nov 28 04:56:13 localhost podman[302296]: 2025-11-28 09:56:13.591551312 +0000 UTC m=+0.174130421 container attach 66f70c5ca36463b5108ee763191a47ca79bdc1155632debe1533c211c1b2526e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_neumann, distribution-scope=public, RELEASE=main, release=553, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, architecture=x86_64, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, 
org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, vendor=Red Hat, Inc., ceph=True, GIT_CLEAN=True, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_BRANCH=main) Nov 28 04:56:13 localhost cool_neumann[302312]: 167 167 Nov 28 04:56:13 localhost systemd[1]: libpod-66f70c5ca36463b5108ee763191a47ca79bdc1155632debe1533c211c1b2526e.scope: Deactivated successfully. Nov 28 04:56:13 localhost podman[302296]: 2025-11-28 09:56:13.597330251 +0000 UTC m=+0.179909360 container died 66f70c5ca36463b5108ee763191a47ca79bdc1155632debe1533c211c1b2526e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_neumann, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, name=rhceph, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , RELEASE=main, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, 
vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 28 04:56:13 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail Nov 28 04:56:13 localhost podman[302317]: 2025-11-28 09:56:13.710283877 +0000 UTC m=+0.100873546 container remove 66f70c5ca36463b5108ee763191a47ca79bdc1155632debe1533c211c1b2526e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_neumann, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, name=rhceph, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, vcs-type=git, release=553, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 28 04:56:13 localhost systemd[1]: libpod-conmon-66f70c5ca36463b5108ee763191a47ca79bdc1155632debe1533c211c1b2526e.scope: Deactivated successfully. 
Nov 28 04:56:13 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0) Nov 28 04:56:13 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:13 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0) Nov 28 04:56:13 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:13 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)... Nov 28 04:56:13 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)... Nov 28 04:56:13 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0) Nov 28 04:56:13 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Nov 28 04:56:13 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 28 04:56:13 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 28 04:56:13 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005538513.localdomain Nov 28 04:56:13 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005538513.localdomain Nov 28 04:56:13 localhost nova_compute[279673]: 2025-11-28 09:56:13.829 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:56:14 localhost systemd[1]: var-lib-containers-storage-overlay-91eb1e456f81d915cfcfd114f3e7915de83d6ef8b718cba73e5a296374edd668-merged.mount: Deactivated successfully. Nov 28 04:56:14 localhost ceph-mon[292954]: Reconfiguring crash.np0005538513 (monmap changed)... Nov 28 04:56:14 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain Nov 28 04:56:14 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:14 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:14 localhost ceph-mon[292954]: Reconfiguring osd.2 (monmap changed)... Nov 28 04:56:14 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Nov 28 04:56:14 localhost ceph-mon[292954]: Reconfiguring daemon osd.2 on np0005538513.localdomain Nov 28 04:56:14 localhost podman[302385]: Nov 28 04:56:14 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:56:14 localhost podman[302385]: 2025-11-28 09:56:14.492583377 +0000 UTC m=+0.082728017 container create 9cc4927a49569c85811c1578b9c8372282862ad0440ea85f1ae72504d450edae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_tu, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_CLEAN=True, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base 
image., distribution-scope=public, io.openshift.expose-services=, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, architecture=x86_64, vcs-type=git) Nov 28 04:56:14 localhost systemd[1]: Started libpod-conmon-9cc4927a49569c85811c1578b9c8372282862ad0440ea85f1ae72504d450edae.scope. Nov 28 04:56:14 localhost systemd[1]: Started libcrun container. Nov 28 04:56:14 localhost podman[302385]: 2025-11-28 09:56:14.457159476 +0000 UTC m=+0.047304116 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:56:14 localhost podman[302385]: 2025-11-28 09:56:14.561442537 +0000 UTC m=+0.151587177 container init 9cc4927a49569c85811c1578b9c8372282862ad0440ea85f1ae72504d450edae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_tu, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, name=rhceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, version=7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, com.redhat.component=rhceph-container, io.k8s.display-name=Red 
Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git) Nov 28 04:56:14 localhost podman[302385]: 2025-11-28 09:56:14.571446924 +0000 UTC m=+0.161591564 container start 9cc4927a49569c85811c1578b9c8372282862ad0440ea85f1ae72504d450edae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_tu, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, release=553, GIT_BRANCH=main, io.buildah.version=1.33.12, name=rhceph, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, distribution-scope=public, ceph=True, RELEASE=main, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., GIT_CLEAN=True, version=7, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Nov 28 04:56:14 localhost podman[302385]: 2025-11-28 09:56:14.571880597 +0000 UTC m=+0.162025237 container attach 9cc4927a49569c85811c1578b9c8372282862ad0440ea85f1ae72504d450edae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_tu, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, maintainer=Guillaume Abrioux , 
com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, release=553, distribution-scope=public, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, ceph=True, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, description=Red Hat Ceph Storage 7) Nov 28 04:56:14 localhost loving_tu[302400]: 167 167 Nov 28 04:56:14 localhost systemd[1]: libpod-9cc4927a49569c85811c1578b9c8372282862ad0440ea85f1ae72504d450edae.scope: Deactivated successfully. Nov 28 04:56:14 localhost podman[302385]: 2025-11-28 09:56:14.574825478 +0000 UTC m=+0.164970148 container died 9cc4927a49569c85811c1578b9c8372282862ad0440ea85f1ae72504d450edae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_tu, name=rhceph, architecture=x86_64, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., ceph=True, distribution-scope=public, io.openshift.expose-services=, RELEASE=main, io.buildah.version=1.33.12, version=7, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base 
image., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_BRANCH=main, vcs-type=git) Nov 28 04:56:14 localhost nova_compute[279673]: 2025-11-28 09:56:14.589 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:56:14 localhost nova_compute[279673]: 2025-11-28 09:56:14.615 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Triggering sync for uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Nov 28 04:56:14 localhost nova_compute[279673]: 2025-11-28 09:56:14.616 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:56:14 localhost nova_compute[279673]: 2025-11-28 09:56:14.617 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:56:14 localhost nova_compute[279673]: 2025-11-28 09:56:14.644 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" "released" by 
"nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.027s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:56:14 localhost podman[302405]: 2025-11-28 09:56:14.684465184 +0000 UTC m=+0.098147092 container remove 9cc4927a49569c85811c1578b9c8372282862ad0440ea85f1ae72504d450edae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_tu, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, distribution-scope=public, version=7, GIT_BRANCH=main, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553) Nov 28 04:56:14 localhost systemd[1]: libpod-conmon-9cc4927a49569c85811c1578b9c8372282862ad0440ea85f1ae72504d450edae.scope: Deactivated successfully. 
Nov 28 04:56:14 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0) Nov 28 04:56:14 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:14 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0) Nov 28 04:56:14 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:14 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)... Nov 28 04:56:14 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)... Nov 28 04:56:14 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.5"} v 0) Nov 28 04:56:14 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Nov 28 04:56:14 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 28 04:56:14 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 28 04:56:14 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005538513.localdomain Nov 28 04:56:14 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005538513.localdomain Nov 28 04:56:15 localhost systemd[1]: 
var-lib-containers-storage-overlay-589615937de90216bb8fad5f08e8ce8996473ff2ce29c17f87e90c91023b1000-merged.mount: Deactivated successfully. Nov 28 04:56:15 localhost ceph-mgr[286105]: [progress INFO root] Writing back 50 completed events Nov 28 04:56:15 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Nov 28 04:56:15 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:15 localhost podman[302480]: Nov 28 04:56:15 localhost podman[302480]: 2025-11-28 09:56:15.524664426 +0000 UTC m=+0.081399868 container create 834317ed8d664412db7745c1ab9e50df30feb1ba6137d978a406cb59b65ac9d5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_neumann, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, RELEASE=main, distribution-scope=public, io.buildah.version=1.33.12, release=553, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.openshift.expose-services=) Nov 28 04:56:15 localhost systemd[1]: Started libpod-conmon-834317ed8d664412db7745c1ab9e50df30feb1ba6137d978a406cb59b65ac9d5.scope. 
Nov 28 04:56:15 localhost podman[302480]: 2025-11-28 09:56:15.491565157 +0000 UTC m=+0.048300639 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:56:15 localhost systemd[1]: Started libcrun container. Nov 28 04:56:15 localhost nova_compute[279673]: 2025-11-28 09:56:15.619 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:56:15 localhost podman[302480]: 2025-11-28 09:56:15.636361073 +0000 UTC m=+0.193096485 container init 834317ed8d664412db7745c1ab9e50df30feb1ba6137d978a406cb59b65ac9d5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_neumann, CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_BRANCH=main, name=rhceph, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, version=7, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True) Nov 28 04:56:15 localhost podman[302480]: 2025-11-28 09:56:15.64956897 +0000 UTC m=+0.206304402 container start 834317ed8d664412db7745c1ab9e50df30feb1ba6137d978a406cb59b65ac9d5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_neumann, io.buildah.version=1.33.12, 
architecture=x86_64, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, description=Red Hat Ceph Storage 7, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vcs-type=git, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, GIT_CLEAN=True, com.redhat.component=rhceph-container) Nov 28 04:56:15 localhost gracious_neumann[302495]: 167 167 Nov 28 04:56:15 localhost podman[302480]: 2025-11-28 09:56:15.652580233 +0000 UTC m=+0.209315705 container attach 834317ed8d664412db7745c1ab9e50df30feb1ba6137d978a406cb59b65ac9d5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_neumann, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, release=553, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , ceph=True, com.redhat.component=rhceph-container, GIT_CLEAN=True, 
distribution-scope=public, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, vcs-type=git) Nov 28 04:56:15 localhost systemd[1]: libpod-834317ed8d664412db7745c1ab9e50df30feb1ba6137d978a406cb59b65ac9d5.scope: Deactivated successfully. Nov 28 04:56:15 localhost podman[302480]: 2025-11-28 09:56:15.655228174 +0000 UTC m=+0.211963636 container died 834317ed8d664412db7745c1ab9e50df30feb1ba6137d978a406cb59b65ac9d5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_neumann, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-type=git, build-date=2025-09-24T08:57:55, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, release=553, GIT_BRANCH=main, maintainer=Guillaume Abrioux , distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, RELEASE=main, vendor=Red Hat, Inc., io.openshift.expose-services=, version=7) Nov 28 04:56:15 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail Nov 28 04:56:15 localhost podman[302500]: 2025-11-28 09:56:15.757709959 
+0000 UTC m=+0.092335153 container remove 834317ed8d664412db7745c1ab9e50df30feb1ba6137d978a406cb59b65ac9d5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_neumann, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, RELEASE=main, com.redhat.component=rhceph-container, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, release=553, maintainer=Guillaume Abrioux , vcs-type=git) Nov 28 04:56:15 localhost systemd[1]: libpod-conmon-834317ed8d664412db7745c1ab9e50df30feb1ba6137d978a406cb59b65ac9d5.scope: Deactivated successfully. Nov 28 04:56:15 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:15 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:15 localhost ceph-mon[292954]: Reconfiguring osd.5 (monmap changed)... 
Nov 28 04:56:15 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Nov 28 04:56:15 localhost ceph-mon[292954]: Reconfiguring daemon osd.5 on np0005538513.localdomain Nov 28 04:56:15 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:15 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0) Nov 28 04:56:15 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:15 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0) Nov 28 04:56:15 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:15 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)... Nov 28 04:56:15 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)... 
Nov 28 04:56:15 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Nov 28 04:56:15 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:56:15 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 28 04:56:15 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 28 04:56:15 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain Nov 28 04:56:15 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain Nov 28 04:56:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 04:56:16 localhost systemd[1]: tmp-crun.1KaqtF.mount: Deactivated successfully. Nov 28 04:56:16 localhost systemd[1]: var-lib-containers-storage-overlay-21bda7b6591885f7cd0085a1ae2256d09e2eabadc0cb751da4c419696b1902cd-merged.mount: Deactivated successfully. 
Nov 28 04:56:16 localhost podman[302533]: 2025-11-28 09:56:16.120666251 +0000 UTC m=+0.092448397 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125) Nov 28 04:56:16 localhost podman[302533]: 2025-11-28 09:56:16.161482978 +0000 UTC m=+0.133265154 container exec_died 
1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:56:16 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 04:56:16 localhost podman[302596]: Nov 28 04:56:16 localhost podman[302596]: 2025-11-28 09:56:16.606601309 +0000 UTC m=+0.070331686 container create ca32b51280dd95df222879fdb987941e9ac8be5cc4469b2ac5cd650bf8e5199a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_chandrasekhar, maintainer=Guillaume Abrioux , GIT_CLEAN=True, release=553, build-date=2025-09-24T08:57:55, ceph=True, distribution-scope=public, RELEASE=main, name=rhceph, io.openshift.tags=rhceph ceph, architecture=x86_64, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., vcs-type=git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.buildah.version=1.33.12) Nov 28 04:56:16 localhost systemd[1]: Started libpod-conmon-ca32b51280dd95df222879fdb987941e9ac8be5cc4469b2ac5cd650bf8e5199a.scope. Nov 28 04:56:16 localhost podman[302596]: 2025-11-28 09:56:16.572107116 +0000 UTC m=+0.035837533 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:56:16 localhost systemd[1]: Started libcrun container. 
Nov 28 04:56:16 localhost podman[302596]: 2025-11-28 09:56:16.686717904 +0000 UTC m=+0.150448291 container init ca32b51280dd95df222879fdb987941e9ac8be5cc4469b2ac5cd650bf8e5199a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_chandrasekhar, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.openshift.tags=rhceph ceph, architecture=x86_64, maintainer=Guillaume Abrioux , version=7, RELEASE=main, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7) Nov 28 04:56:16 localhost podman[302596]: 2025-11-28 09:56:16.696843646 +0000 UTC m=+0.160574023 container start ca32b51280dd95df222879fdb987941e9ac8be5cc4469b2ac5cd650bf8e5199a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_chandrasekhar, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.buildah.version=1.33.12, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.component=rhceph-container, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., 
description=Red Hat Ceph Storage 7, architecture=x86_64, ceph=True, distribution-scope=public, GIT_CLEAN=True, RELEASE=main, maintainer=Guillaume Abrioux , version=7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 28 04:56:16 localhost podman[302596]: 2025-11-28 09:56:16.697207277 +0000 UTC m=+0.160937694 container attach ca32b51280dd95df222879fdb987941e9ac8be5cc4469b2ac5cd650bf8e5199a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_chandrasekhar, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, com.redhat.component=rhceph-container, architecture=x86_64, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, RELEASE=main, release=553, name=rhceph, version=7, vcs-type=git) Nov 28 04:56:16 localhost funny_chandrasekhar[302611]: 167 167 Nov 28 04:56:16 localhost systemd[1]: 
libpod-ca32b51280dd95df222879fdb987941e9ac8be5cc4469b2ac5cd650bf8e5199a.scope: Deactivated successfully. Nov 28 04:56:16 localhost podman[302596]: 2025-11-28 09:56:16.70055644 +0000 UTC m=+0.164286827 container died ca32b51280dd95df222879fdb987941e9ac8be5cc4469b2ac5cd650bf8e5199a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_chandrasekhar, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, GIT_CLEAN=True, GIT_BRANCH=main, release=553, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, version=7, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph) Nov 28 04:56:16 localhost podman[302616]: 2025-11-28 09:56:16.800992492 +0000 UTC m=+0.086334219 container remove ca32b51280dd95df222879fdb987941e9ac8be5cc4469b2ac5cd650bf8e5199a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_chandrasekhar, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vendor=Red Hat, Inc., name=rhceph, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, maintainer=Guillaume Abrioux , 
com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, RELEASE=main, description=Red Hat Ceph Storage 7) Nov 28 04:56:16 localhost systemd[1]: libpod-conmon-ca32b51280dd95df222879fdb987941e9ac8be5cc4469b2ac5cd650bf8e5199a.scope: Deactivated successfully. Nov 28 04:56:16 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0) Nov 28 04:56:16 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:16 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0) Nov 28 04:56:16 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:16 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)... Nov 28 04:56:16 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)... 
Nov 28 04:56:16 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Nov 28 04:56:16 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:56:16 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "mgr services"} v 0) Nov 28 04:56:16 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr services"} : dispatch Nov 28 04:56:16 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 28 04:56:16 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 28 04:56:16 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain Nov 28 04:56:16 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain Nov 28 04:56:16 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:16 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:16 localhost ceph-mon[292954]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)... 
Nov 28 04:56:16 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:56:16 localhost ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain Nov 28 04:56:16 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:16 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:16 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:56:17 localhost systemd[1]: var-lib-containers-storage-overlay-24a76214d0b604632e20f804f365cd6a1301270744059c3eb79caa7ed867f8a0-merged.mount: Deactivated successfully. 
Nov 28 04:56:17 localhost podman[302685]: Nov 28 04:56:17 localhost podman[302685]: 2025-11-28 09:56:17.528162055 +0000 UTC m=+0.078674072 container create 4415bb8813e73b44d63e351928a3d53f0c2b0faa16a92e5fb19bbd49045f47ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_raman, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, name=rhceph, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, release=553, distribution-scope=public, GIT_BRANCH=main, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, version=7, ceph=True, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc.) Nov 28 04:56:17 localhost systemd[1]: Started libpod-conmon-4415bb8813e73b44d63e351928a3d53f0c2b0faa16a92e5fb19bbd49045f47ee.scope. Nov 28 04:56:17 localhost systemd[1]: Started libcrun container. 
Nov 28 04:56:17 localhost podman[302685]: 2025-11-28 09:56:17.493093555 +0000 UTC m=+0.043605592 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:56:17 localhost podman[302685]: 2025-11-28 09:56:17.5936354 +0000 UTC m=+0.144147417 container init 4415bb8813e73b44d63e351928a3d53f0c2b0faa16a92e5fb19bbd49045f47ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_raman, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , release=553, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, name=rhceph, ceph=True, io.openshift.expose-services=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, com.redhat.component=rhceph-container) Nov 28 04:56:17 localhost competent_raman[302700]: 167 167 Nov 28 04:56:17 localhost systemd[1]: libpod-4415bb8813e73b44d63e351928a3d53f0c2b0faa16a92e5fb19bbd49045f47ee.scope: Deactivated successfully. 
Nov 28 04:56:17 localhost podman[302685]: 2025-11-28 09:56:17.603184574 +0000 UTC m=+0.153696591 container start 4415bb8813e73b44d63e351928a3d53f0c2b0faa16a92e5fb19bbd49045f47ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_raman, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, distribution-scope=public, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.openshift.expose-services=, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, name=rhceph, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, ceph=True, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-type=git) Nov 28 04:56:17 localhost podman[302685]: 2025-11-28 09:56:17.603492094 +0000 UTC m=+0.154004101 container attach 4415bb8813e73b44d63e351928a3d53f0c2b0faa16a92e5fb19bbd49045f47ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_raman, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, name=rhceph, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, 
io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, version=7, com.redhat.component=rhceph-container, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7) Nov 28 04:56:17 localhost podman[302685]: 2025-11-28 09:56:17.605937359 +0000 UTC m=+0.156449416 container died 4415bb8813e73b44d63e351928a3d53f0c2b0faa16a92e5fb19bbd49045f47ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_raman, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.expose-services=, name=rhceph, version=7, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.buildah.version=1.33.12, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, vendor=Red Hat, Inc.) 
Nov 28 04:56:17 localhost podman[302705]: 2025-11-28 09:56:17.698639963 +0000 UTC m=+0.084916875 container remove 4415bb8813e73b44d63e351928a3d53f0c2b0faa16a92e5fb19bbd49045f47ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_raman, ceph=True, RELEASE=main, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, name=rhceph, io.openshift.expose-services=, version=7, distribution-scope=public, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=) Nov 28 04:56:17 localhost systemd[1]: libpod-conmon-4415bb8813e73b44d63e351928a3d53f0c2b0faa16a92e5fb19bbd49045f47ee.scope: Deactivated successfully. 
Nov 28 04:56:17 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail Nov 28 04:56:17 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0) Nov 28 04:56:17 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:17 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0) Nov 28 04:56:17 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:17 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005538514 (monmap changed)... Nov 28 04:56:17 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005538514 (monmap changed)... 
Nov 28 04:56:17 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Nov 28 04:56:17 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:56:17 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 28 04:56:17 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 28 04:56:17 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain Nov 28 04:56:17 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain Nov 28 04:56:18 localhost ceph-mon[292954]: Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)... 
Nov 28 04:56:18 localhost ceph-mon[292954]: Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain Nov 28 04:56:18 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:18 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:18 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:56:18 localhost systemd[1]: tmp-crun.7ebelN.mount: Deactivated successfully. Nov 28 04:56:18 localhost systemd[1]: var-lib-containers-storage-overlay-dcd5052f61ff523c6112cde343c61f8c68ec103d5011ca6267601757d3eb715c-merged.mount: Deactivated successfully. Nov 28 04:56:18 localhost openstack_network_exporter[240658]: ERROR 09:56:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:56:18 localhost openstack_network_exporter[240658]: ERROR 09:56:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:56:18 localhost openstack_network_exporter[240658]: ERROR 09:56:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 04:56:18 localhost openstack_network_exporter[240658]: ERROR 09:56:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 04:56:18 localhost openstack_network_exporter[240658]: Nov 28 04:56:18 localhost openstack_network_exporter[240658]: ERROR 09:56:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 04:56:18 localhost openstack_network_exporter[240658]: Nov 28 04:56:18 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, 
key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0) Nov 28 04:56:18 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:18 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0) Nov 28 04:56:18 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:18 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.0 (monmap changed)... Nov 28 04:56:18 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.0 (monmap changed)... Nov 28 04:56:18 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.0"} v 0) Nov 28 04:56:18 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Nov 28 04:56:18 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 28 04:56:18 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 28 04:56:18 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005538514.localdomain Nov 28 04:56:18 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005538514.localdomain Nov 28 04:56:18 localhost nova_compute[279673]: 2025-11-28 09:56:18.832 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:56:19 localhost 
ceph-mon[292954]: Reconfiguring crash.np0005538514 (monmap changed)... Nov 28 04:56:19 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain Nov 28 04:56:19 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:19 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:19 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Nov 28 04:56:19 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:56:19 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0) Nov 28 04:56:19 localhost ceph-mgr[286105]: [balancer INFO root] Optimize plan auto_2025-11-28_09:56:19 Nov 28 04:56:19 localhost ceph-mgr[286105]: [balancer INFO root] Mode upmap, max misplaced 0.050000 Nov 28 04:56:19 localhost ceph-mgr[286105]: [balancer INFO root] do_upmap Nov 28 04:56:19 localhost ceph-mgr[286105]: [balancer INFO root] pools ['manila_data', 'images', 'backups', 'vms', 'manila_metadata', '.mgr', 'volumes'] Nov 28 04:56:19 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail Nov 28 04:56:19 localhost ceph-mgr[286105]: [balancer INFO root] prepared 0/10 changes Nov 28 04:56:19 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:19 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0) Nov 28 
04:56:19 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:19 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.3 (monmap changed)... Nov 28 04:56:19 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.3 (monmap changed)... Nov 28 04:56:19 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.3"} v 0) Nov 28 04:56:19 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Nov 28 04:56:19 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 28 04:56:19 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 28 04:56:19 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005538514.localdomain Nov 28 04:56:19 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005538514.localdomain Nov 28 04:56:19 localhost ceph-mgr[286105]: [pg_autoscaler INFO root] _maybe_adjust Nov 28 04:56:19 localhost ceph-mgr[286105]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Nov 28 04:56:19 localhost ceph-mgr[286105]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1) Nov 28 04:56:19 localhost ceph-mgr[286105]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Nov 28 04:56:19 localhost ceph-mgr[286105]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003328000680485762 of space, bias 
1.0, pg target 0.6656001360971524 quantized to 32 (current 32) Nov 28 04:56:19 localhost ceph-mgr[286105]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Nov 28 04:56:19 localhost ceph-mgr[286105]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32) Nov 28 04:56:19 localhost ceph-mgr[286105]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Nov 28 04:56:19 localhost ceph-mgr[286105]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0014449417225013959 of space, bias 1.0, pg target 0.2885066972594454 quantized to 32 (current 32) Nov 28 04:56:19 localhost ceph-mgr[286105]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Nov 28 04:56:19 localhost ceph-mgr[286105]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32) Nov 28 04:56:19 localhost ceph-mgr[286105]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Nov 28 04:56:19 localhost ceph-mgr[286105]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32) Nov 28 04:56:19 localhost ceph-mgr[286105]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784 Nov 28 04:56:19 localhost ceph-mgr[286105]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.453674623115578e-06 of space, bias 4.0, pg target 0.0019596681323283084 quantized to 16 (current 16) Nov 28 04:56:19 localhost ceph-mgr[286105]: [volumes INFO mgr_util] scanning for idle connections.. Nov 28 04:56:19 localhost ceph-mgr[286105]: [volumes INFO mgr_util] cleaning up connections: [] Nov 28 04:56:19 localhost ceph-mgr[286105]: [volumes INFO mgr_util] scanning for idle connections.. 
Nov 28 04:56:19 localhost ceph-mgr[286105]: [volumes INFO mgr_util] cleaning up connections: [] Nov 28 04:56:19 localhost ceph-mgr[286105]: [volumes INFO mgr_util] scanning for idle connections.. Nov 28 04:56:19 localhost ceph-mgr[286105]: [volumes INFO mgr_util] cleaning up connections: [] Nov 28 04:56:19 localhost ceph-mgr[286105]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules Nov 28 04:56:19 localhost ceph-mgr[286105]: [rbd_support INFO root] load_schedules: vms, start_after= Nov 28 04:56:19 localhost ceph-mgr[286105]: [rbd_support INFO root] load_schedules: volumes, start_after= Nov 28 04:56:19 localhost ceph-mgr[286105]: [rbd_support INFO root] load_schedules: images, start_after= Nov 28 04:56:19 localhost ceph-mgr[286105]: [rbd_support INFO root] load_schedules: backups, start_after= Nov 28 04:56:19 localhost ceph-mgr[286105]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules Nov 28 04:56:19 localhost ceph-mgr[286105]: [rbd_support INFO root] load_schedules: vms, start_after= Nov 28 04:56:19 localhost ceph-mgr[286105]: [rbd_support INFO root] load_schedules: volumes, start_after= Nov 28 04:56:19 localhost ceph-mgr[286105]: [rbd_support INFO root] load_schedules: images, start_after= Nov 28 04:56:19 localhost ceph-mgr[286105]: [rbd_support INFO root] load_schedules: backups, start_after= Nov 28 04:56:20 localhost ceph-mon[292954]: Reconfiguring osd.0 (monmap changed)... 
Nov 28 04:56:20 localhost ceph-mon[292954]: Reconfiguring daemon osd.0 on np0005538514.localdomain Nov 28 04:56:20 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:20 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:20 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Nov 28 04:56:20 localhost nova_compute[279673]: 2025-11-28 09:56:20.622 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:56:20 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0) Nov 28 04:56:20 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:20 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0) Nov 28 04:56:20 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:20 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)... Nov 28 04:56:20 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)... 
Nov 28 04:56:20 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Nov 28 04:56:20 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:56:20 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 28 04:56:20 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 28 04:56:20 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain Nov 28 04:56:20 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain Nov 28 04:56:21 localhost ceph-mon[292954]: Reconfiguring osd.3 (monmap changed)... 
Nov 28 04:56:21 localhost ceph-mon[292954]: Reconfiguring daemon osd.3 on np0005538514.localdomain Nov 28 04:56:21 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:21 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:21 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:56:21 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0) Nov 28 04:56:21 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:21 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0) Nov 28 04:56:21 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:21 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005538514.djozup (monmap changed)... Nov 28 04:56:21 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005538514.djozup (monmap changed)... 
Nov 28 04:56:21 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Nov 28 04:56:21 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:56:21 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "mgr services"} v 0) Nov 28 04:56:21 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr services"} : dispatch Nov 28 04:56:21 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 28 04:56:21 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 28 04:56:21 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain Nov 28 04:56:21 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain Nov 28 04:56:21 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail Nov 28 04:56:22 localhost ceph-mon[292954]: Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)... 
Nov 28 04:56:22 localhost ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain Nov 28 04:56:22 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:22 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:22 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:56:22 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0) Nov 28 04:56:22 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:22 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0) Nov 28 04:56:22 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:22 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005538515 (monmap changed)... Nov 28 04:56:22 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005538515 (monmap changed)... 
Nov 28 04:56:22 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Nov 28 04:56:22 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:56:22 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 28 04:56:22 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 28 04:56:22 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain Nov 28 04:56:22 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain Nov 28 04:56:23 localhost ceph-mon[292954]: Reconfiguring mgr.np0005538514.djozup (monmap changed)... 
Nov 28 04:56:23 localhost ceph-mon[292954]: Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain Nov 28 04:56:23 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:23 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:23 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:56:23 localhost ceph-mgr[286105]: log_channel(audit) log [DBG] : from='client.54203 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005538515.localdomain:172.18.0.105", "target": ["mon-mgr", ""]}]: dispatch Nov 28 04:56:23 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) Nov 28 04:56:23 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:23 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0) Nov 28 04:56:23 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0) Nov 28 04:56:23 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 28 04:56:23 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 28 04:56:23 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 
172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 28 04:56:23 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:23 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0) Nov 28 04:56:23 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:23 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Deploying daemon mon.np0005538515 on np0005538515.localdomain Nov 28 04:56:23 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Deploying daemon mon.np0005538515 on np0005538515.localdomain Nov 28 04:56:23 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.1 (monmap changed)... Nov 28 04:56:23 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.1 (monmap changed)... 
Nov 28 04:56:23 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.1"} v 0) Nov 28 04:56:23 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Nov 28 04:56:23 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 28 04:56:23 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 28 04:56:23 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005538515.localdomain Nov 28 04:56:23 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005538515.localdomain Nov 28 04:56:23 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail Nov 28 04:56:23 localhost nova_compute[279673]: 2025-11-28 09:56:23.876 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:56:24 localhost ceph-mon[292954]: Reconfiguring crash.np0005538515 (monmap changed)... 
Nov 28 04:56:24 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain Nov 28 04:56:24 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:24 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 28 04:56:24 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:24 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:24 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Nov 28 04:56:24 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:56:24 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0) Nov 28 04:56:24 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:24 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0) Nov 28 04:56:24 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:24 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.4 (monmap changed)... Nov 28 04:56:24 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.4 (monmap changed)... 
Nov 28 04:56:24 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.4"} v 0) Nov 28 04:56:24 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Nov 28 04:56:24 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 28 04:56:24 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 28 04:56:24 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005538515.localdomain Nov 28 04:56:24 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005538515.localdomain Nov 28 04:56:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 04:56:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 04:56:24 localhost systemd[1]: tmp-crun.UFPg3K.mount: Deactivated successfully. 
Nov 28 04:56:24 localhost podman[302721]: 2025-11-28 09:56:24.871515241 +0000 UTC m=+0.099320848 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 04:56:24 localhost podman[302722]: 2025-11-28 09:56:24.916526667 +0000 UTC m=+0.143137047 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0) Nov 28 04:56:24 localhost podman[302721]: 2025-11-28 09:56:24.938595036 +0000 UTC m=+0.166400603 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', 
'--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 04:56:24 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 04:56:24 localhost podman[302722]: 2025-11-28 09:56:24.959905222 +0000 UTC m=+0.186515562 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, container_name=multipathd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Nov 28 04:56:24 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 04:56:25 localhost ceph-mon[292954]: Deploying daemon mon.np0005538515 on np0005538515.localdomain Nov 28 04:56:25 localhost ceph-mon[292954]: Reconfiguring osd.1 (monmap changed)... Nov 28 04:56:25 localhost ceph-mon[292954]: Reconfiguring daemon osd.1 on np0005538515.localdomain Nov 28 04:56:25 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:25 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:25 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Nov 28 04:56:25 localhost nova_compute[279673]: 2025-11-28 09:56:25.660 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:56:25 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail Nov 28 04:56:26 localhost ceph-mon[292954]: Reconfiguring osd.4 (monmap changed)... 
Nov 28 04:56:26 localhost ceph-mon[292954]: Reconfiguring daemon osd.4 on np0005538515.localdomain Nov 28 04:56:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0) Nov 28 04:56:26 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0) Nov 28 04:56:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Nov 28 04:56:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Nov 28 04:56:26 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:27 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:27 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:27 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0) Nov 28 04:56:27 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:27 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0) Nov 28 04:56:27 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 
04:56:27 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)... Nov 28 04:56:27 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)... Nov 28 04:56:27 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Nov 28 04:56:27 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:56:27 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Nov 28 04:56:27 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Nov 28 04:56:27 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain Nov 28 04:56:27 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain Nov 28 04:56:27 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Nov 28 04:56:27 localhost ceph-mon[292954]: mon.np0005538513@0(leader).monmap v14 adding/updating np0005538515 at [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to monitor cluster Nov 28 04:56:27 localhost ceph-mgr[286105]: mgr.server handle_open ignoring open from mon.np0005538515 172.18.0.108:0/3638165379; not ready for session 
(expect reconnect) Nov 28 04:56:27 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e14 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538515"} v 0) Nov 28 04:56:27 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch Nov 28 04:56:27 localhost ceph-mgr[286105]: mgr finish mon failed to return metadata for mon.np0005538515: (2) No such file or directory Nov 28 04:56:27 localhost ceph-mon[292954]: mon.np0005538513@0(probing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538513"} v 0) Nov 28 04:56:27 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538513"} : dispatch Nov 28 04:56:27 localhost ceph-mon[292954]: mon.np0005538513@0(probing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538514"} v 0) Nov 28 04:56:27 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538514"} : dispatch Nov 28 04:56:27 localhost ceph-mon[292954]: mon.np0005538513@0(probing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538515"} v 0) Nov 28 04:56:27 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch Nov 28 04:56:27 localhost ceph-mgr[286105]: mgr finish mon failed to return metadata for mon.np0005538515: (22) Invalid argument Nov 28 04:56:27 localhost ceph-mon[292954]: log_channel(cluster) log [INF] : mon.np0005538513 calling monitor election Nov 28 04:56:27 localhost ceph-mon[292954]: paxos.0).electionLogic(50) init, last seen epoch 50 Nov 28 
04:56:27 localhost ceph-mon[292954]: mon.np0005538513@0(electing) e15 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Nov 28 04:56:27 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail Nov 28 04:56:28 localhost ceph-mon[292954]: mon.np0005538513@0(electing) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0) Nov 28 04:56:28 localhost ceph-mgr[286105]: mgr.server handle_open ignoring open from mon.np0005538515 172.18.0.108:0/3638165379; not ready for session (expect reconnect) Nov 28 04:56:28 localhost ceph-mon[292954]: mon.np0005538513@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538515"} v 0) Nov 28 04:56:28 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch Nov 28 04:56:28 localhost ceph-mgr[286105]: mgr finish mon failed to return metadata for mon.np0005538515: (22) Invalid argument Nov 28 04:56:28 localhost nova_compute[279673]: 2025-11-28 09:56:28.880 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:56:29 localhost ceph-mgr[286105]: mgr.server handle_open ignoring open from mon.np0005538515 172.18.0.108:0/3638165379; not ready for session (expect reconnect) Nov 28 04:56:29 localhost ceph-mon[292954]: mon.np0005538513@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538515"} v 0) Nov 28 04:56:29 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch Nov 28 04:56:29 localhost ceph-mgr[286105]: mgr 
finish mon failed to return metadata for mon.np0005538515: (22) Invalid argument
Nov 28 04:56:29 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 04:56:30 localhost ceph-mgr[286105]: mgr.server handle_open ignoring open from mon.np0005538515 172.18.0.108:0/3638165379; not ready for session (expect reconnect)
Nov 28 04:56:30 localhost ceph-mon[292954]: mon.np0005538513@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538515"} v 0)
Nov 28 04:56:30 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 04:56:30 localhost ceph-mgr[286105]: mgr finish mon failed to return metadata for mon.np0005538515: (22) Invalid argument
Nov 28 04:56:30 localhost nova_compute[279673]: 2025-11-28 09:56:30.663 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 04:56:31 localhost ceph-mgr[286105]: mgr.server handle_open ignoring open from mon.np0005538515 172.18.0.108:0/3638165379; not ready for session (expect reconnect)
Nov 28 04:56:31 localhost ceph-mon[292954]: mon.np0005538513@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538515"} v 0)
Nov 28 04:56:31 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 04:56:31 localhost ceph-mgr[286105]: mgr finish mon failed to return metadata for mon.np0005538515: (22) Invalid argument
Nov 28 04:56:31 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 04:56:32 localhost ceph-mgr[286105]: mgr.server handle_open ignoring open from mon.np0005538515 172.18.0.108:0/3638165379; not ready for session (expect reconnect)
Nov 28 04:56:32 localhost ceph-mon[292954]: mon.np0005538513@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538515"} v 0)
Nov 28 04:56:32 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 04:56:32 localhost ceph-mgr[286105]: mgr finish mon failed to return metadata for mon.np0005538515: (22) Invalid argument
Nov 28 04:56:32 localhost ceph-mon[292954]: log_channel(cluster) log [INF] : mon.np0005538513 is new leader, mons np0005538513,np0005538514 in quorum (ranks 0,1)
Nov 28 04:56:32 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : monmap epoch 15
Nov 28 04:56:32 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 04:56:32 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : last_changed 2025-11-28T09:56:27.227153+0000
Nov 28 04:56:32 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : created 2025-11-28T07:45:36.120469+0000
Nov 28 04:56:32 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef)
Nov 28 04:56:32 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : election_strategy: 1
Nov 28 04:56:32 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005538513
Nov 28 04:56:32 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538514
Nov 28 04:56:32 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : 2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005538515
Nov 28 04:56:32 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 28 04:56:32 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 04:56:32 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e90: 6 total, 6 up, 6 in
Nov 28 04:56:32 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e34: np0005538513.dsfdlx(active, since 72s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 04:56:32 localhost ceph-mon[292954]: log_channel(cluster) log [WRN] : Health check failed: 1/3 mons down, quorum np0005538513,np0005538514 (MON_DOWN)
Nov 28 04:56:32 localhost ceph-mon[292954]: log_channel(cluster) log [WRN] : Health detail: HEALTH_WARN 1/3 mons down, quorum np0005538513,np0005538514
Nov 28 04:56:32 localhost ceph-mon[292954]: log_channel(cluster) log [WRN] : [WRN] MON_DOWN: 1/3 mons down, quorum np0005538513,np0005538514
Nov 28 04:56:32 localhost ceph-mon[292954]: log_channel(cluster) log [WRN] : mon.np0005538515 (rank 2) addr [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] is down (out of quorum)
Nov 28 04:56:32 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:32 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 04:56:32 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:32 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)...
Nov 28 04:56:32 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)...
Nov 28 04:56:32 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Nov 28 04:56:32 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 04:56:32 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mgr services"} v 0)
Nov 28 04:56:32 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mgr services"} : dispatch
Nov 28 04:56:32 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 04:56:32 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 04:56:32 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain
Nov 28 04:56:32 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain
Nov 28 04:56:32 localhost ceph-mon[292954]: Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)...
Nov 28 04:56:32 localhost ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain
Nov 28 04:56:32 localhost ceph-mon[292954]: mon.np0005538513 calling monitor election
Nov 28 04:56:32 localhost ceph-mon[292954]: mon.np0005538514 calling monitor election
Nov 28 04:56:32 localhost ceph-mon[292954]: mon.np0005538513 is new leader, mons np0005538513,np0005538514 in quorum (ranks 0,1)
Nov 28 04:56:32 localhost ceph-mon[292954]: Health check failed: 1/3 mons down, quorum np0005538513,np0005538514 (MON_DOWN)
Nov 28 04:56:32 localhost ceph-mon[292954]: Health detail: HEALTH_WARN 1/3 mons down, quorum np0005538513,np0005538514
Nov 28 04:56:32 localhost ceph-mon[292954]: [WRN] MON_DOWN: 1/3 mons down, quorum np0005538513,np0005538514
Nov 28 04:56:32 localhost ceph-mon[292954]: mon.np0005538515 (rank 2) addr [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] is down (out of quorum)
Nov 28 04:56:32 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:32 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:32 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538515.yfkzhl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Nov 28 04:56:32 localhost nova_compute[279673]: 2025-11-28 09:56:32.798 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 04:56:33 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 04:56:33 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:33 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 04:56:33 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:33 localhost ceph-mgr[286105]: mgr.server handle_open ignoring open from mon.np0005538515 172.18.0.108:0/3638165379; not ready for session (expect reconnect)
Nov 28 04:56:33 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538515"} v 0)
Nov 28 04:56:33 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 04:56:33 localhost ceph-mgr[286105]: mgr finish mon failed to return metadata for mon.np0005538515: (22) Invalid argument
Nov 28 04:56:33 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 04:56:33 localhost nova_compute[279673]: 2025-11-28 09:56:33.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 04:56:33 localhost nova_compute[279673]: 2025-11-28 09:56:33.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 04:56:33 localhost nova_compute[279673]: 2025-11-28 09:56:33.772 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 04:56:33 localhost nova_compute[279673]: 2025-11-28 09:56:33.772 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 04:56:33 localhost nova_compute[279673]: 2025-11-28 09:56:33.912 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 04:56:34 localhost ceph-mon[292954]: Reconfiguring mgr.np0005538515.yfkzhl (monmap changed)...
Nov 28 04:56:34 localhost ceph-mon[292954]: Reconfiguring daemon mgr.np0005538515.yfkzhl on np0005538515.localdomain
Nov 28 04:56:34 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:34 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:34 localhost ceph-mgr[286105]: mgr.server handle_open ignoring open from mon.np0005538515 172.18.0.108:0/3638165379; not ready for session (expect reconnect)
Nov 28 04:56:34 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538515"} v 0)
Nov 28 04:56:34 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 04:56:34 localhost ceph-mgr[286105]: mgr finish mon failed to return metadata for mon.np0005538515: (22) Invalid argument
Nov 28 04:56:34 localhost ceph-mon[292954]: log_channel(cluster) log [INF] : mon.np0005538513 calling monitor election
Nov 28 04:56:34 localhost ceph-mon[292954]: paxos.0).electionLogic(52) init, last seen epoch 52
Nov 28 04:56:34 localhost ceph-mon[292954]: mon.np0005538513@0(electing) e15 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 28 04:56:34 localhost ceph-mon[292954]: log_channel(cluster) log [INF] : mon.np0005538513 is new leader, mons np0005538513,np0005538514,np0005538515 in quorum (ranks 0,1,2)
Nov 28 04:56:34 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : monmap epoch 15
Nov 28 04:56:34 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : fsid 2c5417c9-00eb-57d5-a565-ddecbc7995c1
Nov 28 04:56:34 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : last_changed 2025-11-28T09:56:27.227153+0000
Nov 28 04:56:34 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : created 2025-11-28T07:45:36.120469+0000
Nov 28 04:56:34 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef)
Nov 28 04:56:34 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : election_strategy: 1
Nov 28 04:56:34 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005538513
Nov 28 04:56:34 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005538514
Nov 28 04:56:34 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : 2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005538515
Nov 28 04:56:34 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Nov 28 04:56:34 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005538514.umgtoy=up:active} 2 up:standby
Nov 28 04:56:34 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e90: 6 total, 6 up, 6 in
Nov 28 04:56:34 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e34: np0005538513.dsfdlx(active, since 74s), standbys: np0005538514.djozup, np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 04:56:34 localhost ceph-mon[292954]: log_channel(cluster) log [INF] : Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005538513,np0005538514)
Nov 28 04:56:34 localhost ceph-mon[292954]: log_channel(cluster) log [INF] : Cluster is now healthy
Nov 28 04:56:34 localhost ceph-mon[292954]: log_channel(cluster) log [INF] : overall HEALTH_OK
Nov 28 04:56:34 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 04:56:34 localhost nova_compute[279673]: 2025-11-28 09:56:34.768 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 04:56:34 localhost nova_compute[279673]: 2025-11-28 09:56:34.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 04:56:34 localhost nova_compute[279673]: 2025-11-28 09:56:34.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 04:56:34 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 04:56:34 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:34 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 04:56:34 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:35 localhost ceph-mon[292954]: mon.np0005538515 calling monitor election
Nov 28 04:56:35 localhost ceph-mon[292954]: mon.np0005538513 calling monitor election
Nov 28 04:56:35 localhost ceph-mon[292954]: mon.np0005538515 calling monitor election
Nov 28 04:56:35 localhost ceph-mon[292954]: mon.np0005538513 is new leader, mons np0005538513,np0005538514,np0005538515 in quorum (ranks 0,1,2)
Nov 28 04:56:35 localhost ceph-mon[292954]: mon.np0005538514 calling monitor election
Nov 28 04:56:35 localhost ceph-mon[292954]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005538513,np0005538514)
Nov 28 04:56:35 localhost ceph-mon[292954]: Cluster is now healthy
Nov 28 04:56:35 localhost ceph-mon[292954]: overall HEALTH_OK
Nov 28 04:56:35 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:35 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:35 localhost ceph-mgr[286105]: mgr.server handle_open ignoring open from mon.np0005538515 172.18.0.108:0/3638165379; not ready for session (expect reconnect)
Nov 28 04:56:35 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005538515"} v 0)
Nov 28 04:56:35 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "mon metadata", "id": "np0005538515"} : dispatch
Nov 28 04:56:35 localhost nova_compute[279673]: 2025-11-28 09:56:35.692 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 04:56:35 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 04:56:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 04:56:35 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 04:56:35 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 04:56:35 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Nov 28 04:56:35 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 04:56:35 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 04:56:35 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 04:56:35 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 04:56:35 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 04:56:35 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 04:56:35 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 04:56:35 localhost podman[302831]: 2025-11-28 09:56:35.863444554 +0000 UTC m=+0.090557909 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, version=9.6)
Nov 28 04:56:35 localhost podman[302831]: 2025-11-28 09:56:35.881347785 +0000 UTC m=+0.108461120 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, io.buildah.version=1.33.7)
Nov 28 04:56:35 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 04:56:36 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 04:56:36 localhost ceph-mgr[286105]: mgr.server handle_report got status from non-daemon mon.np0005538515
Nov 28 04:56:36 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:36.231+0000 7fc4e28b7640 -1 mgr.server handle_report got status from non-daemon mon.np0005538515
Nov 28 04:56:36 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 04:56:36 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 04:56:36 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 04:56:36 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 04:56:36 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 04:56:36 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 04:56:37 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 04:56:37 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:37 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 04:56:37 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:37 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 04:56:37 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:37 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 04:56:37 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:37 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 04:56:37 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:37 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 04:56:37 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:37 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 04:56:37 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:37 localhost ceph-mgr[286105]: [progress INFO root] update: starting ev 11a5422d-8b81-46a0-a867-01b1dfdefb4c (Updating node-proxy deployment (+3 -> 3))
Nov 28 04:56:37 localhost ceph-mgr[286105]: [progress INFO root] complete: finished ev 11a5422d-8b81-46a0-a867-01b1dfdefb4c (Updating node-proxy deployment (+3 -> 3))
Nov 28 04:56:37 localhost ceph-mgr[286105]: [progress INFO root] Completed event 11a5422d-8b81-46a0-a867-01b1dfdefb4c (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Nov 28 04:56:37 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Nov 28 04:56:37 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Nov 28 04:56:37 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 04:56:37 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 04:56:37 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 04:56:37 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:37 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:37 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:37 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:37 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:37 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:37 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:37 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 04:56:37 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 04:56:37 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Nov 28 04:56:37 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 04:56:37 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 04:56:37 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 04:56:37 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 04:56:37 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 04:56:37 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 04:56:37 localhost nova_compute[279673]: 2025-11-28 09:56:37.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 04:56:37 localhost nova_compute[279673]: 2025-11-28 09:56:37.789 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 04:56:37 localhost nova_compute[279673]: 2025-11-28 09:56:37.790 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 04:56:37 localhost nova_compute[279673]: 2025-11-28 09:56:37.790 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 04:56:37 localhost nova_compute[279673]: 2025-11-28 09:56:37.791 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 04:56:37 localhost nova_compute[279673]: 2025-11-28 09:56:37.791 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 04:56:38 localhost podman[303263]:
Nov 28 04:56:38 localhost podman[303263]: 2025-11-28 09:56:38.0282842 +0000 UTC m=+0.072884294 container create 31e3568e5d4528293458187efbe96961428e36ea6625ac0c9593c931aca3f5e2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_chandrasekhar, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=553, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , RELEASE=main, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553)
Nov 28 04:56:38 localhost systemd[1]: Started libpod-conmon-31e3568e5d4528293458187efbe96961428e36ea6625ac0c9593c931aca3f5e2.scope.
Nov 28 04:56:38 localhost systemd[1]: Started libcrun container.
Nov 28 04:56:38 localhost podman[303263]: 2025-11-28 09:56:38.095812558 +0000 UTC m=+0.140412662 container init 31e3568e5d4528293458187efbe96961428e36ea6625ac0c9593c931aca3f5e2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_chandrasekhar, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, architecture=x86_64, io.buildah.version=1.33.12, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, release=553, ceph=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_BRANCH=main, RELEASE=main, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Nov 28 04:56:38 localhost podman[303263]: 2025-11-28 09:56:37.998511433 +0000 UTC m=+0.043111547 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 04:56:38 localhost podman[303263]: 2025-11-28 09:56:38.105697442 +0000 UTC m=+0.150297516 container start 31e3568e5d4528293458187efbe96961428e36ea6625ac0c9593c931aca3f5e2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_chandrasekhar, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, version=7, io.buildah.version=1.33.12, release=553, ceph=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git)
Nov 28 04:56:38 localhost podman[303263]: 2025-11-28 09:56:38.10594503 +0000 UTC m=+0.150545134 container attach 31e3568e5d4528293458187efbe96961428e36ea6625ac0c9593c931aca3f5e2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_chandrasekhar, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, build-date=2025-09-24T08:57:55, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , RELEASE=main, io.openshift.expose-services=, ceph=True, version=7, architecture=x86_64)
Nov 28 04:56:38 localhost compassionate_chandrasekhar[303278]: 167 167
Nov 28 04:56:38 localhost systemd[1]: libpod-31e3568e5d4528293458187efbe96961428e36ea6625ac0c9593c931aca3f5e2.scope: Deactivated successfully.
Nov 28 04:56:38 localhost podman[303263]: 2025-11-28 09:56:38.108910381 +0000 UTC m=+0.153510485 container died 31e3568e5d4528293458187efbe96961428e36ea6625ac0c9593c931aca3f5e2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_chandrasekhar, RELEASE=main, vcs-type=git, name=rhceph, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-09-24T08:57:55, version=7, GIT_CLEAN=True, architecture=x86_64, distribution-scope=public, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_BRANCH=main, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.buildah.version=1.33.12, ceph=True)
Nov 28 04:56:38 localhost podman[303283]: 2025-11-28 09:56:38.20243468 +0000 UTC m=+0.083883193 container remove 31e3568e5d4528293458187efbe96961428e36ea6625ac0c9593c931aca3f5e2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_chandrasekhar, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, release=553, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_BRANCH=main, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, vcs-type=git, RELEASE=main)
Nov 28 04:56:38 localhost systemd[1]: libpod-conmon-31e3568e5d4528293458187efbe96961428e36ea6625ac0c9593c931aca3f5e2.scope: Deactivated successfully.
Nov 28 04:56:38 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 04:56:38 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 04:56:38 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 04:56:38 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538513.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Nov 28 04:56:38 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 04:56:38 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1396630985' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 04:56:38 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 04:56:38 localhost nova_compute[279673]: 2025-11-28 09:56:38.282 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 04:56:38 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:38 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 04:56:38 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:38 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)...
Nov 28 04:56:38 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)...
Nov 28 04:56:38 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0)
Nov 28 04:56:38 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 04:56:38 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 04:56:38 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 04:56:38 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 04:56:38 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 04:56:38 localhost nova_compute[279673]: 2025-11-28 09:56:38.367 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 04:56:38 localhost nova_compute[279673]: 2025-11-28 09:56:38.369 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 04:56:38 localhost nova_compute[279673]: 2025-11-28 09:56:38.570 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 04:56:38 localhost nova_compute[279673]: 2025-11-28 09:56:38.572 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11721MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 04:56:38 localhost nova_compute[279673]: 2025-11-28 09:56:38.572 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 04:56:38 localhost nova_compute[279673]: 2025-11-28 09:56:38.572 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 04:56:38 localhost nova_compute[279673]: 2025-11-28 09:56:38.670 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 04:56:38 localhost nova_compute[279673]: 2025-11-28 09:56:38.671 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 04:56:38 localhost nova_compute[279673]: 2025-11-28 09:56:38.671 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 04:56:38 localhost nova_compute[279673]: 2025-11-28 09:56:38.719 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 04:56:38 localhost nova_compute[279673]: 2025-11-28 09:56:38.916 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 04:56:38 localhost podman[303375]:
Nov 28 04:56:38 localhost podman[303375]: 2025-11-28 09:56:38.97759227 +0000 UTC m=+0.093730696 container create 2293bc1b166644dd665bbb376e12644b2305e3ab4579552f7063c3ef58732e22 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_tesla, build-date=2025-09-24T08:57:55, ceph=True, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux , release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, RELEASE=main, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3)
Nov 28 04:56:39 localhost systemd[1]: Started libpod-conmon-2293bc1b166644dd665bbb376e12644b2305e3ab4579552f7063c3ef58732e22.scope.
Nov 28 04:56:39 localhost systemd[1]: Started libcrun container.
Nov 28 04:56:39 localhost systemd[1]: tmp-crun.bIHuqE.mount: Deactivated successfully.
Nov 28 04:56:39 localhost systemd[1]: var-lib-containers-storage-overlay-edd0a6ce3fe658326a09178f2ffecc47f6c7c755832390f5d49c466a077806d2-merged.mount: Deactivated successfully.
Nov 28 04:56:39 localhost podman[303375]: 2025-11-28 09:56:38.939315832 +0000 UTC m=+0.055454258 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 04:56:39 localhost podman[303375]: 2025-11-28 09:56:39.046857902 +0000 UTC m=+0.162996298 container init 2293bc1b166644dd665bbb376e12644b2305e3ab4579552f7063c3ef58732e22 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_tesla, ceph=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, version=7, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, GIT_BRANCH=main, CEPH_POINT_RELEASE=, distribution-scope=public)
Nov 28 04:56:39 localhost systemd[1]: tmp-crun.IZN1Zi.mount: Deactivated successfully.
Nov 28 04:56:39 localhost podman[303375]: 2025-11-28 09:56:39.065131275 +0000 UTC m=+0.181269711 container start 2293bc1b166644dd665bbb376e12644b2305e3ab4579552f7063c3ef58732e22 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_tesla, description=Red Hat Ceph Storage 7, ceph=True, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, name=rhceph, release=553, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, RELEASE=main, io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, build-date=2025-09-24T08:57:55, com.redhat.component=rhceph-container, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc.) 
Nov 28 04:56:39 localhost podman[303375]: 2025-11-28 09:56:39.065516807 +0000 UTC m=+0.181655293 container attach 2293bc1b166644dd665bbb376e12644b2305e3ab4579552f7063c3ef58732e22 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_tesla, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, ceph=True, vcs-type=git, RELEASE=main, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, GIT_BRANCH=main, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.openshift.expose-services=)
Nov 28 04:56:39 localhost funny_tesla[303389]: 167 167
Nov 28 04:56:39 localhost systemd[1]: libpod-2293bc1b166644dd665bbb376e12644b2305e3ab4579552f7063c3ef58732e22.scope: Deactivated successfully.
Nov 28 04:56:39 localhost podman[303375]: 2025-11-28 09:56:39.070054946 +0000 UTC m=+0.186193402 container died 2293bc1b166644dd665bbb376e12644b2305e3ab4579552f7063c3ef58732e22 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_tesla, io.openshift.expose-services=, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vendor=Red Hat, Inc., name=rhceph, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, RELEASE=main, build-date=2025-09-24T08:57:55, ceph=True, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main)
Nov 28 04:56:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 04:56:39 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2861436119' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 04:56:39 localhost nova_compute[279673]: 2025-11-28 09:56:39.180 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 04:56:39 localhost podman[303394]: 2025-11-28 09:56:39.18127507 +0000 UTC m=+0.099504684 container remove 2293bc1b166644dd665bbb376e12644b2305e3ab4579552f7063c3ef58732e22 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_tesla, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_BRANCH=main, architecture=x86_64, release=553, GIT_CLEAN=True, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.expose-services=, name=rhceph)
Nov 28 04:56:39 localhost nova_compute[279673]: 2025-11-28 09:56:39.188 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 04:56:39 localhost systemd[1]: libpod-conmon-2293bc1b166644dd665bbb376e12644b2305e3ab4579552f7063c3ef58732e22.scope: Deactivated successfully.
Nov 28 04:56:39 localhost nova_compute[279673]: 2025-11-28 09:56:39.206 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 04:56:39 localhost nova_compute[279673]: 2025-11-28 09:56:39.210 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 04:56:39 localhost nova_compute[279673]: 2025-11-28 09:56:39.211 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.639s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 04:56:39 localhost ceph-mon[292954]: Reconfiguring crash.np0005538513 (monmap changed)...
Nov 28 04:56:39 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538513 on np0005538513.localdomain
Nov 28 04:56:39 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:39 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:39 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Nov 28 04:56:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 04:56:39 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 04:56:39 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:39 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)...
Nov 28 04:56:39 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)...
Nov 28 04:56:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.5"} v 0)
Nov 28 04:56:39 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 04:56:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 04:56:39 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 04:56:39 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 04:56:39 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 04:56:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 04:56:39 localhost ceph-mgr[286105]: log_channel(audit) log [DBG] : from='client.44535 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Nov 28 04:56:39 localhost ceph-mgr[286105]: [cephadm INFO root] Reconfig service osd.default_drive_group
Nov 28 04:56:39 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfig service osd.default_drive_group
Nov 28 04:56:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 04:56:39 localhost ceph-mgr[286105]: log_channel(cluster) log [DBG] : pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Nov 28 04:56:39 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 04:56:39 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 04:56:39 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 04:56:39 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 04:56:39 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 04:56:39 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 04:56:39 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 04:56:39 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 04:56:39 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 04:56:39 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 04:56:39 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 04:56:39 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:40 localhost systemd[1]: var-lib-containers-storage-overlay-630a1a762f124f3926c6225ad15e484fa1812775dbba5d8882661eb4f503739d-merged.mount: Deactivated successfully.
Nov 28 04:56:40 localhost podman[303473]:
Nov 28 04:56:40 localhost podman[303473]: 2025-11-28 09:56:40.064130405 +0000 UTC m=+0.077448375 container create ef5bcc9afd923683dc91809f084f19cc4164158329a4f755d6cb0c376743377d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_liskov, io.openshift.tags=rhceph ceph, version=7, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, architecture=x86_64, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, name=rhceph, io.openshift.expose-services=, vcs-type=git, maintainer=Guillaume Abrioux , RELEASE=main, release=553)
Nov 28 04:56:40 localhost podman[238687]: time="2025-11-28T09:56:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 04:56:40 localhost systemd[1]: Started libpod-conmon-ef5bcc9afd923683dc91809f084f19cc4164158329a4f755d6cb0c376743377d.scope.
Nov 28 04:56:40 localhost systemd[1]: Started libcrun container.
Nov 28 04:56:40 localhost podman[303473]: 2025-11-28 09:56:40.032984226 +0000 UTC m=+0.046302226 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 04:56:40 localhost podman[303473]: 2025-11-28 09:56:40.138339189 +0000 UTC m=+0.151657169 container init ef5bcc9afd923683dc91809f084f19cc4164158329a4f755d6cb0c376743377d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_liskov, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, ceph=True, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, com.redhat.component=rhceph-container, vcs-type=git, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, io.openshift.expose-services=, distribution-scope=public, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, version=7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc.)
Nov 28 04:56:40 localhost podman[303473]: 2025-11-28 09:56:40.148040898 +0000 UTC m=+0.161358868 container start ef5bcc9afd923683dc91809f084f19cc4164158329a4f755d6cb0c376743377d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_liskov, GIT_BRANCH=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, vcs-type=git, GIT_CLEAN=True, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/agreements, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, build-date=2025-09-24T08:57:55, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , RELEASE=main, vendor=Red Hat, Inc., name=rhceph, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12)
Nov 28 04:56:40 localhost podman[303473]: 2025-11-28 09:56:40.148242764 +0000 UTC m=+0.161560954 container attach ef5bcc9afd923683dc91809f084f19cc4164158329a4f755d6cb0c376743377d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_liskov, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_CLEAN=True, vcs-type=git, io.openshift.expose-services=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, distribution-scope=public, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, ceph=True, CEPH_POINT_RELEASE=, name=rhceph, release=553, vendor=Red Hat, Inc.)
Nov 28 04:56:40 localhost blissful_liskov[303489]: 167 167
Nov 28 04:56:40 localhost systemd[1]: libpod-ef5bcc9afd923683dc91809f084f19cc4164158329a4f755d6cb0c376743377d.scope: Deactivated successfully.
Nov 28 04:56:40 localhost podman[303473]: 2025-11-28 09:56:40.151439162 +0000 UTC m=+0.164757182 container died ef5bcc9afd923683dc91809f084f19cc4164158329a4f755d6cb0c376743377d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_liskov, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, ceph=True, vcs-type=git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, release=553, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, name=rhceph)
Nov 28 04:56:40 localhost podman[238687]: @ - - [28/Nov/2025:09:56:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155578 "" "Go-http-client/1.1"
Nov 28 04:56:40 localhost podman[238687]: @ - - [28/Nov/2025:09:56:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19045 "" "Go-http-client/1.1"
Nov 28 04:56:40 localhost podman[303494]: 2025-11-28 09:56:40.276812731 +0000 UTC m=+0.113587206 container remove ef5bcc9afd923683dc91809f084f19cc4164158329a4f755d6cb0c376743377d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_liskov, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, distribution-scope=public, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, name=rhceph, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, release=553, com.redhat.component=rhceph-container, vcs-type=git, version=7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, RELEASE=main, ceph=True, io.openshift.tags=rhceph ceph, io.buildah.version=1.33.12, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 04:56:40 localhost systemd[1]: libpod-conmon-ef5bcc9afd923683dc91809f084f19cc4164158329a4f755d6cb0c376743377d.scope: Deactivated successfully.
Nov 28 04:56:40 localhost ceph-mon[292954]: Reconfiguring osd.2 (monmap changed)...
Nov 28 04:56:40 localhost ceph-mon[292954]: Reconfiguring daemon osd.2 on np0005538513.localdomain
Nov 28 04:56:40 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:40 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:40 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Nov 28 04:56:40 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:40 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:40 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:40 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:40 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:40 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:40 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:40 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:40 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:40 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:40 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:40 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:40 localhost ceph-mgr[286105]: [progress INFO root] Writing back 50 completed events
Nov 28 04:56:40 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 04:56:40 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:40 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 04:56:40 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:40 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 04:56:40 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:40 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 04:56:40 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:40 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 04:56:40 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:40 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 04:56:40 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)...
Nov 28 04:56:40 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Nov 28 04:56:40 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 04:56:40 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 28 04:56:40 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Nov 28 04:56:40 localhost ceph-mgr[286105]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 04:56:40 localhost ceph-mgr[286105]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain
Nov 28 04:56:40 localhost nova_compute[279673]: 2025-11-28 09:56:40.716 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:56:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 04:56:41 localhost systemd[1]: var-lib-containers-storage-overlay-32579a85f17ae0b9919982077d9cde35bfdd8958595e703f01f98577d050a365-merged.mount: Deactivated successfully.
Nov 28 04:56:41 localhost podman[303569]: 2025-11-28 09:56:41.113784664 +0000 UTC m=+0.096150780 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 04:56:41 localhost podman[303569]: 2025-11-28 09:56:41.129376135 +0000 UTC m=+0.111742291 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Nov 28 04:56:41 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 04:56:41 localhost podman[303577]:
Nov 28 04:56:41 localhost podman[303577]: 2025-11-28 09:56:41.185178942 +0000 UTC m=+0.137433842 container create e3a7233f00fc25db74dbfe7c8b5d3a656a39bd2d16c0518429c77dbef62649d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_bartik, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, distribution-scope=public, RELEASE=main, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.12, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7)
Nov 28 04:56:41 localhost podman[303577]: 2025-11-28 09:56:41.09901159 +0000 UTC m=+0.051266490 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Nov 28 04:56:41 localhost systemd[1]: Started libpod-conmon-e3a7233f00fc25db74dbfe7c8b5d3a656a39bd2d16c0518429c77dbef62649d7.scope.
Nov 28 04:56:41 localhost systemd[1]: Started libcrun container.
Nov 28 04:56:41 localhost podman[303577]: 2025-11-28 09:56:41.272419498 +0000 UTC m=+0.224674398 container init e3a7233f00fc25db74dbfe7c8b5d3a656a39bd2d16c0518429c77dbef62649d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_bartik, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, com.redhat.component=rhceph-container, distribution-scope=public, release=553, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, architecture=x86_64, io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, ceph=True)
Nov 28 04:56:41 localhost podman[303577]: 2025-11-28 09:56:41.283047294 +0000 UTC m=+0.235302194 container start e3a7233f00fc25db74dbfe7c8b5d3a656a39bd2d16c0518429c77dbef62649d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_bartik, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.license_terms=https://www.redhat.com/agreements, release=553, vendor=Red Hat, Inc., ceph=True, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, distribution-scope=public, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, GIT_CLEAN=True, vcs-type=git, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d)
Nov 28 04:56:41 localhost podman[303577]: 2025-11-28 09:56:41.283415345 +0000 UTC m=+0.235670245 container attach e3a7233f00fc25db74dbfe7c8b5d3a656a39bd2d16c0518429c77dbef62649d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_bartik, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, RELEASE=main, vcs-type=git, maintainer=Guillaume Abrioux , GIT_BRANCH=main, GIT_CLEAN=True, CEPH_POINT_RELEASE=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, version=7, distribution-scope=public, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, release=553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, build-date=2025-09-24T08:57:55)
Nov 28 04:56:41 localhost gallant_bartik[303609]: 167 167
Nov 28 04:56:41 localhost systemd[1]: libpod-e3a7233f00fc25db74dbfe7c8b5d3a656a39bd2d16c0518429c77dbef62649d7.scope: Deactivated successfully.
Nov 28 04:56:41 localhost podman[303577]: 2025-11-28 09:56:41.287475591 +0000 UTC m=+0.239730531 container died e3a7233f00fc25db74dbfe7c8b5d3a656a39bd2d16c0518429c77dbef62649d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_bartik, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, architecture=x86_64, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, name=rhceph, version=7, release=553, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55, RELEASE=main, vcs-type=git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 04:56:41 localhost ceph-mon[292954]: Reconfiguring osd.5 (monmap changed)...
Nov 28 04:56:41 localhost ceph-mon[292954]: Reconfiguring daemon osd.5 on np0005538513.localdomain
Nov 28 04:56:41 localhost ceph-mon[292954]: Reconfig service osd.default_drive_group
Nov 28 04:56:41 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:41 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:41 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:41 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:41 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:41 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Nov 28 04:56:41 localhost podman[303614]: 2025-11-28 09:56:41.390792371 +0000 UTC m=+0.090498656 container remove e3a7233f00fc25db74dbfe7c8b5d3a656a39bd2d16c0518429c77dbef62649d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_bartik, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, version=7, io.openshift.expose-services=, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, vcs-type=git, CEPH_POINT_RELEASE=, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux )
Nov 28 04:56:41 localhost systemd[1]: libpod-conmon-e3a7233f00fc25db74dbfe7c8b5d3a656a39bd2d16c0518429c77dbef62649d7.scope: Deactivated successfully.
Nov 28 04:56:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mgr fail"} v 0)
Nov 28 04:56:41 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='client.? 172.18.0.200:0/937537164' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Nov 28 04:56:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e90 do_prune osdmap full prune enabled
Nov 28 04:56:41 localhost ceph-mon[292954]: log_channel(cluster) log [INF] : Activating manager daemon np0005538514.djozup
Nov 28 04:56:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 e91: 6 total, 6 up, 6 in
Nov 28 04:56:41 localhost ceph-mgr[286105]: mgr handle_mgr_map I was active but no longer am
Nov 28 04:56:41 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:41.482+0000 7fc53e906640 -1 mgr handle_mgr_map I was active but no longer am
Nov 28 04:56:41 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e91: 6 total, 6 up, 6 in
Nov 28 04:56:41 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='client.? 172.18.0.200:0/937537164' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Nov 28 04:56:41 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e35: np0005538514.djozup(active, starting, since 0.0458243s), standbys: np0005538515.yfkzhl, np0005538512.zyhkxs
Nov 28 04:56:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 04:56:41 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx'
Nov 28 04:56:41 localhost ceph-mon[292954]: log_channel(cluster) log [INF] : Manager daemon np0005538514.djozup is now available
Nov 28 04:56:41 localhost systemd[1]: session-69.scope: Deactivated successfully.
Nov 28 04:56:41 localhost systemd[1]: session-69.scope: Consumed 28.760s CPU time.
Nov 28 04:56:41 localhost systemd-logind[764]: Session 69 logged out. Waiting for processes to exit.
Nov 28 04:56:41 localhost systemd-logind[764]: Removed session 69.
Nov 28 04:56:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain.devices.0"} v 0) Nov 28 04:56:41 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain.devices.0"} : dispatch Nov 28 04:56:41 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain.devices.0"}]': finished Nov 28 04:56:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain.devices.0"} v 0) Nov 28 04:56:41 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain.devices.0"} : dispatch Nov 28 04:56:41 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: ignoring --setuser ceph since I am not root Nov 28 04:56:41 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: ignoring --setgroup ceph since I am not root Nov 28 04:56:41 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain.devices.0"}]': finished Nov 28 04:56:41 localhost ceph-mgr[286105]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mgr, pid 2 Nov 28 04:56:41 localhost ceph-mgr[286105]: pidfile_write: ignore empty --pid-file Nov 28 04:56:41 localhost ceph-mgr[286105]: mgr[py] Loading python module 'alerts' Nov 28 04:56:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 
handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/mirror_snapshot_schedule"} v 0) Nov 28 04:56:41 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/mirror_snapshot_schedule"} : dispatch Nov 28 04:56:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/trash_purge_schedule"} v 0) Nov 28 04:56:41 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/trash_purge_schedule"} : dispatch Nov 28 04:56:41 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:41.719+0000 7f02af17f140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member Nov 28 04:56:41 localhost ceph-mgr[286105]: mgr[py] Module alerts has missing NOTIFY_TYPES member Nov 28 04:56:41 localhost ceph-mgr[286105]: mgr[py] Loading python module 'balancer' Nov 28 04:56:41 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:41.788+0000 7f02af17f140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member Nov 28 04:56:41 localhost ceph-mgr[286105]: mgr[py] Module balancer has missing NOTIFY_TYPES member Nov 28 04:56:41 localhost ceph-mgr[286105]: mgr[py] Loading python module 'cephadm' Nov 28 04:56:41 localhost sshd[303654]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:56:41 localhost systemd-logind[764]: New session 71 of user ceph-admin. Nov 28 04:56:41 localhost systemd[1]: Started Session 71 of User ceph-admin. 
Nov 28 04:56:42 localhost systemd[1]: var-lib-containers-storage-overlay-78e92897badadd3a0d4e6279f34a3c1d6f59b1d9ac087caa19b8c5bcc97568bd-merged.mount: Deactivated successfully. Nov 28 04:56:42 localhost ceph-mon[292954]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)... Nov 28 04:56:42 localhost ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain Nov 28 04:56:42 localhost ceph-mon[292954]: from='client.? 172.18.0.200:0/937537164' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 28 04:56:42 localhost ceph-mon[292954]: Activating manager daemon np0005538514.djozup Nov 28 04:56:42 localhost ceph-mon[292954]: from='client.? 172.18.0.200:0/937537164' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Nov 28 04:56:42 localhost ceph-mon[292954]: from='mgr.26581 172.18.0.106:0/4290692976' entity='mgr.np0005538513.dsfdlx' Nov 28 04:56:42 localhost ceph-mon[292954]: Manager daemon np0005538514.djozup is now available Nov 28 04:56:42 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain.devices.0"} : dispatch Nov 28 04:56:42 localhost ceph-mon[292954]: removing stray HostCache host record np0005538512.localdomain.devices.0 Nov 28 04:56:42 localhost ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain.devices.0"} : dispatch Nov 28 04:56:42 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain.devices.0"}]': finished Nov 28 04:56:42 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain.devices.0"} : dispatch Nov 28 04:56:42 localhost ceph-mon[292954]: from='mgr.34348 
172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain.devices.0"} : dispatch Nov 28 04:56:42 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005538512.localdomain.devices.0"}]': finished Nov 28 04:56:42 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/mirror_snapshot_schedule"} : dispatch Nov 28 04:56:42 localhost ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/mirror_snapshot_schedule"} : dispatch Nov 28 04:56:42 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/trash_purge_schedule"} : dispatch Nov 28 04:56:42 localhost ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538514.djozup/trash_purge_schedule"} : dispatch Nov 28 04:56:42 localhost ceph-mgr[286105]: mgr[py] Loading python module 'crash' Nov 28 04:56:42 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:42.439+0000 7f02af17f140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member Nov 28 04:56:42 localhost ceph-mgr[286105]: mgr[py] Module crash has missing NOTIFY_TYPES member Nov 28 04:56:42 localhost ceph-mgr[286105]: mgr[py] Loading python module 'dashboard' Nov 28 04:56:42 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e36: np0005538514.djozup(active, since 1.10238s), standbys: np0005538515.yfkzhl, np0005538512.zyhkxs Nov 28 04:56:42 localhost ceph-mgr[286105]: mgr[py] Loading python module 
'devicehealth' Nov 28 04:56:43 localhost podman[303768]: 2025-11-28 09:56:43.003078198 +0000 UTC m=+0.078389654 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, version=7, GIT_BRANCH=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.component=rhceph-container, io.openshift.expose-services=, architecture=x86_64, build-date=2025-09-24T08:57:55, io.openshift.tags=rhceph ceph, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_CLEAN=True, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, ceph=True, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=) Nov 28 04:56:43 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:43.009+0000 7f02af17f140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member Nov 28 04:56:43 localhost ceph-mgr[286105]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member Nov 28 04:56:43 localhost ceph-mgr[286105]: mgr[py] Loading python module 'diskprediction_local' Nov 28 04:56:43 localhost podman[303768]: 2025-11-28 09:56:43.09342516 +0000 UTC m=+0.168736666 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, 
name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, release=553, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, build-date=2025-09-24T08:57:55, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, distribution-scope=public, GIT_BRANCH=main, GIT_CLEAN=True, RELEASE=main, com.redhat.component=rhceph-container) Nov 28 04:56:43 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode. Nov 28 04:56:43 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve. 
Nov 28 04:56:43 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: from numpy import show_config as show_numpy_config Nov 28 04:56:43 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:43.138+0000 7f02af17f140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member Nov 28 04:56:43 localhost ceph-mgr[286105]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member Nov 28 04:56:43 localhost ceph-mgr[286105]: mgr[py] Loading python module 'influx' Nov 28 04:56:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:56:43 localhost ceph-mgr[286105]: mgr[py] Module influx has missing NOTIFY_TYPES member Nov 28 04:56:43 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:43.196+0000 7f02af17f140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member Nov 28 04:56:43 localhost ceph-mgr[286105]: mgr[py] Loading python module 'insights' Nov 28 04:56:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. 
Nov 28 04:56:43 localhost ceph-mgr[286105]: mgr[py] Loading python module 'iostat' Nov 28 04:56:43 localhost podman[303813]: 2025-11-28 09:56:43.260143121 +0000 UTC m=+0.087812933 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0) Nov 28 04:56:43 localhost ceph-mgr[286105]: mgr[py] Module iostat has missing NOTIFY_TYPES member Nov 28 04:56:43 localhost ceph-mgr[286105]: mgr[py] Loading python module 'k8sevents' Nov 28 04:56:43 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:43.309+0000 7f02af17f140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member Nov 28 04:56:43 localhost podman[303836]: 2025-11-28 
09:56:43.364339628 +0000 UTC m=+0.107528420 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Nov 28 04:56:43 localhost podman[303836]: 2025-11-28 09:56:43.372354005 +0000 UTC m=+0.115542857 container exec_died 
ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Nov 28 04:56:43 localhost podman[303813]: 2025-11-28 09:56:43.382530359 +0000 UTC m=+0.210200191 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c 
(image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125) Nov 28 04:56:43 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:56:43 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 04:56:43 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e37: np0005538514.djozup(active, since 2s), standbys: np0005538515.yfkzhl, np0005538512.zyhkxs Nov 28 04:56:43 localhost ceph-mgr[286105]: mgr[py] Loading python module 'localpool' Nov 28 04:56:43 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0) Nov 28 04:56:43 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:43 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0) Nov 28 04:56:43 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:43 localhost ceph-mgr[286105]: mgr[py] Loading python module 'mds_autoscaler' Nov 28 04:56:43 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0) Nov 28 04:56:43 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:43 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0) Nov 28 04:56:43 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:43 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0) Nov 28 04:56:43 localhost ceph-mgr[286105]: mgr[py] Loading python module 'mirroring' Nov 28 04:56:43 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' 
entity='mgr.np0005538514.djozup' Nov 28 04:56:43 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0) Nov 28 04:56:43 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:43 localhost ceph-mgr[286105]: mgr[py] Loading python module 'nfs' Nov 28 04:56:43 localhost nova_compute[279673]: 2025-11-28 09:56:43.918 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:56:44 localhost ceph-mgr[286105]: mgr[py] Module nfs has missing NOTIFY_TYPES member Nov 28 04:56:44 localhost ceph-mgr[286105]: mgr[py] Loading python module 'orchestrator' Nov 28 04:56:44 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:44.059+0000 7f02af17f140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member Nov 28 04:56:44 localhost ceph-mgr[286105]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member Nov 28 04:56:44 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:44.201+0000 7f02af17f140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member Nov 28 04:56:44 localhost ceph-mgr[286105]: mgr[py] Loading python module 'osd_perf_query' Nov 28 04:56:44 localhost nova_compute[279673]: 2025-11-28 09:56:44.213 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:56:44 localhost nova_compute[279673]: 2025-11-28 09:56:44.213 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 28 04:56:44 localhost nova_compute[279673]: 2025-11-28 09:56:44.213 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 28 04:56:44 localhost ceph-mgr[286105]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member Nov 28 04:56:44 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:44.264+0000 7f02af17f140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member Nov 28 04:56:44 localhost ceph-mgr[286105]: mgr[py] Loading python module 'osd_support' Nov 28 04:56:44 localhost ceph-mgr[286105]: mgr[py] Module osd_support has missing NOTIFY_TYPES member Nov 28 04:56:44 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:44.319+0000 7f02af17f140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member Nov 28 04:56:44 localhost ceph-mgr[286105]: mgr[py] Loading python module 'pg_autoscaler' Nov 28 04:56:44 localhost ceph-mgr[286105]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member Nov 28 04:56:44 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:44.386+0000 7f02af17f140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member Nov 28 04:56:44 localhost ceph-mgr[286105]: mgr[py] Loading python module 'progress' Nov 28 04:56:44 localhost ceph-mgr[286105]: mgr[py] Module progress has missing NOTIFY_TYPES member Nov 28 04:56:44 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:44.446+0000 7f02af17f140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member Nov 28 04:56:44 localhost ceph-mgr[286105]: mgr[py] Loading python module 'prometheus' Nov 28 04:56:44 localhost 
nova_compute[279673]: 2025-11-28 09:56:44.499 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 04:56:44 localhost nova_compute[279673]: 2025-11-28 09:56:44.499 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 04:56:44 localhost nova_compute[279673]: 2025-11-28 09:56:44.499 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 28 04:56:44 localhost nova_compute[279673]: 2025-11-28 09:56:44.500 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 04:56:44 localhost ceph-mon[292954]: [28/Nov/2025:09:56:43] ENGINE Bus STARTING Nov 28 04:56:44 localhost ceph-mon[292954]: [28/Nov/2025:09:56:43] ENGINE Serving on https://172.18.0.107:7150 Nov 28 04:56:44 localhost ceph-mon[292954]: [28/Nov/2025:09:56:43] ENGINE Client ('172.18.0.107', 59370) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Nov 28 04:56:44 localhost ceph-mon[292954]: [28/Nov/2025:09:56:43] ENGINE Serving on http://172.18.0.107:8765 Nov 28 04:56:44 localhost ceph-mon[292954]: [28/Nov/2025:09:56:43] ENGINE Bus STARTED Nov 28 04:56:44 localhost ceph-mon[292954]: from='mgr.34348 ' 
entity='mgr.np0005538514.djozup' Nov 28 04:56:44 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:44 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:44 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:44 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:44 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:44 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:56:44 localhost ceph-mgr[286105]: mgr[py] Module prometheus has missing NOTIFY_TYPES member Nov 28 04:56:44 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:44.745+0000 7f02af17f140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member Nov 28 04:56:44 localhost ceph-mgr[286105]: mgr[py] Loading python module 'rbd_support' Nov 28 04:56:44 localhost ceph-mgr[286105]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member Nov 28 04:56:44 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:44.827+0000 7f02af17f140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member Nov 28 04:56:44 localhost ceph-mgr[286105]: mgr[py] Loading python module 'restful' Nov 28 04:56:44 localhost nova_compute[279673]: 2025-11-28 09:56:44.848 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": 
"192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 04:56:44 localhost nova_compute[279673]: 2025-11-28 09:56:44.864 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 04:56:44 localhost nova_compute[279673]: 2025-11-28 09:56:44.865 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 28 04:56:44 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0) Nov 28 04:56:44 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:44 localhost ceph-mon[292954]: 
mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0) Nov 28 04:56:44 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:45 localhost ceph-mgr[286105]: mgr[py] Loading python module 'rgw' Nov 28 04:56:45 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0) Nov 28 04:56:45 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Nov 28 04:56:45 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0) Nov 28 04:56:45 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Nov 28 04:56:45 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Nov 28 04:56:45 localhost ceph-mgr[286105]: mgr[py] Module rgw has missing NOTIFY_TYPES member Nov 28 04:56:45 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:45.151+0000 7f02af17f140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member Nov 28 04:56:45 localhost ceph-mgr[286105]: mgr[py] Loading python module 'rook' Nov 28 04:56:45 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0) Nov 28 04:56:45 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:45 localhost ceph-mon[292954]: 
mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0) Nov 28 04:56:45 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:45 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0) Nov 28 04:56:45 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Nov 28 04:56:45 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0) Nov 28 04:56:45 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Nov 28 04:56:45 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Nov 28 04:56:45 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0) Nov 28 04:56:45 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:45 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0) Nov 28 04:56:45 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:45 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0) Nov 28 
04:56:45 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Nov 28 04:56:45 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0) Nov 28 04:56:45 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Nov 28 04:56:45 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Nov 28 04:56:45 localhost ceph-mgr[286105]: mgr[py] Module rook has missing NOTIFY_TYPES member Nov 28 04:56:45 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:45.583+0000 7f02af17f140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member Nov 28 04:56:45 localhost ceph-mgr[286105]: mgr[py] Loading python module 'selftest' Nov 28 04:56:45 localhost ceph-mgr[286105]: mgr[py] Module selftest has missing NOTIFY_TYPES member Nov 28 04:56:45 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:45.644+0000 7f02af17f140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member Nov 28 04:56:45 localhost ceph-mgr[286105]: mgr[py] Loading python module 'snap_schedule' Nov 28 04:56:45 localhost ceph-mgr[286105]: mgr[py] Loading python module 'stats' Nov 28 04:56:45 localhost nova_compute[279673]: 2025-11-28 09:56:45.764 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:56:45 localhost ceph-mgr[286105]: mgr[py] Loading python module 'status' Nov 28 04:56:45 localhost ceph-mgr[286105]: mgr[py] Module status has missing NOTIFY_TYPES member Nov 28 
04:56:45 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:45.837+0000 7f02af17f140 -1 mgr[py] Module status has missing NOTIFY_TYPES member Nov 28 04:56:45 localhost ceph-mgr[286105]: mgr[py] Loading python module 'telegraf' Nov 28 04:56:45 localhost ceph-mgr[286105]: mgr[py] Module telegraf has missing NOTIFY_TYPES member Nov 28 04:56:45 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:45.896+0000 7f02af17f140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member Nov 28 04:56:45 localhost ceph-mgr[286105]: mgr[py] Loading python module 'telemetry' Nov 28 04:56:45 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:45 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:45 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Nov 28 04:56:45 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Nov 28 04:56:45 localhost ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Nov 28 04:56:45 localhost ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Nov 28 04:56:45 localhost ceph-mon[292954]: Adjusting osd_memory_target on np0005538514.localdomain to 836.6M Nov 28 04:56:45 localhost ceph-mon[292954]: Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 28 04:56:45 localhost ceph-mon[292954]: from='mgr.34348 ' 
entity='mgr.np0005538514.djozup' Nov 28 04:56:45 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:45 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Nov 28 04:56:45 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Nov 28 04:56:45 localhost ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Nov 28 04:56:45 localhost ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Nov 28 04:56:45 localhost ceph-mon[292954]: Adjusting osd_memory_target on np0005538515.localdomain to 836.6M Nov 28 04:56:45 localhost ceph-mon[292954]: Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 28 04:56:45 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:45 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:45 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Nov 28 04:56:45 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Nov 28 04:56:45 localhost ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Nov 28 04:56:45 localhost ceph-mon[292954]: from='mgr.34348 
172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Nov 28 04:56:45 localhost ceph-mon[292954]: Adjusting osd_memory_target on np0005538513.localdomain to 836.6M Nov 28 04:56:45 localhost ceph-mon[292954]: Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Nov 28 04:56:45 localhost ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 04:56:45 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf Nov 28 04:56:45 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf Nov 28 04:56:45 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf Nov 28 04:56:46 localhost ceph-mgr[286105]: mgr[py] Module telemetry has missing NOTIFY_TYPES member Nov 28 04:56:46 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:46.027+0000 7f02af17f140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member Nov 28 04:56:46 localhost ceph-mgr[286105]: mgr[py] Loading python module 'test_orchestrator' Nov 28 04:56:46 localhost ceph-mgr[286105]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member Nov 28 04:56:46 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:46.172+0000 7f02af17f140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member Nov 28 04:56:46 localhost ceph-mgr[286105]: mgr[py] Loading python module 'volumes' Nov 28 04:56:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. 
Nov 28 04:56:46 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e38: np0005538514.djozup(active, since 4s), standbys: np0005538515.yfkzhl, np0005538512.zyhkxs Nov 28 04:56:46 localhost podman[304250]: 2025-11-28 09:56:46.340421906 +0000 UTC m=+0.099756522 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, 
io.buildah.version=1.41.3, managed_by=edpm_ansible) Nov 28 04:56:46 localhost ceph-mgr[286105]: mgr[py] Module volumes has missing NOTIFY_TYPES member Nov 28 04:56:46 localhost ceph-mgr[286105]: mgr[py] Loading python module 'zabbix' Nov 28 04:56:46 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:46.362+0000 7f02af17f140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member Nov 28 04:56:46 localhost podman[304250]: 2025-11-28 09:56:46.381431637 +0000 UTC m=+0.140766253 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 04:56:46 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. Nov 28 04:56:46 localhost ceph-mgr[286105]: mgr[py] Module zabbix has missing NOTIFY_TYPES member Nov 28 04:56:46 localhost ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-mgr-np0005538513-dsfdlx[286101]: 2025-11-28T09:56:46.420+0000 7f02af17f140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member Nov 28 04:56:46 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : Standby manager daemon np0005538513.dsfdlx started Nov 28 04:56:46 localhost ceph-mgr[286105]: ms_deliver_dispatch: unhandled message 0x55f77a9db1e0 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0 Nov 28 04:56:46 localhost ceph-mgr[286105]: client.0 ms_handle_reset on v2:172.18.0.107:6810/2760684413 Nov 28 04:56:47 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:56:47 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:56:47 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf Nov 28 04:56:47 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:56:47 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e39: np0005538514.djozup(active, since 5s), standbys: np0005538515.yfkzhl, np0005538512.zyhkxs, np0005538513.dsfdlx Nov 28 04:56:47 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, 
key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0) Nov 28 04:56:47 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0) Nov 28 04:56:47 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:47 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0) Nov 28 04:56:47 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:47 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0) Nov 28 04:56:47 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:48 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:48 localhost openstack_network_exporter[240658]: ERROR 09:56:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 04:56:48 localhost openstack_network_exporter[240658]: ERROR 09:56:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 04:56:48 localhost openstack_network_exporter[240658]: Nov 28 04:56:48 localhost openstack_network_exporter[240658]: ERROR 09:56:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 04:56:48 localhost openstack_network_exporter[240658]: Nov 28 04:56:48 localhost openstack_network_exporter[240658]: ERROR 09:56:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:56:48 localhost openstack_network_exporter[240658]: ERROR 09:56:48 
appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:56:48 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0) Nov 28 04:56:48 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:48 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0) Nov 28 04:56:48 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:48 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Nov 28 04:56:48 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:48 localhost ceph-mon[292954]: log_channel(cluster) log [WRN] : Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON) Nov 28 04:56:48 localhost ceph-mon[292954]: log_channel(cluster) log [WRN] : Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST) Nov 28 04:56:48 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:56:48 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.client.admin.keyring Nov 28 04:56:48 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:56:48 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:56:48 localhost ceph-mon[292954]: Updating 
np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring Nov 28 04:56:48 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:48 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:48 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:48 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:48 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:48 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:48 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:48 localhost nova_compute[279673]: 2025-11-28 09:56:48.920 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:56:49 localhost podman[304783]: Nov 28 04:56:49 localhost podman[304783]: 2025-11-28 09:56:49.144438986 +0000 UTC m=+0.079889740 container create fe8543c05f744f80dcc95afcaf11d9f147a5c688cba7af784897f2c01011d40a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_hofstadter, description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, release=553, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, maintainer=Guillaume Abrioux , build-date=2025-09-24T08:57:55, io.openshift.expose-services=, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., name=rhceph, CEPH_POINT_RELEASE=, 
GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git) Nov 28 04:56:49 localhost systemd[1]: Started libpod-conmon-fe8543c05f744f80dcc95afcaf11d9f147a5c688cba7af784897f2c01011d40a.scope. Nov 28 04:56:49 localhost systemd[1]: Started libcrun container. Nov 28 04:56:49 localhost podman[304783]: 2025-11-28 09:56:49.110579444 +0000 UTC m=+0.046030278 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:56:49 localhost podman[304783]: 2025-11-28 09:56:49.22059167 +0000 UTC m=+0.156042424 container init fe8543c05f744f80dcc95afcaf11d9f147a5c688cba7af784897f2c01011d40a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_hofstadter, version=7, architecture=x86_64, GIT_BRANCH=main, build-date=2025-09-24T08:57:55, release=553, io.openshift.tags=rhceph ceph, name=rhceph, ceph=True, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, vcs-type=git, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 28 04:56:49 localhost nifty_hofstadter[304798]: 167 167 
Nov 28 04:56:49 localhost podman[304783]: 2025-11-28 09:56:49.233624531 +0000 UTC m=+0.169075285 container start fe8543c05f744f80dcc95afcaf11d9f147a5c688cba7af784897f2c01011d40a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_hofstadter, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, distribution-scope=public, build-date=2025-09-24T08:57:55, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , vcs-type=git, version=7, io.openshift.expose-services=, GIT_CLEAN=True, ceph=True, release=553, name=rhceph, architecture=x86_64, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3) Nov 28 04:56:49 localhost systemd[1]: libpod-fe8543c05f744f80dcc95afcaf11d9f147a5c688cba7af784897f2c01011d40a.scope: Deactivated successfully. 
Nov 28 04:56:49 localhost podman[304783]: 2025-11-28 09:56:49.234960852 +0000 UTC m=+0.170411616 container attach fe8543c05f744f80dcc95afcaf11d9f147a5c688cba7af784897f2c01011d40a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_hofstadter, maintainer=Guillaume Abrioux , version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, RELEASE=main, release=553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, CEPH_POINT_RELEASE=, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_CLEAN=True, distribution-scope=public, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph) Nov 28 04:56:49 localhost podman[304783]: 2025-11-28 09:56:49.2381382 +0000 UTC m=+0.173588964 container died fe8543c05f744f80dcc95afcaf11d9f147a5c688cba7af784897f2c01011d40a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_hofstadter, version=7, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, architecture=x86_64, io.buildah.version=1.33.12, build-date=2025-09-24T08:57:55, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_BRANCH=main, distribution-scope=public, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., 
maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, ceph=True, release=553, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True) Nov 28 04:56:49 localhost podman[304803]: 2025-11-28 09:56:49.335700183 +0000 UTC m=+0.091135597 container remove fe8543c05f744f80dcc95afcaf11d9f147a5c688cba7af784897f2c01011d40a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_hofstadter, RELEASE=main, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, CEPH_POINT_RELEASE=, build-date=2025-09-24T08:57:55, description=Red Hat Ceph Storage 7, architecture=x86_64, release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, distribution-scope=public, GIT_BRANCH=main, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=) Nov 28 04:56:49 localhost systemd[1]: libpod-conmon-fe8543c05f744f80dcc95afcaf11d9f147a5c688cba7af784897f2c01011d40a.scope: Deactivated successfully. 
Nov 28 04:56:49 localhost ceph-mon[292954]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON) Nov 28 04:56:49 localhost ceph-mon[292954]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST) Nov 28 04:56:49 localhost ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Nov 28 04:56:49 localhost ceph-mon[292954]: Reconfiguring daemon osd.2 on np0005538513.localdomain Nov 28 04:56:49 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0) Nov 28 04:56:49 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:49 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0) Nov 28 04:56:49 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:56:49 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:49 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0) Nov 28 04:56:49 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:49 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0) Nov 28 04:56:49 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:49 
localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Nov 28 04:56:49 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:56:50 localhost systemd[1]: var-lib-containers-storage-overlay-bc457c5c37bce5163bcc1575fcb0fee9857a0ab1e47c6523bec72b4d0411e735-merged.mount: Deactivated successfully. Nov 28 04:56:50 localhost podman[304882]: Nov 28 04:56:50 localhost podman[304882]: 2025-11-28 09:56:50.332166266 +0000 UTC m=+0.076776575 container create 003b3b17193d134a03f3cfe786c7cc57962cb60d93c55e0970ebf853cb9fcd20 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_feistel, GIT_BRANCH=main, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, RELEASE=main, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.description=Red Hat Ceph Storage 7, version=7, com.redhat.component=rhceph-container, 
release=553, architecture=x86_64, name=rhceph) Nov 28 04:56:50 localhost systemd[1]: Started libpod-conmon-003b3b17193d134a03f3cfe786c7cc57962cb60d93c55e0970ebf853cb9fcd20.scope. Nov 28 04:56:50 localhost systemd[1]: Started libcrun container. Nov 28 04:56:50 localhost podman[304882]: 2025-11-28 09:56:50.302067409 +0000 UTC m=+0.046677748 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:56:50 localhost podman[304882]: 2025-11-28 09:56:50.407066771 +0000 UTC m=+0.151677080 container init 003b3b17193d134a03f3cfe786c7cc57962cb60d93c55e0970ebf853cb9fcd20 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_feistel, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, version=7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, RELEASE=main, name=rhceph, release=553, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, ceph=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.33.12, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph) Nov 28 04:56:50 localhost podman[304882]: 2025-11-28 09:56:50.417897254 +0000 UTC m=+0.162507573 container start 003b3b17193d134a03f3cfe786c7cc57962cb60d93c55e0970ebf853cb9fcd20 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_feistel, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, 
architecture=x86_64, name=rhceph, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, vcs-type=git, release=553, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , io.buildah.version=1.33.12, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, version=7, ceph=True, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=) Nov 28 04:56:50 localhost podman[304882]: 2025-11-28 09:56:50.418273525 +0000 UTC m=+0.162883874 container attach 003b3b17193d134a03f3cfe786c7cc57962cb60d93c55e0970ebf853cb9fcd20 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_feistel, distribution-scope=public, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, ceph=True, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_CLEAN=True, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, GIT_BRANCH=main, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, name=rhceph, description=Red Hat Ceph Storage 7, architecture=x86_64) Nov 28 04:56:50 localhost silly_feistel[304897]: 167 167 Nov 28 04:56:50 localhost systemd[1]: libpod-003b3b17193d134a03f3cfe786c7cc57962cb60d93c55e0970ebf853cb9fcd20.scope: Deactivated successfully. Nov 28 04:56:50 localhost podman[304882]: 2025-11-28 09:56:50.423865468 +0000 UTC m=+0.168475807 container died 003b3b17193d134a03f3cfe786c7cc57962cb60d93c55e0970ebf853cb9fcd20 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_feistel, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, maintainer=Guillaume Abrioux , GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, vcs-type=git, release=553, RELEASE=main, distribution-scope=public, version=7, io.openshift.tags=rhceph ceph, ceph=True, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d) Nov 28 04:56:50 localhost podman[304902]: 2025-11-28 09:56:50.534887165 +0000 UTC m=+0.093601301 container remove 003b3b17193d134a03f3cfe786c7cc57962cb60d93c55e0970ebf853cb9fcd20 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_feistel, 
io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.expose-services=, ceph=True, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/agreements, vendor=Red Hat, Inc., RELEASE=main, GIT_CLEAN=True, build-date=2025-09-24T08:57:55, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.buildah.version=1.33.12, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=553, distribution-scope=public) Nov 28 04:56:50 localhost systemd[1]: libpod-conmon-003b3b17193d134a03f3cfe786c7cc57962cb60d93c55e0970ebf853cb9fcd20.scope: Deactivated successfully. Nov 28 04:56:50 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:50 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:50 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:50 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:50 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:56:50 localhost ceph-mon[292954]: Reconfiguring mds.mds.np0005538513.yljthc (monmap changed)... 
Nov 28 04:56:50 localhost ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538513.yljthc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:56:50 localhost ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538513.yljthc on np0005538513.localdomain Nov 28 04:56:50 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0) Nov 28 04:56:50 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:50 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0) Nov 28 04:56:50 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:50 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Nov 28 04:56:50 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:56:50 localhost nova_compute[279673]: 2025-11-28 09:56:50.767 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:56:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:56:50.837 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by 
"neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:56:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:56:50.837 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:56:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:56:50.839 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:56:51 localhost systemd[1]: var-lib-containers-storage-overlay-657524b94084caf0ee278ccd4f4a33e6b5969e8fa6c5b6eaf213bf0c33e1a2ec-merged.mount: Deactivated successfully. 
Nov 28 04:56:51 localhost podman[304971]: Nov 28 04:56:51 localhost podman[304971]: 2025-11-28 09:56:51.395748403 +0000 UTC m=+0.079945181 container create 64c640f92507e5ccc55f6d3fc5d30c8237ff1563ed4914e674b1d81581ce7fde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_mcnulty, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, GIT_CLEAN=True, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, architecture=x86_64, RELEASE=main, maintainer=Guillaume Abrioux , GIT_BRANCH=main, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., release=553, version=7, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55) Nov 28 04:56:51 localhost systemd[1]: Started libpod-conmon-64c640f92507e5ccc55f6d3fc5d30c8237ff1563ed4914e674b1d81581ce7fde.scope. Nov 28 04:56:51 localhost podman[304971]: 2025-11-28 09:56:51.361156589 +0000 UTC m=+0.045353367 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:56:51 localhost systemd[1]: Started libcrun container. 
Nov 28 04:56:51 localhost podman[304971]: 2025-11-28 09:56:51.475623892 +0000 UTC m=+0.159820670 container init 64c640f92507e5ccc55f6d3fc5d30c8237ff1563ed4914e674b1d81581ce7fde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_mcnulty, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, release=553, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, architecture=x86_64, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, build-date=2025-09-24T08:57:55, GIT_BRANCH=main, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, version=7) Nov 28 04:56:51 localhost podman[304971]: 2025-11-28 09:56:51.4875861 +0000 UTC m=+0.171782868 container start 64c640f92507e5ccc55f6d3fc5d30c8237ff1563ed4914e674b1d81581ce7fde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_mcnulty, io.buildah.version=1.33.12, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, ceph=True, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, maintainer=Guillaume Abrioux , release=553, 
version=7, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, vcs-type=git, description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/agreements, architecture=x86_64, RELEASE=main, GIT_BRANCH=main) Nov 28 04:56:51 localhost podman[304971]: 2025-11-28 09:56:51.487955012 +0000 UTC m=+0.172151790 container attach 64c640f92507e5ccc55f6d3fc5d30c8237ff1563ed4914e674b1d81581ce7fde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_mcnulty, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, release=553, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, ceph=True, distribution-scope=public, version=7, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vcs-type=git, RELEASE=main, maintainer=Guillaume Abrioux , io.openshift.expose-services=) Nov 28 04:56:51 localhost crazy_mcnulty[304986]: 167 167 Nov 28 04:56:51 localhost systemd[1]: libpod-64c640f92507e5ccc55f6d3fc5d30c8237ff1563ed4914e674b1d81581ce7fde.scope: Deactivated 
successfully. Nov 28 04:56:51 localhost podman[304971]: 2025-11-28 09:56:51.490447218 +0000 UTC m=+0.174643996 container died 64c640f92507e5ccc55f6d3fc5d30c8237ff1563ed4914e674b1d81581ce7fde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_mcnulty, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_CLEAN=True, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, distribution-scope=public, architecture=x86_64, ceph=True, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., release=553, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.openshift.expose-services=, io.buildah.version=1.33.12, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, RELEASE=main) Nov 28 04:56:51 localhost podman[304991]: 2025-11-28 09:56:51.593884312 +0000 UTC m=+0.088295209 container remove 64c640f92507e5ccc55f6d3fc5d30c8237ff1563ed4914e674b1d81581ce7fde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_mcnulty, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, build-date=2025-09-24T08:57:55, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, description=Red Hat Ceph Storage 7, 
com.redhat.component=rhceph-container, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.buildah.version=1.33.12, vcs-type=git, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.tags=rhceph ceph, release=553, name=rhceph, maintainer=Guillaume Abrioux ) Nov 28 04:56:51 localhost systemd[1]: libpod-conmon-64c640f92507e5ccc55f6d3fc5d30c8237ff1563ed4914e674b1d81581ce7fde.scope: Deactivated successfully. Nov 28 04:56:51 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Nov 28 04:56:51 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:51 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:51 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:51 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:56:51 localhost ceph-mon[292954]: Reconfiguring mgr.np0005538513.dsfdlx (monmap changed)... 
Nov 28 04:56:51 localhost ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538513.dsfdlx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:56:51 localhost ceph-mon[292954]: Reconfiguring daemon mgr.np0005538513.dsfdlx on np0005538513.localdomain Nov 28 04:56:51 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:51 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0) Nov 28 04:56:51 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:51 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0) Nov 28 04:56:51 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:51 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Nov 28 04:56:51 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:56:52 localhost systemd[1]: tmp-crun.rzV7al.mount: Deactivated successfully. Nov 28 04:56:52 localhost systemd[1]: var-lib-containers-storage-overlay-cf95f5dd5d66a2419260b23995ada69ce7f9cb04138bc8241830f25a814c7432-merged.mount: Deactivated successfully. 
Nov 28 04:56:52 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0) Nov 28 04:56:52 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:52 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0) Nov 28 04:56:52 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:52 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:52 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:52 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:56:52 localhost ceph-mon[292954]: Reconfiguring crash.np0005538514 (monmap changed)... 
Nov 28 04:56:52 localhost ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538514.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:56:52 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538514 on np0005538514.localdomain Nov 28 04:56:52 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:52 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:52 localhost ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Nov 28 04:56:53 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0) Nov 28 04:56:53 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:53 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0) Nov 28 04:56:53 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:53 localhost ceph-mon[292954]: Reconfiguring osd.0 (monmap changed)... 
Nov 28 04:56:53 localhost ceph-mon[292954]: Reconfiguring daemon osd.0 on np0005538514.localdomain Nov 28 04:56:53 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:53 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0) Nov 28 04:56:53 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:53 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0) Nov 28 04:56:53 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:53 localhost nova_compute[279673]: 2025-11-28 09:56:53.951 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:56:54 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:56:54 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0) Nov 28 04:56:54 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:54 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0) Nov 28 04:56:54 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:54 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, 
key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0) Nov 28 04:56:54 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0. Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.760382) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31 Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323814760423, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 2552, "num_deletes": 255, "total_data_size": 5484296, "memory_usage": 5657152, "flush_reason": "Manual Compaction"} Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started Nov 28 04:56:54 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0) Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323814786295, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 4558535, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18020, "largest_seqno": 20567, "table_properties": {"data_size": 4547473, "index_size": 6799, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3205, "raw_key_size": 28856, "raw_average_key_size": 22, "raw_value_size": 4523024, 
"raw_average_value_size": 3536, "num_data_blocks": 297, "num_entries": 1279, "num_filter_entries": 1279, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323751, "oldest_key_time": 1764323751, "file_creation_time": 1764323814, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}} Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 25977 microseconds, and 11197 cpu microseconds. Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 28 04:56:54 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:54 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:54 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:54 localhost ceph-mon[292954]: Reconfiguring osd.3 (monmap changed)... 
Nov 28 04:56:54 localhost ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Nov 28 04:56:54 localhost ceph-mon[292954]: Reconfiguring daemon osd.3 on np0005538514.localdomain Nov 28 04:56:54 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:54 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:54 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.786353) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 4558535 bytes OK Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.786381) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.788426) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.788451) EVENT_LOG_v1 {"time_micros": 1764323814788443, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.788478) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 5472261, prev total WAL file size 5482291, number of live WAL files 2. 
Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.789566) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131303434' seq:72057594037927935, type:22 .. '7061786F73003131323936' seq:0, type:0; will stop at (end) Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(4451KB)], [30(15MB)] Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323814789629, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 21166090, "oldest_snapshot_seqno": -1} Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 11638 keys, 18010737 bytes, temperature: kUnknown Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323814913211, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 18010737, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17943903, "index_size": 36653, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29125, "raw_key_size": 312230, "raw_average_key_size": 26, "raw_value_size": 17745227, "raw_average_value_size": 1524, 
"num_data_blocks": 1394, "num_entries": 11638, "num_filter_entries": 11638, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764323814, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}} Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.913928) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 18010737 bytes Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.916839) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 171.1 rd, 145.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.3, 15.8 +0.0 blob) out(17.2 +0.0 blob), read-write-amplify(8.6) write-amplify(4.0) OK, records in: 12186, records dropped: 548 output_compression: NoCompression Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.916875) EVENT_LOG_v1 {"time_micros": 1764323814916858, "job": 16, "event": "compaction_finished", "compaction_time_micros": 123703, "compaction_time_cpu_micros": 44215, "output_level": 6, "num_output_files": 1, "total_output_size": 18010737, "num_input_records": 12186, "num_output_records": 11638, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.789484) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.917444) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.917450) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.917453) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting 
Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.917457) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:56:54.917460) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323814918444, "job": 0, "event": "table_file_deletion", "file_number": 32} Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 04:56:54 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323814920999, "job": 0, "event": "table_file_deletion", "file_number": 30} Nov 28 04:56:54 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:54 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Nov 28 04:56:54 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:56:55 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 
0) Nov 28 04:56:55 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 04:56:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 04:56:55 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0) Nov 28 04:56:55 localhost nova_compute[279673]: 2025-11-28 09:56:55.799 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:56:55 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:55 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0) Nov 28 04:56:55 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:55 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Nov 28 04:56:55 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:56:55 localhost podman[305008]: 2025-11-28 09:56:55.860067829 +0000 UTC m=+0.090071923 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 04:56:55 localhost podman[305009]: 2025-11-28 09:56:55.911137991 +0000 UTC m=+0.139254408 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 
'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd) Nov 28 04:56:55 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:55 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:56:55 localhost ceph-mon[292954]: Reconfiguring mds.mds.np0005538514.umgtoy (monmap changed)... 
Nov 28 04:56:55 localhost ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538514.umgtoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:56:55 localhost ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538514.umgtoy on np0005538514.localdomain Nov 28 04:56:55 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:55 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:55 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:55 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:56:55 localhost ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005538514.djozup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Nov 28 04:56:55 localhost podman[305008]: 2025-11-28 09:56:55.92800498 +0000 UTC m=+0.158009064 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', 
'--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 04:56:55 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. Nov 28 04:56:55 localhost podman[305009]: 2025-11-28 09:56:55.981499067 +0000 UTC m=+0.209615514 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Nov 28 04:56:55 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. Nov 28 04:56:56 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0) Nov 28 04:56:56 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:56 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0) Nov 28 04:56:56 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:56 localhost ceph-mon[292954]: Saving service mon spec with placement label:mon Nov 28 04:56:56 localhost ceph-mon[292954]: Reconfiguring mgr.np0005538514.djozup (monmap changed)... 
Nov 28 04:56:56 localhost ceph-mon[292954]: Reconfiguring daemon mgr.np0005538514.djozup on np0005538514.localdomain Nov 28 04:56:56 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:56 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:56 localhost ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 28 04:56:57 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0) Nov 28 04:56:57 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:57 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0) Nov 28 04:56:57 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:57 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Nov 28 04:56:57 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:56:58 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0) Nov 28 04:56:58 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:58 localhost ceph-mon[292954]: 
mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0) Nov 28 04:56:58 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:58 localhost ceph-mon[292954]: Reconfiguring mon.np0005538514 (monmap changed)... Nov 28 04:56:58 localhost ceph-mon[292954]: Reconfiguring daemon mon.np0005538514 on np0005538514.localdomain Nov 28 04:56:58 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:58 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:58 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:56:58 localhost ceph-mon[292954]: Reconfiguring crash.np0005538515 (monmap changed)... 
Nov 28 04:56:58 localhost ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005538515.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Nov 28 04:56:58 localhost ceph-mon[292954]: Reconfiguring daemon crash.np0005538515 on np0005538515.localdomain Nov 28 04:56:58 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:58 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:58 localhost ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Nov 28 04:56:58 localhost nova_compute[279673]: 2025-11-28 09:56:58.957 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:56:59 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0) Nov 28 04:56:59 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:59 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0) Nov 28 04:56:59 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:59 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0) Nov 28 04:56:59 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:59 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command 
mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0) Nov 28 04:56:59 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:59 localhost ceph-mon[292954]: Reconfiguring osd.1 (monmap changed)... Nov 28 04:56:59 localhost ceph-mon[292954]: Reconfiguring daemon osd.1 on np0005538515.localdomain Nov 28 04:56:59 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:59 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:59 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:56:59 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:57:00 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0) Nov 28 04:57:00 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:57:00 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0) Nov 28 04:57:00 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:57:00 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:57:00 localhost ceph-mon[292954]: Reconfiguring osd.4 (monmap changed)... 
Nov 28 04:57:00 localhost ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Nov 28 04:57:00 localhost ceph-mon[292954]: Reconfiguring daemon osd.4 on np0005538515.localdomain Nov 28 04:57:00 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:57:00 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0) Nov 28 04:57:00 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:57:00 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0) Nov 28 04:57:00 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:57:00 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Nov 28 04:57:00 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:57:00 localhost nova_compute[279673]: 2025-11-28 09:57:00.802 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:57:01 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0) Nov 28 04:57:01 localhost ceph-mon[292954]: 
log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:57:01 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0) Nov 28 04:57:01 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:57:01 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:57:01 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:57:01 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:57:01 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:57:01 localhost ceph-mon[292954]: Reconfiguring mds.mds.np0005538515.anvatb (monmap changed)... 
Nov 28 04:57:01 localhost ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005538515.anvatb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Nov 28 04:57:01 localhost ceph-mon[292954]: Reconfiguring daemon mds.mds.np0005538515.anvatb on np0005538515.localdomain Nov 28 04:57:01 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:57:01 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:57:01 localhost ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 28 04:57:02 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0) Nov 28 04:57:02 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:57:02 localhost ceph-mon[292954]: Reconfiguring mon.np0005538515 (monmap changed)... 
Nov 28 04:57:02 localhost ceph-mon[292954]: Reconfiguring daemon mon.np0005538515 on np0005538515.localdomain Nov 28 04:57:02 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0) Nov 28 04:57:02 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:57:02 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Nov 28 04:57:02 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:57:02 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) Nov 28 04:57:02 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:57:03 localhost podman[305119]: Nov 28 04:57:03 localhost podman[305119]: 2025-11-28 09:57:03.747878924 +0000 UTC m=+0.084695379 container create a5e68cb7d08a88f6466fa0fd0ee2a1271f7ab08134e244cffd9055fdf03913ce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_rhodes, GIT_CLEAN=True, com.redhat.component=rhceph-container, vcs-type=git, description=Red Hat Ceph Storage 7, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, architecture=x86_64, CEPH_POINT_RELEASE=, release=553, maintainer=Guillaume Abrioux , url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, distribution-scope=public, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, 
vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, GIT_BRANCH=main, version=7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc.) Nov 28 04:57:03 localhost systemd[1]: Started libpod-conmon-a5e68cb7d08a88f6466fa0fd0ee2a1271f7ab08134e244cffd9055fdf03913ce.scope. Nov 28 04:57:03 localhost systemd[1]: Started libcrun container. Nov 28 04:57:03 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:57:03 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:57:03 localhost ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 04:57:03 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:57:03 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:57:03 localhost ceph-mon[292954]: Reconfiguring mon.np0005538513 (monmap changed)... 
Nov 28 04:57:03 localhost ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Nov 28 04:57:03 localhost ceph-mon[292954]: Reconfiguring daemon mon.np0005538513 on np0005538513.localdomain Nov 28 04:57:03 localhost podman[305119]: 2025-11-28 09:57:03.711431312 +0000 UTC m=+0.048247837 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Nov 28 04:57:03 localhost podman[305119]: 2025-11-28 09:57:03.81763953 +0000 UTC m=+0.154455995 container init a5e68cb7d08a88f6466fa0fd0ee2a1271f7ab08134e244cffd9055fdf03913ce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_rhodes, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/agreements, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vcs-type=git, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.33.12, com.redhat.component=rhceph-container, architecture=x86_64, release=553, vendor=Red Hat, Inc., GIT_BRANCH=main, distribution-scope=public, build-date=2025-09-24T08:57:55, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main) Nov 28 04:57:03 localhost podman[305119]: 2025-11-28 09:57:03.82928525 +0000 UTC m=+0.166101715 container start a5e68cb7d08a88f6466fa0fd0ee2a1271f7ab08134e244cffd9055fdf03913ce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_rhodes, release=553, maintainer=Guillaume 
Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, io.buildah.version=1.33.12, io.openshift.expose-services=, ceph=True, version=7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_BRANCH=main, name=rhceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-09-24T08:57:55) Nov 28 04:57:03 localhost podman[305119]: 2025-11-28 09:57:03.829528917 +0000 UTC m=+0.166345412 container attach a5e68cb7d08a88f6466fa0fd0ee2a1271f7ab08134e244cffd9055fdf03913ce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_rhodes, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, build-date=2025-09-24T08:57:55, GIT_CLEAN=True, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., ceph=True, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, version=7, architecture=x86_64, release=553, maintainer=Guillaume Abrioux , name=rhceph, GIT_BRANCH=main, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, 
io.buildah.version=1.33.12, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/agreements, vcs-type=git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 28 04:57:03 localhost fervent_rhodes[305134]: 167 167 Nov 28 04:57:03 localhost systemd[1]: libpod-a5e68cb7d08a88f6466fa0fd0ee2a1271f7ab08134e244cffd9055fdf03913ce.scope: Deactivated successfully. Nov 28 04:57:03 localhost podman[305119]: 2025-11-28 09:57:03.835217432 +0000 UTC m=+0.172033897 container died a5e68cb7d08a88f6466fa0fd0ee2a1271f7ab08134e244cffd9055fdf03913ce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_rhodes, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, release=553, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.buildah.version=1.33.12, vcs-type=git, distribution-scope=public, build-date=2025-09-24T08:57:55, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, architecture=x86_64, maintainer=Guillaume Abrioux , name=rhceph, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/agreements, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git) Nov 28 04:57:03 localhost podman[305139]: 2025-11-28 09:57:03.937646594 +0000 UTC m=+0.091815047 container remove a5e68cb7d08a88f6466fa0fd0ee2a1271f7ab08134e244cffd9055fdf03913ce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_rhodes, 
com.redhat.component=rhceph-container, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, ceph=True, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/agreements, io.openshift.expose-services=, GIT_BRANCH=main, io.buildah.version=1.33.12, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, maintainer=Guillaume Abrioux , architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, distribution-scope=public, build-date=2025-09-24T08:57:55, name=rhceph, release=553, version=7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=) Nov 28 04:57:03 localhost systemd[1]: libpod-conmon-a5e68cb7d08a88f6466fa0fd0ee2a1271f7ab08134e244cffd9055fdf03913ce.scope: Deactivated successfully. 
Nov 28 04:57:03 localhost nova_compute[279673]: 2025-11-28 09:57:03.979 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:57:04 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0) Nov 28 04:57:04 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:57:04 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0) Nov 28 04:57:04 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:57:04 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:57:04 localhost systemd[1]: var-lib-containers-storage-overlay-6016f8d5ea77cfedcee980831b360ae116ffed49e619f4967787bc77a95b3280-merged.mount: Deactivated successfully. 
Nov 28 04:57:04 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e40: np0005538514.djozup(active, since 23s), standbys: np0005538515.yfkzhl, np0005538513.dsfdlx Nov 28 04:57:05 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:57:05 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:57:05 localhost nova_compute[279673]: 2025-11-28 09:57:05.828 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:57:06 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Nov 28 04:57:06 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:57:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 04:57:06 localhost systemd[1]: tmp-crun.auq7aN.mount: Deactivated successfully. 
Nov 28 04:57:06 localhost podman[305156]: 2025-11-28 09:57:06.87275429 +0000 UTC m=+0.102129174 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, name=ubi9-minimal, architecture=x86_64, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., version=9.6, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9) Nov 28 04:57:06 localhost podman[305156]: 2025-11-28 09:57:06.917473057 +0000 UTC m=+0.146847931 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, vcs-type=git, release=1755695350) Nov 28 04:57:06 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 04:57:07 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:57:08 localhost nova_compute[279673]: 2025-11-28 09:57:08.982 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:57:09 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:57:10 localhost podman[238687]: time="2025-11-28T09:57:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 04:57:10 localhost podman[238687]: @ - - [28/Nov/2025:09:57:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 28 04:57:10 localhost podman[238687]: @ - - [28/Nov/2025:09:57:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18736 "" "Go-http-client/1.1" Nov 28 04:57:10 localhost nova_compute[279673]: 2025-11-28 09:57:10.832 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:57:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. 
Nov 28 04:57:11 localhost podman[305176]: 2025-11-28 09:57:11.847651872 +0000 UTC m=+0.083415859 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 28 04:57:11 localhost podman[305176]: 2025-11-28 09:57:11.857415712 +0000 UTC m=+0.093179739 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 04:57:11 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 04:57:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:57:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:57:13 localhost podman[305197]: 2025-11-28 09:57:13.847256622 +0000 UTC m=+0.087297749 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 04:57:13 localhost podman[305198]: 2025-11-28 09:57:13.911069865 +0000 UTC m=+0.147730597 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 
04:57:13 localhost podman[305197]: 2025-11-28 09:57:13.920457735 +0000 UTC m=+0.160498802 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 04:57:13 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 04:57:13 localhost podman[305198]: 2025-11-28 09:57:13.943529655 +0000 UTC m=+0.180190407 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_managed=true) Nov 28 04:57:13 localhost systemd[1]: 
ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:57:13 localhost nova_compute[279673]: 2025-11-28 09:57:13.984 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:57:14 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:57:15 localhost nova_compute[279673]: 2025-11-28 09:57:15.869 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:57:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 04:57:16 localhost podman[305239]: 2025-11-28 09:57:16.845340416 +0000 UTC m=+0.082474680 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2) Nov 28 04:57:16 localhost podman[305239]: 2025-11-28 09:57:16.859494871 +0000 UTC m=+0.096629145 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 04:57:16 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. Nov 28 04:57:18 localhost openstack_network_exporter[240658]: ERROR 09:57:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:57:18 localhost openstack_network_exporter[240658]: ERROR 09:57:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 04:57:18 localhost openstack_network_exporter[240658]: Nov 28 04:57:18 localhost openstack_network_exporter[240658]: ERROR 09:57:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:57:18 localhost openstack_network_exporter[240658]: ERROR 09:57:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 04:57:18 localhost openstack_network_exporter[240658]: ERROR 09:57:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 04:57:18 localhost openstack_network_exporter[240658]: Nov 28 04:57:18 localhost nova_compute[279673]: 2025-11-28 09:57:18.987 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:57:19 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 
_set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:57:20 localhost nova_compute[279673]: 2025-11-28 09:57:20.872 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:57:24 localhost nova_compute[279673]: 2025-11-28 09:57:24.018 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:57:24 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:57:25 localhost nova_compute[279673]: 2025-11-28 09:57:25.906 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:57:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 04:57:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 04:57:26 localhost systemd[299031]: Starting Mark boot as successful... 
Nov 28 04:57:26 localhost podman[305259]: 2025-11-28 09:57:26.85750472 +0000 UTC m=+0.086215785 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 04:57:26 localhost systemd[299031]: Finished Mark boot as successful. 
Nov 28 04:57:26 localhost podman[305259]: 2025-11-28 09:57:26.888586706 +0000 UTC m=+0.117297731 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 04:57:26 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 04:57:26 localhost podman[305260]: 2025-11-28 09:57:26.935169671 +0000 UTC m=+0.160358118 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd) Nov 28 04:57:26 localhost podman[305260]: 2025-11-28 09:57:26.948402027 +0000 UTC m=+0.173590514 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Nov 28 04:57:26 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 04:57:29 localhost nova_compute[279673]: 2025-11-28 09:57:29.021 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:57:29 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:57:30 localhost nova_compute[279673]: 2025-11-28 09:57:30.908 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:57:33 localhost nova_compute[279673]: 2025-11-28 09:57:33.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:57:33 localhost nova_compute[279673]: 2025-11-28 09:57:33.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:57:34 localhost nova_compute[279673]: 2025-11-28 09:57:34.067 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:57:34 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:57:34 localhost nova_compute[279673]: 2025-11-28 09:57:34.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:57:34 
localhost nova_compute[279673]: 2025-11-28 09:57:34.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:57:34 localhost nova_compute[279673]: 2025-11-28 09:57:34.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 28 04:57:35 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "status", "format": "json"} v 0) Nov 28 04:57:35 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/1459346702' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch Nov 28 04:57:35 localhost nova_compute[279673]: 2025-11-28 09:57:35.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:57:35 localhost nova_compute[279673]: 2025-11-28 09:57:35.772 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:57:35 localhost nova_compute[279673]: 2025-11-28 09:57:35.937 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:57:36 localhost nova_compute[279673]: 2025-11-28 09:57:36.766 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] 
Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:57:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 04:57:37 localhost nova_compute[279673]: 2025-11-28 09:57:37.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:57:37 localhost nova_compute[279673]: 2025-11-28 09:57:37.796 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:57:37 localhost nova_compute[279673]: 2025-11-28 09:57:37.796 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:57:37 localhost nova_compute[279673]: 2025-11-28 09:57:37.796 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:57:37 localhost nova_compute[279673]: 2025-11-28 09:57:37.797 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for 
np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 04:57:37 localhost nova_compute[279673]: 2025-11-28 09:57:37.797 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:57:37 localhost podman[305300]: 2025-11-28 09:57:37.8557683 +0000 UTC m=+0.091803677 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, version=9.6, description=The Universal Base Image Minimal is a stripped down 
image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 04:57:37 localhost podman[305300]: 2025-11-28 09:57:37.873491445 +0000 UTC m=+0.109526802 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, name=ubi9-minimal, release=1755695350, distribution-scope=public, io.buildah.version=1.33.7, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, version=9.6, io.openshift.tags=minimal rhel9) Nov 28 04:57:37 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: 
Deactivated successfully. Nov 28 04:57:38 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 04:57:38 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3192274012' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 04:57:38 localhost nova_compute[279673]: 2025-11-28 09:57:38.253 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:57:38 localhost nova_compute[279673]: 2025-11-28 09:57:38.316 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:57:38 localhost nova_compute[279673]: 2025-11-28 09:57:38.317 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:57:38 localhost nova_compute[279673]: 2025-11-28 09:57:38.504 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 04:57:38 localhost nova_compute[279673]: 2025-11-28 09:57:38.506 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11760MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": 
"7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 04:57:38 localhost nova_compute[279673]: 2025-11-28 09:57:38.507 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:57:38 localhost nova_compute[279673]: 2025-11-28 09:57:38.507 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:57:38 localhost nova_compute[279673]: 2025-11-28 09:57:38.623 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 04:57:38 localhost nova_compute[279673]: 2025-11-28 09:57:38.624 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 04:57:38 localhost nova_compute[279673]: 2025-11-28 09:57:38.624 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 04:57:38 localhost nova_compute[279673]: 2025-11-28 09:57:38.662 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:57:39 localhost nova_compute[279673]: 2025-11-28 09:57:39.071 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:57:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 04:57:39 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/510121500' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 04:57:39 localhost nova_compute[279673]: 2025-11-28 09:57:39.115 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:57:39 localhost nova_compute[279673]: 2025-11-28 09:57:39.121 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 04:57:39 localhost nova_compute[279673]: 2025-11-28 09:57:39.142 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 04:57:39 localhost nova_compute[279673]: 2025-11-28 09:57:39.145 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 04:57:39 localhost nova_compute[279673]: 2025-11-28 09:57:39.146 279685 DEBUG 
oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.638s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:57:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:57:40 localhost podman[238687]: time="2025-11-28T09:57:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 04:57:40 localhost podman[238687]: @ - - [28/Nov/2025:09:57:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 28 04:57:40 localhost podman[238687]: @ - - [28/Nov/2025:09:57:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18727 "" "Go-http-client/1.1" Nov 28 04:57:40 localhost nova_compute[279673]: 2025-11-28 09:57:40.939 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:57:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 04:57:42 localhost systemd[1]: tmp-crun.wTrBOp.mount: Deactivated successfully. 
Nov 28 04:57:42 localhost podman[305363]: 2025-11-28 09:57:42.85426823 +0000 UTC m=+0.091733365 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 28 04:57:42 localhost podman[305363]: 2025-11-28 09:57:42.887762651 +0000 UTC m=+0.125227786 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 04:57:42 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 04:57:44 localhost nova_compute[279673]: 2025-11-28 09:57:44.103 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:57:44 localhost nova_compute[279673]: 2025-11-28 09:57:44.146 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:57:44 localhost nova_compute[279673]: 2025-11-28 09:57:44.146 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 28 04:57:44 localhost nova_compute[279673]: 2025-11-28 09:57:44.146 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 28 04:57:44 localhost nova_compute[279673]: 2025-11-28 09:57:44.532 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 04:57:44 localhost nova_compute[279673]: 2025-11-28 09:57:44.533 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 04:57:44 localhost nova_compute[279673]: 2025-11-28 09:57:44.533 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 28 04:57:44 localhost nova_compute[279673]: 2025-11-28 09:57:44.534 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 04:57:44 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:57:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:57:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. 
Nov 28 04:57:44 localhost podman[305386]: 2025-11-28 09:57:44.855304304 +0000 UTC m=+0.091153836 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0) Nov 28 04:57:44 localhost podman[305387]: 2025-11-28 09:57:44.901663122 +0000 UTC m=+0.133912643 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 
'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 28 04:57:44 localhost podman[305387]: 2025-11-28 09:57:44.910355099 +0000 UTC m=+0.142604630 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Nov 28 04:57:44 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 04:57:44 localhost podman[305386]: 2025-11-28 09:57:44.968814169 +0000 UTC m=+0.204663671 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125) Nov 28 04:57:44 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 04:57:45 localhost nova_compute[279673]: 2025-11-28 09:57:45.011 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 04:57:45 localhost nova_compute[279673]: 2025-11-28 09:57:45.030 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 04:57:45 localhost nova_compute[279673]: 2025-11-28 09:57:45.030 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] 
Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 28 04:57:45 localhost nova_compute[279673]: 2025-11-28 09:57:45.969 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:57:47 localhost nova_compute[279673]: 2025-11-28 09:57:47.651 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:57:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 04:57:47 localhost systemd[1]: tmp-crun.aO0sQn.mount: Deactivated successfully. Nov 28 04:57:47 localhost podman[305429]: 2025-11-28 09:57:47.832371482 +0000 UTC m=+0.072774591 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 
'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:57:47 localhost podman[305429]: 2025-11-28 09:57:47.846505917 +0000 UTC m=+0.086909056 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', 
'/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:57:47 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. Nov 28 04:57:48 localhost openstack_network_exporter[240658]: ERROR 09:57:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:57:48 localhost openstack_network_exporter[240658]: ERROR 09:57:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:57:48 localhost openstack_network_exporter[240658]: ERROR 09:57:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 04:57:48 localhost openstack_network_exporter[240658]: ERROR 09:57:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 04:57:48 localhost openstack_network_exporter[240658]: Nov 28 04:57:48 localhost openstack_network_exporter[240658]: ERROR 09:57:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 04:57:48 localhost openstack_network_exporter[240658]: Nov 28 04:57:49 localhost nova_compute[279673]: 2025-11-28 09:57:49.109 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:57:49 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config dump", "format": "json"} v 0) Nov 28 04:57:49 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/3756897191' entity='client.admin' cmd={"prefix": "config dump", "format": "json"} : dispatch Nov 28 04:57:49 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:57:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:57:50.837 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:57:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:57:50.838 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:57:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:57:50.839 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:57:50 localhost nova_compute[279673]: 2025-11-28 09:57:50.972 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:57:54 localhost nova_compute[279673]: 2025-11-28 09:57:54.135 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:57:54 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:57:56 localhost nova_compute[279673]: 2025-11-28 09:57:56.000 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:57:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 04:57:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 04:57:57 localhost systemd[1]: tmp-crun.uyCfxD.mount: Deactivated successfully. Nov 28 04:57:57 localhost podman[305448]: 2025-11-28 09:57:57.857804076 +0000 UTC m=+0.091414154 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 
'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 04:57:57 localhost podman[305448]: 2025-11-28 09:57:57.867475363 +0000 UTC m=+0.101085481 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 04:57:57 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated 
successfully. Nov 28 04:57:57 localhost podman[305449]: 2025-11-28 09:57:57.921740117 +0000 UTC m=+0.148620448 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3) Nov 28 04:57:57 localhost podman[305449]: 2025-11-28 09:57:57.958159064 +0000 UTC m=+0.185039395 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true) Nov 28 04:57:57 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 04:57:59 localhost nova_compute[279673]: 2025-11-28 09:57:59.139 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:57:59 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.673 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.673 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.678 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '381850ba-2556-441b-8d90-c6bad9de787f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:58:00.674225', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'b9454e36-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.846130737, 'message_signature': '8c82c25bcb1b43d262155e005990eaa516cc04ea53d532cd5e12ffba163c1d26'}]}, 'timestamp': '2025-11-28 09:58:00.678848', '_unique_id': '7a9a89928aee4af79361467b6d5a31c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:58:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:58:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.680 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.681 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.681 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.693 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.693 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '717bb117-4ac9-407a-8f3a-2b0cec487e70', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:58:00.681996', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b9478e6c-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.853922026, 'message_signature': '5c004f5a98c075fe8517907f3de65db293000dd41e572d4c4724f39daea6a3ef'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:58:00.681996', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 
'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b9479f06-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.853922026, 'message_signature': '92edd8a22433c2a6d767825168ecfc2531e7e64084da7f5a1f848ef067ea057d'}]}, 'timestamp': '2025-11-28 09:58:00.693936', '_unique_id': 'b882e977d67c4ec68fd151d601a6e6c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:58:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:58:00.694 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:58:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:58:00.694 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.694 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.696 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.696 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '34d6eebd-5fc4-4b6d-a541-8705ac5f7516', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:58:00.696196', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'b9480810-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.846130737, 'message_signature': 'e162aaf200bc3d0077d97dd446a2f709a0cbc61936a73a99840d1ac6889c37f8'}]}, 'timestamp': '2025-11-28 09:58:00.696656', '_unique_id': '29f7628d3fc743ca910cf6a627c69d66'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.697 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.698 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.725 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.725 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe8ef821-c176-4882-8d32-50dfd33104a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:58:00.698720', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b94c7332-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.870612158, 'message_signature': 'dbc558f696527d8f084c2ab64ee0131bb260e7f27f3b8e62e8faaabb151f67c2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:58:00.698720', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b94c85ca-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.870612158, 'message_signature': '4fe48d5f97754c26ad641859eeed150e4e84d9b572419b4b4e7b9f36b3afd3cc'}]}, 'timestamp': '2025-11-28 09:58:00.726109', '_unique_id': 'f8556b5347944ed08e9b2391afbd1815'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.727 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.728 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.728 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b2472f5-92bf-49a9-a819-2138c336d763', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:58:00.728447', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'b94cf4e2-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.846130737, 'message_signature': '27df0f1b7f3a9f9500e203dfcf792897de844d89723c232d853eae1009895fe0'}]}, 'timestamp': '2025-11-28 09:58:00.728940', '_unique_id': '270e735ae67c4c0dbdad1a2049c37ea4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.729 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.731 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.731 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.731 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73e9fc56-eda6-4836-bc4c-0126dbc960df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:58:00.731336', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'b94d6486-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.846130737, 'message_signature': '6ab8ad9eec20310ebf532f91275d3f62253e5b97be7385e864bdcf198ec1d786'}]}, 'timestamp': '2025-11-28 09:58:00.731823', '_unique_id': '9c02f0c9021244858cbae5b995dd6741'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:58:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.732 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.733 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.734 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1439344424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.734 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 63315481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d3ab2c2-9b03-42a5-94ff-8b1dc22613dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1439344424, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:58:00.733993', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b94dcdd6-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.870612158, 'message_signature': '0139bb7300224533ba780de2ff3b4fabef3196ff85b6c2806f9db4af360de88b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63315481, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:58:00.733993', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b94dddf8-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.870612158, 'message_signature': '465bc5e0047df21aef6279323d3d54b85d07b0f29fb85af358fd8b9fef037687'}]}, 'timestamp': '2025-11-28 09:58:00.734902', '_unique_id': 'fb325843f531460dbd664aa361cabe07'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:58:00.735 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:58:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:58:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.735 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.736 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.737 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.737 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd7e4d95b-6010-4b62-baac-d58fed0f0551', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:58:00.737150', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b94e4798-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.870612158, 'message_signature': '5d77eeac4e27d087544f07e1f3a0182999d9f349e6ae2418c202b6becce2cf0a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:58:00.737150', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b94e579c-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.870612158, 'message_signature': '33dd6ad122080d7714ce61f313af9d26b5367cbc1cf39a15c082c8263fdd4eaa'}]}, 'timestamp': '2025-11-28 09:58:00.738010', '_unique_id': '2f56664fea074f8eab1f08e583d32a05'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:58:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.738 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.740 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.740 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.740 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '65f712cb-9ccc-443b-91ba-6f84f2ac38f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:58:00.740250', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b94ec204-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.870612158, 'message_signature': '2803d2bdd5543b629b210e6ae5ebecbecbdff726724f83db8cc1f4cc5ae88063'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:58:00.740250', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b94ed212-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.870612158, 'message_signature': '0647523d875c7825f0d65249d12b689f0a8c11d345d160adbf56358f498597a3'}]}, 'timestamp': '2025-11-28 09:58:00.741176', '_unique_id': '2f9e46dcd610456983d2b6d71aea741b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.742 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.743 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.761 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 51.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '735547e4-b61c-4184-bf7d-564485fa2c8c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.671875, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:58:00.743360', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'b95205f4-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.933456905, 'message_signature': 'efc781793bc830662eea4c69bb40c0ac2443cedc693e965f1a607af9348a9d1b'}]}, 'timestamp': '2025-11-28 09:58:00.762162', '_unique_id': 'b0329057ddaa4646aff08f86371358de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.763 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.764 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.764 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.764 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c17ce39a-5727-47f4-8a50-4c43594b536b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:58:00.764312', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b9526e0e-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.853922026, 'message_signature': '388e6c287398fa6697d56d952b00d04327e40f56b9f01c1de93a50bb15d1c756'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:58:00.764312', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b9528074-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.853922026, 'message_signature': '93290285964464c84d7152c1613c2999ffe0c864eb0bd2039dd714d37141c938'}]}, 'timestamp': '2025-11-28 09:58:00.765250', '_unique_id': 'f60e219ddfbd4a78a8cb06543b88dc6e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call
last): Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.766 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.767 12 INFO ceilometer.polling.manager [-] Polling pollster 
disk.device.write.latency in the context of pollsters Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.767 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 1124743927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.768 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 22218411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a02bc1b8-4b58-4494-8456-cd362ef9c92f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1124743927, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:58:00.767662', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 
'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b952f004-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.870612158, 'message_signature': '69cdf7ae2931186562410179cbd5e9e4b49a7ae029d5e734a20ba68538b5a28e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22218411, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:58:00.767662', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b95301de-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.870612158, 'message_signature': '3c737cb06a3282e1b9a60cfb280c79133bf6726a2e5bc06f3d6cb7ecfaab1017'}]}, 'timestamp': '2025-11-28 09:58:00.768585', '_unique_id': 'e2aa3816ce0b42f5a4790d0b8e35d8d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:58:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.769 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.770 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.770 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 14550000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e17382f6-e00f-42fd-b13b-e7e69b91d29d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14550000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T09:58:00.770816', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'b9536bb0-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.933456905, 'message_signature': '2b3b86cc4341f2baf3e94864e0d0f5ad0101aed18d3c0d62ed44dd13eb740dc7'}]}, 'timestamp': '2025-11-28 09:58:00.771283', '_unique_id': '51ca6bbff7c248d5abede5120a017b6f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.772 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.773 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.773 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '5362d2af-d200-4f95-8151-d8bed4d7f17d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:58:00.773372', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'b953ce34-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.846130737, 'message_signature': '73114aff2dfc8d7a8ff38270bd8fc3c4c319553e1f9099fce45ce2c2c770c96f'}]}, 'timestamp': '2025-11-28 09:58:00.773813', '_unique_id': '5830643199cd40c88f4a723b637c099d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:58:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.774 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.775 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.775 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e58ef211-a45c-43b7-8930-9cc2cffc6094', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:58:00.775824', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'b9542f0a-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.846130737, 'message_signature': '982a4fd72d7d871fac64f3aeca234022b1b88ce7ccd0a02c23b032d5a30a0aed'}]}, 'timestamp': '2025-11-28 09:58:00.776293', '_unique_id': '5aaacfddad3849db937ea3c207a98959'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.777 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.778 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.778 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.778 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '581260ba-a6f1-460c-942f-ebd38a9592d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:58:00.778330', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b9548fc2-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.870612158, 'message_signature': '6f47e2343bfc887fbcafddb1573dce07a17766248b9dd174e69d94c200086461'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:58:00.778330', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b9549f3a-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.870612158, 'message_signature': 'f07d07f12eb6352d6cf434e728ba10520c586f7338546b87094eb16d928ee768'}]}, 'timestamp': '2025-11-28 09:58:00.779166', '_unique_id': '34559702dc9b4854affaa68d236c516c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 04:58:00 localhost
ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:58:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.780 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.781 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.781 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9bba8455-c7ac-46d3-9237-f943ed1bff05', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:58:00.781240', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'b95501be-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.846130737, 'message_signature': '39d8e722c5054b99ff98c021bcab27398bef4428b533ca4a574815344a3243de'}]}, 'timestamp': '2025-11-28 09:58:00.781685', '_unique_id': 'fa43228f6fd24ab4b42bd6226b33df5a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:58:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:58:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.782 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.783 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.783 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.784 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '52c1552a-87d5-4c0f-9fa4-a2d1b7f6c1fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T09:58:00.783697', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b9556136-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.853922026, 'message_signature': '7485c19660a8373d7e8c245726bb6b52959fce92a76f2fedf4c306b56def12f4'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T09:58:00.783697', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b9557220-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.853922026, 'message_signature': '3d40489de150342ae5a8b6261a6a08154afe3833d8852a1846ba4534426a06e1'}]}, 'timestamp': '2025-11-28 09:58:00.784559', '_unique_id': 'c84b6963a25444a4900f00cc2542bba3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:58:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:58:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.785 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.786 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.787 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73fbef90-9330-4d7b-b0a2-7cbadbe5d5cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:58:00.786999', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'b955e430-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.846130737, 'message_signature': '321742c1e55756f229147e6e1f98c930a2e536abdd1376bd39ac21c2cc61212d'}]}, 'timestamp': '2025-11-28 09:58:00.787488', '_unique_id': 'faa1bf61e93b4810804ccbb9139b8548'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 877, in _connection_factory Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 
ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:58:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.788 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.789 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.789 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e2b1a205-7d28-40a7-a2c3-f68a0d2e6833', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:58:00.789521', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'b9564510-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.846130737, 'message_signature': '0f0c67f84fc5a12ab2e06e52ff8c5774d870d0e36ddb1f01f2ae188bf29b3580'}]}, 'timestamp': '2025-11-28 09:58:00.789962', '_unique_id': '25f0cb0f2cbb4d0080a1a7bbe3ef7f39'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:58:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:58:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.790 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.791 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.792 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b7772d74-c8b1-4d77-8ffa-eed95b26a73e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T09:58:00.792127', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'b956ab04-cc40-11f0-a370-fa163eb02593', 'monotonic_time': 11714.846130737, 'message_signature': '75ce98269882f3fe13da93df27fe2416f8194996238ae34f57326c3b4ebfb54d'}]}, 'timestamp': '2025-11-28 09:58:00.792595', '_unique_id': 'babf634c34e74e4dbcc0d6387fef2267'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging yield Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
09:58:00.793 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 04:58:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 04:58:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.793 12 ERROR oslo_messaging.notify.messaging Nov 28 04:58:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 09:58:00.794 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 04:58:01 localhost nova_compute[279673]: 2025-11-28 09:58:01.003 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:58:02 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 28 04:58:02 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.200:0/2455933958' entity='client.admin' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 28 04:58:04 localhost nova_compute[279673]: 2025-11-28 09:58:04.170 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:58:04 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:58:04 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Nov 28 04:58:05 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:58:05 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mgr fail"} v 0) Nov 28 04:58:05 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='client.? 172.18.0.200:0/1630948378' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 28 04:58:05 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e91 do_prune osdmap full prune enabled Nov 28 04:58:05 localhost ceph-mon[292954]: log_channel(cluster) log [INF] : Activating manager daemon np0005538515.yfkzhl Nov 28 04:58:05 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 e92: 6 total, 6 up, 6 in Nov 28 04:58:05 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e92: 6 total, 6 up, 6 in Nov 28 04:58:05 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='client.? 
172.18.0.200:0/1630948378' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Nov 28 04:58:05 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e41: np0005538515.yfkzhl(active, starting, since 0.0380796s), standbys: np0005538513.dsfdlx Nov 28 04:58:05 localhost ceph-mon[292954]: log_channel(cluster) log [INF] : Manager daemon np0005538515.yfkzhl is now available Nov 28 04:58:05 localhost systemd[1]: session-71.scope: Deactivated successfully. Nov 28 04:58:05 localhost systemd[1]: session-71.scope: Consumed 10.450s CPU time. Nov 28 04:58:05 localhost systemd-logind[764]: Session 71 logged out. Waiting for processes to exit. Nov 28 04:58:05 localhost systemd-logind[764]: Removed session 71. Nov 28 04:58:05 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/mirror_snapshot_schedule"} v 0) Nov 28 04:58:05 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/mirror_snapshot_schedule"} : dispatch Nov 28 04:58:05 localhost ceph-mon[292954]: from='mgr.34348 172.18.0.107:0/1408760265' entity='mgr.np0005538514.djozup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 04:58:05 localhost ceph-mon[292954]: from='mgr.34348 ' entity='mgr.np0005538514.djozup' Nov 28 04:58:05 localhost ceph-mon[292954]: from='client.? 172.18.0.200:0/1630948378' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Nov 28 04:58:05 localhost ceph-mon[292954]: Activating manager daemon np0005538515.yfkzhl Nov 28 04:58:05 localhost ceph-mon[292954]: from='client.? 
172.18.0.200:0/1630948378' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Nov 28 04:58:05 localhost ceph-mon[292954]: Manager daemon np0005538515.yfkzhl is now available Nov 28 04:58:05 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/mirror_snapshot_schedule"} : dispatch Nov 28 04:58:05 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/mirror_snapshot_schedule"} : dispatch Nov 28 04:58:05 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/trash_purge_schedule"} v 0) Nov 28 04:58:05 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/trash_purge_schedule"} : dispatch Nov 28 04:58:05 localhost sshd[305576]: main: sshd: ssh-rsa algorithm is disabled Nov 28 04:58:05 localhost systemd-logind[764]: New session 72 of user ceph-admin. Nov 28 04:58:05 localhost systemd[1]: Started Session 72 of User ceph-admin. 
Nov 28 04:58:06 localhost nova_compute[279673]: 2025-11-28 09:58:06.038 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:58:06 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e42: np0005538515.yfkzhl(active, since 1.05546s), standbys: np0005538513.dsfdlx Nov 28 04:58:06 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/trash_purge_schedule"} : dispatch Nov 28 04:58:06 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005538515.yfkzhl/trash_purge_schedule"} : dispatch Nov 28 04:58:06 localhost podman[305689]: 2025-11-28 09:58:06.980115969 +0000 UTC m=+0.092917750 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, version=7, GIT_CLEAN=True, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.buildah.version=1.33.12, release=553, build-date=2025-09-24T08:57:55, com.redhat.license_terms=https://www.redhat.com/agreements, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, vcs-type=git, ceph=True, com.redhat.component=rhceph-container, name=rhceph, 
io.openshift.tags=rhceph ceph, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Nov 28 04:58:07 localhost podman[305689]: 2025-11-28 09:58:07.082931362 +0000 UTC m=+0.195733153 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, com.redhat.component=rhceph-container, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, com.redhat.license_terms=https://www.redhat.com/agreements, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vendor=Red Hat, Inc., GIT_CLEAN=True, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, release=553, name=rhceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.buildah.version=1.33.12, vcs-type=git) Nov 28 04:58:07 localhost ceph-mon[292954]: log_channel(cluster) log [INF] : Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm) Nov 28 04:58:07 localhost ceph-mon[292954]: log_channel(cluster) log [INF] : Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm) Nov 28 04:58:07 localhost ceph-mon[292954]: log_channel(cluster) log [INF] : Cluster is now healthy Nov 28 04:58:07 localhost ceph-mon[292954]: mon.np0005538513@0(leader) 
e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0) Nov 28 04:58:07 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:58:07 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0) Nov 28 04:58:07 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:58:07 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0) Nov 28 04:58:07 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:58:07 localhost ceph-mon[292954]: [28/Nov/2025:09:58:06] ENGINE Bus STARTING Nov 28 04:58:07 localhost ceph-mon[292954]: [28/Nov/2025:09:58:06] ENGINE Serving on http://172.18.0.108:8765 Nov 28 04:58:07 localhost ceph-mon[292954]: [28/Nov/2025:09:58:07] ENGINE Serving on https://172.18.0.108:7150 Nov 28 04:58:07 localhost ceph-mon[292954]: [28/Nov/2025:09:58:07] ENGINE Bus STARTED Nov 28 04:58:07 localhost ceph-mon[292954]: [28/Nov/2025:09:58:07] ENGINE Client ('172.18.0.108', 56132) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Nov 28 04:58:07 localhost ceph-mon[292954]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm) Nov 28 04:58:07 localhost ceph-mon[292954]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm) Nov 28 04:58:07 localhost ceph-mon[292954]: Cluster is now healthy Nov 28 04:58:07 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:58:07 localhost ceph-mon[292954]: from='mgr.34481 
' entity='mgr.np0005538515.yfkzhl' Nov 28 04:58:07 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:58:07 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0) Nov 28 04:58:07 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0) Nov 28 04:58:07 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:58:07 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:58:07 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0) Nov 28 04:58:07 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:58:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 04:58:08 localhost systemd[1]: tmp-crun.oOiKjb.mount: Deactivated successfully. 
Nov 28 04:58:08 localhost podman[305841]: 2025-11-28 09:58:08.032046397 +0000 UTC m=+0.092748545 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7) Nov 28 04:58:08 localhost podman[305841]: 2025-11-28 09:58:08.048221452 +0000 UTC m=+0.108923600 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-type=git, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350) Nov 28 04:58:08 localhost systemd[1]: 
a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. Nov 28 04:58:08 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:58:08 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:58:08 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:58:08 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e43: np0005538515.yfkzhl(active, since 3s), standbys: np0005538513.dsfdlx Nov 28 04:58:08 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0) Nov 28 04:58:08 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:58:08 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0) Nov 28 04:58:08 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:58:08 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0) Nov 28 04:58:08 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Nov 28 04:58:08 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0) Nov 28 04:58:08 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Nov 28 04:58:08 localhost ceph-mon[292954]: 
mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Nov 28 04:58:09 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0) Nov 28 04:58:09 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:58:09 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0) Nov 28 04:58:09 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:58:09 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0) Nov 28 04:58:09 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Nov 28 04:58:09 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0) Nov 28 04:58:09 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Nov 28 04:58:09 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Nov 28 04:58:09 localhost nova_compute[279673]: 2025-11-28 09:58:09.173 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:58:09 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, 
key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0) Nov 28 04:58:09 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:58:09 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0) Nov 28 04:58:09 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:58:09 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0) Nov 28 04:58:09 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Nov 28 04:58:09 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0) Nov 28 04:58:09 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Nov 28 04:58:09 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Nov 28 04:58:09 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:58:09 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:58:09 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:58:09 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Nov 28 04:58:09 localhost 
ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 04:58:09 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 04:58:09 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 04:58:09 localhost ceph-mon[292954]: Adjusting osd_memory_target on np0005538514.localdomain to 836.6M
Nov 28 04:58:09 localhost ceph-mon[292954]: Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 04:58:09 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 04:58:09 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 04:58:09 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 04:58:09 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 04:58:09 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 04:58:09 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 04:58:09 localhost ceph-mon[292954]: Adjusting osd_memory_target on np0005538515.localdomain to 836.6M
Nov 28 04:58:09 localhost ceph-mon[292954]: Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 04:58:09 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 04:58:09 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 04:58:09 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 04:58:09 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 04:58:09 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 04:58:09 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 04:58:09 localhost ceph-mon[292954]: Adjusting osd_memory_target on np0005538513.localdomain to 836.6M
Nov 28 04:58:09 localhost ceph-mon[292954]: Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 04:58:09 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 04:58:09 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.conf
Nov 28 04:58:09 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.conf
Nov 28 04:58:09 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.conf
Nov 28 04:58:10 localhost podman[238687]: time="2025-11-28T09:58:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 04:58:10 localhost podman[238687]: @ - - [28/Nov/2025:09:58:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 28 04:58:10 localhost podman[238687]: @ - - [28/Nov/2025:09:58:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18743 "" "Go-http-client/1.1"
Nov 28 04:58:10 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : Standby manager daemon np0005538514.djozup started
Nov 28 04:58:11 localhost nova_compute[279673]: 2025-11-28 09:58:11.042 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:58:11 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 04:58:11 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 04:58:11 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.conf
Nov 28 04:58:11 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e44: np0005538515.yfkzhl(active, since 5s), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 04:58:11 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 04:58:11 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 04:58:11 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 04:58:11 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 04:58:12 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 04:58:12 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 04:58:12 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 04:58:12 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 04:58:12 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 04:58:12 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 04:58:12 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 04:58:12 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 04:58:12 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 04:58:12 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 04:58:12 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 04:58:12 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 04:58:12 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/etc/ceph/ceph.client.admin.keyring
Nov 28 04:58:12 localhost ceph-mon[292954]: Updating np0005538514.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 04:58:12 localhost ceph-mon[292954]: Updating np0005538515.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 04:58:12 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 04:58:12 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 04:58:12 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 04:58:12 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 04:58:12 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 04:58:12 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 04:58:12 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 04:58:12 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 04:58:12 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 04:58:13 localhost ceph-mon[292954]: Updating np0005538513.localdomain:/var/lib/ceph/2c5417c9-00eb-57d5-a565-ddecbc7995c1/config/ceph.client.admin.keyring
Nov 28 04:58:13 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 04:58:13 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 04:58:13 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 04:58:13 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1746112093' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 04:58:13 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 04:58:13 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1746112093' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 04:58:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 04:58:13 localhost systemd[1]: tmp-crun.dAdJv5.mount: Deactivated successfully.
Nov 28 04:58:13 localhost podman[306625]: 2025-11-28 09:58:13.872357412 +0000 UTC m=+0.103883998 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Nov 28 04:58:13 localhost podman[306625]: 2025-11-28 09:58:13.886516206 +0000 UTC m=+0.118042772 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Nov 28 04:58:13 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 04:58:14 localhost nova_compute[279673]: 2025-11-28 09:58:14.209 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:58:14 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 04:58:15 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 04:58:15 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 04:58:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 04:58:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 04:58:15 localhost podman[306648]: 2025-11-28 09:58:15.855731313 +0000 UTC m=+0.082841671 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 04:58:15 localhost podman[306649]: 2025-11-28 09:58:15.948433326 +0000 UTC m=+0.173848542 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 28 04:58:15 localhost podman[306648]: 2025-11-28 09:58:15.959438974 +0000 UTC m=+0.186549352 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 28 04:58:15 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 04:58:15 localhost podman[306649]: 2025-11-28 09:58:15.98641103 +0000 UTC m=+0.211826206 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 28 04:58:16 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 04:58:16 localhost nova_compute[279673]: 2025-11-28 09:58:16.045 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:58:16 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 04:58:18 localhost openstack_network_exporter[240658]: ERROR 09:58:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 04:58:18 localhost openstack_network_exporter[240658]: ERROR 09:58:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 04:58:18 localhost openstack_network_exporter[240658]: ERROR 09:58:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 04:58:18 localhost openstack_network_exporter[240658]:
Nov 28 04:58:18 localhost openstack_network_exporter[240658]: ERROR 09:58:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 04:58:18 localhost openstack_network_exporter[240658]: ERROR 09:58:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 04:58:18 localhost openstack_network_exporter[240658]:
Nov 28 04:58:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 04:58:18 localhost podman[306690]: 2025-11-28 09:58:18.855714658 +0000 UTC m=+0.093470097 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible)
Nov 28 04:58:18 localhost podman[306690]: 2025-11-28 09:58:18.87041041 +0000 UTC m=+0.108165819 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 04:58:18 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 04:58:19 localhost nova_compute[279673]: 2025-11-28 09:58:19.233 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:58:19 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 04:58:21 localhost nova_compute[279673]: 2025-11-28 09:58:21.046 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:58:24 localhost nova_compute[279673]: 2025-11-28 09:58:24.259 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:58:24 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 04:58:26 localhost nova_compute[279673]: 2025-11-28 09:58:26.089 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:58:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 04:58:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 04:58:28 localhost podman[306710]: 2025-11-28 09:58:28.846690547 +0000 UTC m=+0.077309962 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Nov 28 04:58:28 localhost podman[306710]: 2025-11-28 09:58:28.857414026 +0000 UTC m=+0.088033471 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 04:58:28 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 04:58:28 localhost podman[306711]: 2025-11-28 09:58:28.909135641 +0000 UTC m=+0.136269150 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 28 04:58:28 localhost podman[306711]: 2025-11-28 09:58:28.946511157 +0000 UTC m=+0.173644686 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 04:58:28 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 04:58:29 localhost nova_compute[279673]: 2025-11-28 09:58:29.263 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:58:29 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 04:58:31 localhost nova_compute[279673]: 2025-11-28 09:58:31.092 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:58:34 localhost nova_compute[279673]: 2025-11-28 09:58:34.444 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:58:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 04:58:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7800.1 total, 600.0 interval#012Cumulative writes: 5057 writes, 22K keys, 5057 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5057 writes, 683 syncs, 7.40 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 86 writes, 299 keys, 86 commit groups, 1.0 writes per commit group, ingest: 0.37 MB, 0.00 MB/s#012Interval WAL: 86 writes, 39 syncs, 2.21 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 04:58:34 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 04:58:34 localhost nova_compute[279673]: 2025-11-28 09:58:34.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 04:58:35 localhost nova_compute[279673]: 2025-11-28 09:58:35.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 04:58:35 localhost nova_compute[279673]: 2025-11-28 09:58:35.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 04:58:35 localhost nova_compute[279673]: 2025-11-28 09:58:35.772 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 04:58:35 localhost nova_compute[279673]: 2025-11-28 09:58:35.772 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 04:58:35 localhost nova_compute[279673]: 2025-11-28 09:58:35.772 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 04:58:36 localhost nova_compute[279673]: 2025-11-28 09:58:36.121 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:58:36 localhost nova_compute[279673]: 2025-11-28 09:58:36.768 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 04:58:37 localhost nova_compute[279673]: 2025-11-28 09:58:37.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 04:58:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 04:58:38 localhost podman[306751]: 2025-11-28 09:58:38.839251722 +0000 UTC m=+0.079704896 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, config_id=edpm, io.buildah.version=1.33.7, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, managed_by=edpm_ansible, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 04:58:38 localhost podman[306751]: 2025-11-28 09:58:38.882441616 +0000 UTC m=+0.122894750 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-type=git, config_id=edpm, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 28 04:58:38 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 04:58:39 localhost nova_compute[279673]: 2025-11-28 09:58:39.296 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:58:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 28 04:58:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7800.3 total, 600.0 interval#012Cumulative writes: 5847 writes, 25K keys, 5847 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5847 writes, 861 syncs, 6.79 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 165 writes, 340 keys, 165 commit groups, 1.0 writes per commit group, ingest: 0.33 MB, 0.00 MB/s#012Interval WAL: 165 writes, 82 syncs, 2.01 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 28 04:58:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 04:58:39 localhost nova_compute[279673]: 2025-11-28 09:58:39.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 04:58:39 localhost nova_compute[279673]: 2025-11-28 09:58:39.787 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 04:58:39 localhost nova_compute[279673]: 2025-11-28 09:58:39.787 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 04:58:39 localhost nova_compute[279673]: 2025-11-28 09:58:39.788 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 04:58:39 localhost nova_compute[279673]: 2025-11-28 09:58:39.788 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 04:58:39 localhost nova_compute[279673]: 2025-11-28 09:58:39.789 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 04:58:40 localhost podman[238687]: time="2025-11-28T09:58:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 04:58:40 localhost podman[238687]: @ - - [28/Nov/2025:09:58:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1"
Nov 28 04:58:40 localhost podman[238687]: @ - - [28/Nov/2025:09:58:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18739 "" "Go-http-client/1.1"
Nov 28 04:58:40 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 04:58:40 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4221705048' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 04:58:40 localhost nova_compute[279673]: 2025-11-28 09:58:40.242 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 04:58:40 localhost nova_compute[279673]: 2025-11-28 09:58:40.300 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Nov 28 04:58:40 localhost nova_compute[279673]: 2025-11-28 09:58:40.300 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Nov 28 04:58:40 localhost nova_compute[279673]: 2025-11-28 09:58:40.522 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 04:58:40 localhost nova_compute[279673]: 2025-11-28 09:58:40.523 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11739MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 04:58:40 localhost nova_compute[279673]: 2025-11-28 09:58:40.524 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 04:58:40 localhost nova_compute[279673]: 2025-11-28 09:58:40.524 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 04:58:40 localhost nova_compute[279673]: 2025-11-28 09:58:40.594 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 28 04:58:40 localhost nova_compute[279673]: 2025-11-28 09:58:40.595 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 04:58:40 localhost nova_compute[279673]: 2025-11-28 09:58:40.595 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 04:58:40 localhost nova_compute[279673]: 2025-11-28 09:58:40.633 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 04:58:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 04:58:41 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1527944200' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 04:58:41 localhost nova_compute[279673]: 2025-11-28 09:58:41.081 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 04:58:41 localhost nova_compute[279673]: 2025-11-28 09:58:41.088 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 04:58:41 localhost nova_compute[279673]: 2025-11-28 09:58:41.104 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 04:58:41 localhost nova_compute[279673]: 2025-11-28 09:58:41.107 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 04:58:41 localhost nova_compute[279673]: 2025-11-28 09:58:41.107 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.583s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 04:58:41 localhost nova_compute[279673]: 2025-11-28 09:58:41.123 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:58:44 localhost nova_compute[279673]: 2025-11-28 09:58:44.336 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:58:44 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 04:58:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 04:58:44 localhost systemd[1]: tmp-crun.NyINIH.mount: Deactivated successfully.
Nov 28 04:58:44 localhost podman[306815]: 2025-11-28 09:58:44.829135733 +0000 UTC m=+0.070742510 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 04:58:44 localhost podman[306815]: 2025-11-28 09:58:44.865348884 +0000 UTC m=+0.106955661 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 04:58:44 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 04:58:46 localhost nova_compute[279673]: 2025-11-28 09:58:46.110 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 04:58:46 localhost nova_compute[279673]: 2025-11-28 09:58:46.110 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 04:58:46 localhost nova_compute[279673]: 2025-11-28 09:58:46.111 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 04:58:46 localhost nova_compute[279673]: 2025-11-28 09:58:46.173 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 04:58:46 localhost nova_compute[279673]: 2025-11-28 09:58:46.230 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 04:58:46 localhost nova_compute[279673]: 2025-11-28 09:58:46.230 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 04:58:46 localhost nova_compute[279673]: 2025-11-28 09:58:46.230 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 28 04:58:46 localhost nova_compute[279673]: 2025-11-28 09:58:46.231 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 04:58:46 localhost nova_compute[279673]: 2025-11-28 09:58:46.614 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 04:58:46 localhost nova_compute[279673]: 2025-11-28 09:58:46.628 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 04:58:46 localhost nova_compute[279673]: 2025-11-28 09:58:46.628 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 28 04:58:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 04:58:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 04:58:46 localhost podman[306839]: 2025-11-28 09:58:46.846033482 +0000 UTC m=+0.083345107 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 04:58:46 localhost systemd[1]: tmp-crun.gXDztf.mount: Deactivated successfully.
Nov 28 04:58:46 localhost podman[306840]: 2025-11-28 09:58:46.905937078 +0000 UTC m=+0.139750126 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 28 04:58:46 localhost podman[306839]: 2025-11-28 09:58:46.914566143 +0000 UTC 
m=+0.151877718 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller) Nov 28 04:58:46 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 04:58:46 localhost podman[306840]: 2025-11-28 09:58:46.939868219 +0000 UTC m=+0.173681287 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:58:46 localhost systemd[1]: 
ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:58:48 localhost openstack_network_exporter[240658]: ERROR 09:58:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:58:48 localhost openstack_network_exporter[240658]: ERROR 09:58:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 04:58:48 localhost openstack_network_exporter[240658]: ERROR 09:58:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:58:48 localhost openstack_network_exporter[240658]: ERROR 09:58:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 04:58:48 localhost openstack_network_exporter[240658]: Nov 28 04:58:48 localhost openstack_network_exporter[240658]: ERROR 09:58:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 04:58:48 localhost openstack_network_exporter[240658]: Nov 28 04:58:49 localhost nova_compute[279673]: 2025-11-28 09:58:49.379 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:58:49 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:58:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. 
Nov 28 04:58:49 localhost podman[306882]: 2025-11-28 09:58:49.847572034 +0000 UTC m=+0.084379148 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute) Nov 28 04:58:49 localhost podman[306882]: 2025-11-28 09:58:49.860423759 +0000 UTC m=+0.097230933 container exec_died 
1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Nov 28 04:58:49 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 04:58:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:58:50.839 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:58:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:58:50.839 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:58:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:58:50.840 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:58:51 localhost nova_compute[279673]: 2025-11-28 09:58:51.174 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:58:54 localhost nova_compute[279673]: 2025-11-28 09:58:54.429 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:58:54 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:58:56 localhost nova_compute[279673]: 2025-11-28 09:58:56.198 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:58:59 localhost nova_compute[279673]: 2025-11-28 09:58:59.432 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:58:59 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:58:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 04:58:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 04:58:59 localhost podman[306901]: 2025-11-28 09:58:59.852301428 +0000 UTC m=+0.087973880 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', 
'/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 04:58:59 localhost podman[306901]: 2025-11-28 09:58:59.863347286 +0000 UTC m=+0.099019758 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 04:58:59 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 04:58:59 localhost podman[306902]: 2025-11-28 09:58:59.957190574 +0000 UTC m=+0.189651137 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3) Nov 28 04:58:59 localhost podman[306902]: 2025-11-28 09:58:59.994417105 +0000 UTC m=+0.226877678 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 28 04:59:00 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 04:59:01 localhost nova_compute[279673]: 2025-11-28 09:59:01.201 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:04 localhost nova_compute[279673]: 2025-11-28 09:59:04.457 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:04 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:59:06 localhost nova_compute[279673]: 2025-11-28 09:59:06.238 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:09 localhost nova_compute[279673]: 2025-11-28 09:59:09.459 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:09 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:59:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 04:59:09 localhost systemd[1]: tmp-crun.g51wvr.mount: Deactivated successfully. 
Nov 28 04:59:09 localhost podman[306942]: 2025-11-28 09:59:09.858247797 +0000 UTC m=+0.093339824 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, release=1755695350, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., 
url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, distribution-scope=public, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Nov 28 04:59:09 localhost podman[306942]: 2025-11-28 09:59:09.898514501 +0000 UTC m=+0.133606538 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.openshift.tags=minimal rhel9, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Nov 28 04:59:09 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 04:59:10 localhost podman[238687]: time="2025-11-28T09:59:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 04:59:10 localhost podman[238687]: @ - - [28/Nov/2025:09:59:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 28 04:59:10 localhost podman[238687]: @ - - [28/Nov/2025:09:59:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18737 "" "Go-http-client/1.1" Nov 28 04:59:11 localhost nova_compute[279673]: 2025-11-28 09:59:11.241 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:13 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 28 04:59:13 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4226605833' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 28 04:59:13 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 28 04:59:13 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/4226605833' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 28 04:59:13 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Nov 28 04:59:13 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:59:14 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 04:59:14 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:59:14 localhost nova_compute[279673]: 2025-11-28 09:59:14.462 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:14 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:59:15 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Nov 28 04:59:15 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:59:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. 
Nov 28 04:59:15 localhost podman[307049]: 2025-11-28 09:59:15.854268477 +0000 UTC m=+0.087861795 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 04:59:15 localhost podman[307049]: 2025-11-28 09:59:15.891505589 +0000 UTC m=+0.125098847 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 28 04:59:15 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 04:59:16 localhost nova_compute[279673]: 2025-11-28 09:59:16.271 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:16 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 04:59:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 04:59:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:59:17 localhost podman[307072]: 2025-11-28 09:59:17.851126282 +0000 UTC m=+0.083170622 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3) Nov 28 04:59:17 localhost podman[307073]: 2025-11-28 09:59:17.935265602 +0000 UTC m=+0.162412191 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes 
Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent) Nov 28 04:59:17 localhost podman[307073]: 2025-11-28 09:59:17.94365506 +0000 UTC m=+0.170801669 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 04:59:17 localhost podman[307072]: 2025-11-28 09:59:17.951453518 +0000 UTC m=+0.183497848 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125) Nov 28 04:59:17 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:59:17 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 04:59:18 localhost openstack_network_exporter[240658]: ERROR 09:59:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:59:18 localhost openstack_network_exporter[240658]: ERROR 09:59:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:59:18 localhost openstack_network_exporter[240658]: ERROR 09:59:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 04:59:18 localhost openstack_network_exporter[240658]: ERROR 09:59:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 04:59:18 localhost openstack_network_exporter[240658]: Nov 28 04:59:18 localhost openstack_network_exporter[240658]: ERROR 09:59:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 04:59:18 localhost openstack_network_exporter[240658]: Nov 28 04:59:19 localhost nova_compute[279673]: 2025-11-28 09:59:19.488 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:19 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:59:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 04:59:20 localhost systemd[1]: tmp-crun.A3VpAz.mount: Deactivated successfully. 
Nov 28 04:59:20 localhost podman[307114]: 2025-11-28 09:59:20.85545841 +0000 UTC m=+0.092514798 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3) Nov 28 04:59:20 localhost podman[307114]: 2025-11-28 09:59:20.865397705 +0000 UTC m=+0.102454063 container exec_died 
1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS) Nov 28 04:59:20 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 04:59:21 localhost nova_compute[279673]: 2025-11-28 09:59:21.275 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0. Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.799368) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34 Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323962799406, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 2315, "num_deletes": 253, "total_data_size": 3904180, "memory_usage": 4047024, "flush_reason": "Manual Compaction"} Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323962818538, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 3664999, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20568, "largest_seqno": 22882, "table_properties": {"data_size": 3655199, "index_size": 6049, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2693, "raw_key_size": 22841, "raw_average_key_size": 21, "raw_value_size": 3634616, "raw_average_value_size": 3451, "num_data_blocks": 261, "num_entries": 1053, "num_filter_entries": 1053, "num_deletions": 252, "num_merge_operands": 0, 
"num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323814, "oldest_key_time": 1764323814, "file_creation_time": 1764323962, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}} Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 19210 microseconds, and 8585 cpu microseconds. Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.818576) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 3664999 bytes OK Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.818598) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.820188) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.820205) EVENT_LOG_v1 {"time_micros": 1764323962820201, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.820222) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 3894106, prev total WAL file size 3894106, number of live WAL files 2. Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.820985) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131323935' seq:72057594037927935, type:22 .. 
'7061786F73003131353437' seq:0, type:0; will stop at (end) Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(3579KB)], [33(17MB)] Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323962821073, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 21675736, "oldest_snapshot_seqno": -1} Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 12154 keys, 18737772 bytes, temperature: kUnknown Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323962905501, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 18737772, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18666947, "index_size": 39364, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30405, "raw_key_size": 324272, "raw_average_key_size": 26, "raw_value_size": 18458636, "raw_average_value_size": 1518, "num_data_blocks": 1508, "num_entries": 12154, "num_filter_entries": 12154, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; 
strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764323962, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}} Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.905770) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 18737772 bytes Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.907395) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 256.5 rd, 221.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 17.2 +0.0 blob) out(17.9 +0.0 blob), read-write-amplify(11.0) write-amplify(5.1) OK, records in: 12691, records dropped: 537 output_compression: NoCompression Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.907412) EVENT_LOG_v1 {"time_micros": 1764323962907404, "job": 18, "event": "compaction_finished", "compaction_time_micros": 84502, "compaction_time_cpu_micros": 40381, "output_level": 6, "num_output_files": 1, "total_output_size": 18737772, "num_input_records": 12691, "num_output_records": 12154, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005538513/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323962907821, "job": 18, "event": "table_file_deletion", "file_number": 35} Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764323962908901, "job": 18, "event": "table_file_deletion", "file_number": 33} Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.820881) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.908923) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.908926) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.908928) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.908930) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:59:22 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-09:59:22.908932) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 04:59:24 localhost nova_compute[279673]: 2025-11-28 09:59:24.520 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:24 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:59:26 localhost nova_compute[279673]: 2025-11-28 09:59:26.302 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:29 localhost nova_compute[279673]: 2025-11-28 09:59:29.520 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:29 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:59:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 04:59:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 04:59:30 localhost systemd[1]: tmp-crun.i9rMQh.mount: Deactivated successfully. 
Nov 28 04:59:30 localhost podman[307133]: 2025-11-28 09:59:30.901841136 +0000 UTC m=+0.137905939 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 04:59:30 localhost podman[307134]: 2025-11-28 09:59:30.862411827 +0000 UTC m=+0.094708025 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Nov 28 04:59:30 localhost podman[307134]: 2025-11-28 09:59:30.946529177 +0000 UTC m=+0.178825345 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 28 04:59:30 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 04:59:30 localhost podman[307133]: 2025-11-28 09:59:30.965689115 +0000 UTC m=+0.201753888 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 04:59:30 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 04:59:31 localhost nova_compute[279673]: 2025-11-28 09:59:31.305 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:34 localhost nova_compute[279673]: 2025-11-28 09:59:34.568 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:34 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:59:34 localhost nova_compute[279673]: 2025-11-28 09:59:34.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:59:35 localhost nova_compute[279673]: 2025-11-28 09:59:35.769 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:59:35 localhost nova_compute[279673]: 2025-11-28 09:59:35.770 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 28 04:59:36 localhost nova_compute[279673]: 2025-11-28 09:59:36.326 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:36 localhost nova_compute[279673]: 2025-11-28 09:59:36.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:59:36 localhost nova_compute[279673]: 2025-11-28 09:59:36.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:59:36 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e45: np0005538515.yfkzhl(active, since 91s), standbys: np0005538513.dsfdlx, np0005538514.djozup Nov 28 04:59:37 localhost nova_compute[279673]: 2025-11-28 09:59:37.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:59:38 localhost nova_compute[279673]: 2025-11-28 09:59:38.766 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:59:39 localhost nova_compute[279673]: 2025-11-28 09:59:39.570 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:59:39 localhost nova_compute[279673]: 2025-11-28 09:59:39.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:59:39 localhost nova_compute[279673]: 2025-11-28 09:59:39.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:59:39 localhost nova_compute[279673]: 2025-11-28 09:59:39.792 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:59:39 localhost nova_compute[279673]: 2025-11-28 09:59:39.793 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:59:39 localhost nova_compute[279673]: 2025-11-28 09:59:39.793 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:59:39 localhost nova_compute[279673]: 2025-11-28 09:59:39.794 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 04:59:39 localhost nova_compute[279673]: 2025-11-28 09:59:39.794 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:59:40 localhost podman[238687]: time="2025-11-28T09:59:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 04:59:40 localhost podman[238687]: @ - - [28/Nov/2025:09:59:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153862 "" "Go-http-client/1.1" Nov 28 04:59:40 localhost podman[238687]: @ - - [28/Nov/2025:09:59:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18744 "" "Go-http-client/1.1" Nov 28 04:59:40 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 04:59:40 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/1564004805' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 04:59:40 localhost nova_compute[279673]: 2025-11-28 09:59:40.290 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:59:40 localhost nova_compute[279673]: 2025-11-28 09:59:40.492 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:59:40 localhost nova_compute[279673]: 2025-11-28 09:59:40.493 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 04:59:40 localhost nova_compute[279673]: 2025-11-28 09:59:40.710 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 04:59:40 localhost nova_compute[279673]: 2025-11-28 09:59:40.712 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11677MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": 
"7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 04:59:40 localhost nova_compute[279673]: 2025-11-28 09:59:40.713 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:59:40 localhost nova_compute[279673]: 2025-11-28 09:59:40.713 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 04:59:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 04:59:40 localhost nova_compute[279673]: 2025-11-28 09:59:40.791 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 04:59:40 localhost nova_compute[279673]: 2025-11-28 09:59:40.792 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 04:59:40 localhost nova_compute[279673]: 2025-11-28 09:59:40.792 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 04:59:40 localhost systemd[1]: tmp-crun.K88qlF.mount: Deactivated successfully. Nov 28 04:59:40 localhost nova_compute[279673]: 2025-11-28 09:59:40.846 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 04:59:40 localhost podman[307198]: 2025-11-28 09:59:40.853383216 +0000 UTC m=+0.091518948 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, vcs-type=git, managed_by=edpm_ansible, maintainer=Red Hat, Inc., version=9.6, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.expose-services=) Nov 28 04:59:40 localhost podman[307198]: 2025-11-28 09:59:40.869913333 +0000 UTC m=+0.108049065 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, distribution-scope=public, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, release=1755695350, vcs-type=git, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_id=edpm) Nov 28 04:59:40 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 04:59:41 localhost ovn_metadata_agent[158125]: 2025-11-28 09:59:40.999 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 04:59:41 localhost ovn_metadata_agent[158125]: 2025-11-28 09:59:41.000 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 28 04:59:41 localhost nova_compute[279673]: 2025-11-28 09:59:41.035 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:41 localhost nova_compute[279673]: 2025-11-28 09:59:41.328 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 04:59:41 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/986677832' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 04:59:41 localhost nova_compute[279673]: 2025-11-28 09:59:41.347 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 04:59:41 localhost nova_compute[279673]: 2025-11-28 09:59:41.354 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 04:59:41 localhost nova_compute[279673]: 2025-11-28 09:59:41.373 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 04:59:41 localhost nova_compute[279673]: 2025-11-28 09:59:41.376 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 04:59:41 localhost nova_compute[279673]: 2025-11-28 09:59:41.376 279685 DEBUG 
oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.663s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:59:43 localhost neutron_dhcp_agent[261080]: 2025-11-28 09:59:43.218 261084 INFO oslo.privsep.daemon [None req-4211c54a-c871-4dfb-b53c-cc36590e4915 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmp35q7603z/privsep.sock']#033[00m Nov 28 04:59:43 localhost neutron_dhcp_agent[261080]: 2025-11-28 09:59:43.843 261084 INFO oslo.privsep.daemon [None req-4211c54a-c871-4dfb-b53c-cc36590e4915 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Nov 28 04:59:43 localhost neutron_dhcp_agent[261080]: 2025-11-28 09:59:43.728 307245 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Nov 28 04:59:43 localhost neutron_dhcp_agent[261080]: 2025-11-28 09:59:43.735 307245 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Nov 28 04:59:43 localhost neutron_dhcp_agent[261080]: 2025-11-28 09:59:43.738 307245 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m Nov 28 04:59:43 localhost neutron_dhcp_agent[261080]: 2025-11-28 09:59:43.739 307245 INFO oslo.privsep.daemon [-] privsep daemon running as pid 307245#033[00m Nov 28 04:59:44 localhost neutron_dhcp_agent[261080]: 2025-11-28 09:59:44.387 261084 INFO oslo.privsep.daemon [None req-4211c54a-c871-4dfb-b53c-cc36590e4915 - - - - - -] Running privsep helper: 
['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpjkovtoe5/privsep.sock']#033[00m Nov 28 04:59:44 localhost nova_compute[279673]: 2025-11-28 09:59:44.602 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:44 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:59:45 localhost neutron_dhcp_agent[261080]: 2025-11-28 09:59:45.041 261084 INFO oslo.privsep.daemon [None req-4211c54a-c871-4dfb-b53c-cc36590e4915 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Nov 28 04:59:45 localhost neutron_dhcp_agent[261080]: 2025-11-28 09:59:44.934 307254 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Nov 28 04:59:45 localhost neutron_dhcp_agent[261080]: 2025-11-28 09:59:44.939 307254 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Nov 28 04:59:45 localhost neutron_dhcp_agent[261080]: 2025-11-28 09:59:44.942 307254 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m Nov 28 04:59:45 localhost neutron_dhcp_agent[261080]: 2025-11-28 09:59:44.943 307254 INFO oslo.privsep.daemon [-] privsep daemon running as pid 307254#033[00m Nov 28 04:59:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. 
Nov 28 04:59:46 localhost neutron_dhcp_agent[261080]: 2025-11-28 09:59:46.016 261084 INFO oslo.privsep.daemon [None req-4211c54a-c871-4dfb-b53c-cc36590e4915 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmptvywk7ck/privsep.sock']#033[00m Nov 28 04:59:46 localhost podman[307262]: 2025-11-28 09:59:46.082951772 +0000 UTC m=+0.067331486 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 04:59:46 localhost podman[307262]: 2025-11-28 09:59:46.089955147 +0000 UTC m=+0.074334851 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 28 04:59:46 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e92 do_prune osdmap full prune enabled Nov 28 04:59:46 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 04:59:46 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e93 e93: 6 total, 6 up, 6 in Nov 28 04:59:46 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e93: 6 total, 6 up, 6 in Nov 28 04:59:46 localhost nova_compute[279673]: 2025-11-28 09:59:46.364 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:46 localhost nova_compute[279673]: 2025-11-28 09:59:46.377 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:59:46 localhost nova_compute[279673]: 2025-11-28 09:59:46.378 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 28 04:59:46 localhost nova_compute[279673]: 2025-11-28 09:59:46.378 279685 DEBUG 
nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 28 04:59:46 localhost nova_compute[279673]: 2025-11-28 09:59:46.459 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 04:59:46 localhost nova_compute[279673]: 2025-11-28 09:59:46.460 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 04:59:46 localhost nova_compute[279673]: 2025-11-28 09:59:46.460 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 28 04:59:46 localhost nova_compute[279673]: 2025-11-28 09:59:46.461 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 04:59:46 localhost neutron_dhcp_agent[261080]: 2025-11-28 09:59:46.624 261084 INFO oslo.privsep.daemon [None req-4211c54a-c871-4dfb-b53c-cc36590e4915 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Nov 28 04:59:46 localhost neutron_dhcp_agent[261080]: 2025-11-28 09:59:46.524 307289 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Nov 28 04:59:46 localhost neutron_dhcp_agent[261080]: 2025-11-28 09:59:46.528 307289 
INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Nov 28 04:59:46 localhost neutron_dhcp_agent[261080]: 2025-11-28 09:59:46.532 307289 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m Nov 28 04:59:46 localhost neutron_dhcp_agent[261080]: 2025-11-28 09:59:46.532 307289 INFO oslo.privsep.daemon [-] privsep daemon running as pid 307289#033[00m Nov 28 04:59:47 localhost nova_compute[279673]: 2025-11-28 09:59:47.608 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 04:59:47 localhost nova_compute[279673]: 2025-11-28 
09:59:47.622 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 04:59:47 localhost nova_compute[279673]: 2025-11-28 09:59:47.623 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 28 04:59:48 localhost neutron_dhcp_agent[261080]: 2025-11-28 09:59:48.017 261084 INFO neutron.agent.linux.ip_lib [None req-4211c54a-c871-4dfb-b53c-cc36590e4915 - - - - - -] Device tap1686a1d1-ca cannot be used as it has no MAC address#033[00m Nov 28 04:59:48 localhost nova_compute[279673]: 2025-11-28 09:59:48.086 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:48 localhost openstack_network_exporter[240658]: ERROR 09:59:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:59:48 localhost openstack_network_exporter[240658]: ERROR 09:59:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 04:59:48 localhost openstack_network_exporter[240658]: ERROR 09:59:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 04:59:48 localhost kernel: device tap1686a1d1-ca entered promiscuous mode Nov 28 04:59:48 localhost ovn_controller[152322]: 2025-11-28T09:59:48Z|00071|binding|INFO|Claiming lport 1686a1d1-caea-4208-9d74-34f3140388c4 for this chassis. 
Nov 28 04:59:48 localhost nova_compute[279673]: 2025-11-28 09:59:48.097 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:48 localhost ovn_controller[152322]: 2025-11-28T09:59:48Z|00072|binding|INFO|1686a1d1-caea-4208-9d74-34f3140388c4: Claiming unknown Nov 28 04:59:48 localhost NetworkManager[5967]: [1764323988.1003] manager: (tap1686a1d1-ca): new Generic device (/org/freedesktop/NetworkManager/Devices/17) Nov 28 04:59:48 localhost openstack_network_exporter[240658]: ERROR 09:59:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 04:59:48 localhost openstack_network_exporter[240658]: Nov 28 04:59:48 localhost openstack_network_exporter[240658]: ERROR 09:59:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 04:59:48 localhost openstack_network_exporter[240658]: Nov 28 04:59:48 localhost systemd-udevd[307304]: Network interface NamePolicy= disabled on kernel command line. Nov 28 04:59:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. 
Nov 28 04:59:48 localhost ovn_controller[152322]: 2025-11-28T09:59:48Z|00073|binding|INFO|Setting lport 1686a1d1-caea-4208-9d74-34f3140388c4 ovn-installed in OVS Nov 28 04:59:48 localhost nova_compute[279673]: 2025-11-28 09:59:48.117 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:48 localhost nova_compute[279673]: 2025-11-28 09:59:48.119 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:48 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e93 do_prune osdmap full prune enabled Nov 28 04:59:48 localhost ovn_controller[152322]: 2025-11-28T09:59:48Z|00074|binding|INFO|Setting lport 1686a1d1-caea-4208-9d74-34f3140388c4 up in Southbound Nov 28 04:59:48 localhost ovn_metadata_agent[158125]: 2025-11-28 09:59:48.121 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.199.3/24', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-5d25adc8-19ca-4816-87ea-2f93f610a253', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5d25adc8-19ca-4816-87ea-2f93f610a253', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c326253de5044a60be18dcfa12e29a2c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], 
mirror_rules=[], datapath=97b765ce-e608-49a3-80f7-70d475d900a9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1686a1d1-caea-4208-9d74-34f3140388c4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 04:59:48 localhost ovn_metadata_agent[158125]: 2025-11-28 09:59:48.123 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 1686a1d1-caea-4208-9d74-34f3140388c4 in datapath 5d25adc8-19ca-4816-87ea-2f93f610a253 bound to our chassis#033[00m Nov 28 04:59:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 04:59:48 localhost ovn_metadata_agent[158125]: 2025-11-28 09:59:48.128 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 6c531ff3-3ac3-49e1-bdb8-4b1db0bd1f7e IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 28 04:59:48 localhost ovn_metadata_agent[158125]: 2025-11-28 09:59:48.129 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5d25adc8-19ca-4816-87ea-2f93f610a253, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 04:59:48 localhost ovn_metadata_agent[158125]: 2025-11-28 09:59:48.130 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[4d195c6c-b3a6-4cae-aa2d-76c2ec29afe8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 04:59:48 localhost journal[227875]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, ) Nov 28 04:59:48 localhost journal[227875]: hostname: np0005538513.localdomain Nov 28 04:59:48 localhost journal[227875]: ethtool ioctl error on tap1686a1d1-ca: No such device Nov 28 04:59:48 localhost nova_compute[279673]: 2025-11-28 
09:59:48.148 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:48 localhost journal[227875]: ethtool ioctl error on tap1686a1d1-ca: No such device Nov 28 04:59:48 localhost journal[227875]: ethtool ioctl error on tap1686a1d1-ca: No such device Nov 28 04:59:48 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 e94: 6 total, 6 up, 6 in Nov 28 04:59:48 localhost journal[227875]: ethtool ioctl error on tap1686a1d1-ca: No such device Nov 28 04:59:48 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e94: 6 total, 6 up, 6 in Nov 28 04:59:48 localhost journal[227875]: ethtool ioctl error on tap1686a1d1-ca: No such device Nov 28 04:59:48 localhost journal[227875]: ethtool ioctl error on tap1686a1d1-ca: No such device Nov 28 04:59:48 localhost journal[227875]: ethtool ioctl error on tap1686a1d1-ca: No such device Nov 28 04:59:48 localhost journal[227875]: ethtool ioctl error on tap1686a1d1-ca: No such device Nov 28 04:59:48 localhost nova_compute[279673]: 2025-11-28 09:59:48.196 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:48 localhost nova_compute[279673]: 2025-11-28 09:59:48.223 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:48 localhost podman[307307]: 2025-11-28 09:59:48.229418495 +0000 UTC m=+0.104057492 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:59:48 localhost podman[307306]: 2025-11-28 09:59:48.241288248 +0000 UTC m=+0.113628796 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Nov 28 04:59:48 localhost podman[307307]: 2025-11-28 09:59:48.267416179 +0000 UTC m=+0.142055106 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 04:59:48 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 04:59:48 localhost podman[307306]: 2025-11-28 09:59:48.299248626 +0000 UTC m=+0.171589163 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team) Nov 28 04:59:48 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. Nov 28 04:59:49 localhost podman[307420]: Nov 28 04:59:49 localhost podman[307420]: 2025-11-28 09:59:49.090106348 +0000 UTC m=+0.080117158 container create cdcdbea8cfce5c4adf286559acf5e95b26fc212b158f9ca5eedaa4d2e7ad0bdd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d25adc8-19ca-4816-87ea-2f93f610a253, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 04:59:49 localhost systemd[1]: Started libpod-conmon-cdcdbea8cfce5c4adf286559acf5e95b26fc212b158f9ca5eedaa4d2e7ad0bdd.scope. Nov 28 04:59:49 localhost systemd[1]: Started libcrun container. 
Nov 28 04:59:49 localhost podman[307420]: 2025-11-28 09:59:49.044887981 +0000 UTC m=+0.034898751 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 04:59:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d2c22b3950f04e53676908bff2fdb47d1d2f45894d34c3bc4830014816f6320d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 04:59:49 localhost podman[307420]: 2025-11-28 09:59:49.158056111 +0000 UTC m=+0.148066911 container init cdcdbea8cfce5c4adf286559acf5e95b26fc212b158f9ca5eedaa4d2e7ad0bdd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d25adc8-19ca-4816-87ea-2f93f610a253, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 04:59:49 localhost podman[307420]: 2025-11-28 09:59:49.166183691 +0000 UTC m=+0.156194451 container start cdcdbea8cfce5c4adf286559acf5e95b26fc212b158f9ca5eedaa4d2e7ad0bdd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5d25adc8-19ca-4816-87ea-2f93f610a253, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true) Nov 28 04:59:49 localhost dnsmasq[307439]: started, version 2.85 cachesize 150 Nov 28 04:59:49 localhost dnsmasq[307439]: DNS service limited to local subnets Nov 28 04:59:49 localhost dnsmasq[307439]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 04:59:49 localhost dnsmasq[307439]: warning: no upstream servers configured Nov 28 04:59:49 localhost dnsmasq-dhcp[307439]: DHCP, static leases only on 192.168.199.0, lease time 1d Nov 28 04:59:49 localhost dnsmasq[307439]: read /var/lib/neutron/dhcp/5d25adc8-19ca-4816-87ea-2f93f610a253/addn_hosts - 0 addresses Nov 28 04:59:49 localhost dnsmasq-dhcp[307439]: read /var/lib/neutron/dhcp/5d25adc8-19ca-4816-87ea-2f93f610a253/host Nov 28 04:59:49 localhost dnsmasq-dhcp[307439]: read /var/lib/neutron/dhcp/5d25adc8-19ca-4816-87ea-2f93f610a253/opts Nov 28 04:59:49 localhost nova_compute[279673]: 2025-11-28 09:59:49.605 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:49 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:59:49 localhost neutron_dhcp_agent[261080]: 2025-11-28 09:59:49.820 261084 INFO neutron.agent.dhcp.agent [None req-0eda4073-35cd-43fe-8c0c-1ef507f749f8 - - - - - -] DHCP configuration for ports {'fb2fd669-dbf0-4ffc-99a7-fb5973f387f6'} is completed#033[00m Nov 28 04:59:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:59:50.840 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 04:59:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:59:50.841 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 
04:59:50 localhost ovn_metadata_agent[158125]: 2025-11-28 09:59:50.842 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 04:59:51 localhost ovn_metadata_agent[158125]: 2025-11-28 09:59:51.002 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 04:59:51 localhost nova_compute[279673]: 2025-11-28 09:59:51.012 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 04:59:51 localhost nova_compute[279673]: 2025-11-28 09:59:51.397 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. 
Nov 28 04:59:51 localhost podman[307440]: 2025-11-28 09:59:51.851783836 +0000 UTC m=+0.081890343 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 28 04:59:51 localhost podman[307440]: 2025-11-28 09:59:51.866353602 +0000 UTC m=+0.096460109 container exec_died 
1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 04:59:51 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 04:59:54 localhost nova_compute[279673]: 2025-11-28 09:59:54.641 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:54 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 04:59:56 localhost nova_compute[279673]: 2025-11-28 09:59:56.435 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:59 localhost nova_compute[279673]: 2025-11-28 09:59:59.643 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 04:59:59 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:00:00 localhost ceph-mon[292954]: log_channel(cluster) log [INF] : overall HEALTH_OK Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.675 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 28 05:00:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.675 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.679 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4fcf2ef0-da97-4d6b-b949-7f5957166d51', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:00:00.675893', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': 
None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '00cc0ee8-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.847788181, 'message_signature': '708b3d23a8558ce9fd8c5a101dac69d363494e10f163bda1ca95326935519461'}]}, 'timestamp': '2025-11-28 10:00:00.680238', '_unique_id': 'b6957b2a1aa64c6eaabff00c46c5ccda'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 
ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:00:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 
05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.681 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.682 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.683 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.683 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'af622ff2-193e-46c0-83c6-d8dadeab421e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:00:00.683264', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '00cc9bb0-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.847788181, 'message_signature': '05abe9605d585255f12cf26c7ae1f04a5437258b0cc329a6764c34a3602d9808'}]}, 'timestamp': '2025-11-28 10:00:00.683770', '_unique_id': 'd7ffcff4df484ee280990dab24729c94'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:00:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:00:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.684 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.685 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.686 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '30af38aa-c6e7-48fe-a84a-5d4ed8e4322c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:00:00.686006', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '00cd07bc-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.847788181, 'message_signature': 'e5ed4158c320833ea36316b980798e71c1a855b5238c9ebe24be83298e918e5f'}]}, 'timestamp': '2025-11-28 10:00:00.686530', '_unique_id': '729e9e8119a6401d8885a6f0a681f1b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:00:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 
05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.687 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.688 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.715 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.715 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'bbbbd770-f7d3-4f9b-8a22-813d5ac15300', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:00:00.688768', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '00d17fea-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.860666345, 'message_signature': '1ad95c6468c92c467dc069736c58cebd413726da091f36ceb77aacdd4c3cb291'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:00:00.688768', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '00d193fe-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.860666345, 'message_signature': '53a2c4ed7ff22f2171934b94b1cab5e7bd2853be6e98d213dc41ff409a213e27'}]}, 'timestamp': '2025-11-28 10:00:00.716303', '_unique_id': '979b6a58f3f44e79b1ff41bfce8cb2ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:00:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:00:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.717 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.718 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.718 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 1124743927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.719 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 22218411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:00:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0957538-bcc1-4205-8aa0-927eccb35529', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1124743927, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:00:00.718559', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '00d1fda8-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.860666345, 'message_signature': 'cdab6bfdd1344717bf9b5804acdb98b49a5be1263f50890f74960426c7673929'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22218411, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:00:00.718559', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '00d21022-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.860666345, 'message_signature': 'b45025be4ccea91b6d87418a4f31dcc4b671dddafcb212bc1a541d418e867448'}]}, 'timestamp': '2025-11-28 10:00:00.719477', '_unique_id': 'dcf87a874ec942e9995f8b605a684668'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 
ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:00:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:00:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.720 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.721 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.721 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.722 12 DEBUG ceilometer.compute.pollsters [-] 
c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ff7dc53-8fb7-40a9-9943-5dcad6a6f135', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:00:00.721724', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '00d278f0-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.860666345, 'message_signature': '076ff3c2e16fff47d3c894003750fc2f0fb315d2c80eb0d69f127fffad5fa999'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 
'4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:00:00.721724', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '00d28b24-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.860666345, 'message_signature': '92c62f570e9062ef06607485fe0f65d89240e254b8300373a6b922d89069422f'}]}, 'timestamp': '2025-11-28 10:00:00.722625', '_unique_id': '9482be48f20142df80249cef056e2a10'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:00:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging 
self.transport._send_notification(target, ctxt, message, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:00:00.723 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR 
oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.723 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.724 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.724 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 
28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5edd321c-6f05-448e-b215-29cb1b0044b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:00:00.724849', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '00d2f4b0-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.847788181, 'message_signature': 'd98c1ece93ae85d17dbcfe61467586528b271c68d518d63a805d20680ea67ea5'}]}, 'timestamp': '2025-11-28 10:00:00.725362', '_unique_id': '11b832d685254a86b9c7ba724f752f42'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:00:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging conn = 
self.transport.establish_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: 
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:00:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 
05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.726 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.727 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.727 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4459693-10d3-4a1d-836f-5d474d0a48a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:00:00.727513', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '00d35b4e-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.847788181, 'message_signature': '5de45355469180aea3de458d6f8da0801f19bb53a9daab0319bbc6216c1ecfce'}]}, 'timestamp': '2025-11-28 10:00:00.727984', '_unique_id': 'c0e3c8a9331c450bb3620cb6bd326fd0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:00:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:00:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.728 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.730 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.740 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.740 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a5484bf-ebb0-48aa-bce1-0a71b49a28be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:00:00.730197', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '00d54bca-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.902092836, 'message_signature': '28734dd9774b5457a642c8e87c2494ee3c734056c629281aa3a51ae30728f39c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:00:00.730197', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '00d55692-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.902092836, 'message_signature': '54b73825b5903b05a9e6c3119aab07e7c24961fd57f344181d439439e0e24ee2'}]}, 'timestamp': '2025-11-28 10:00:00.740853', '_unique_id': '028ca4c2debc4a668895c8fd78dd7fa9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.741 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.742 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.742 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bcb4c919-8875-4de6-8669-f3fd30a3600b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:00:00.742256', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '00d597ec-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.847788181, 'message_signature': 'fc9a5bccfd5f782a1059cdcc14fcf2ac5dc014400ba7deccb35e689561a31148'}]}, 'timestamp': '2025-11-28 10:00:00.742545', '_unique_id': 'c5c98c11c9574253830436639f624659'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.743 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.760 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 51.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dfb4a4e0-c60f-4cc4-b9a2-5be14eab10c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.671875, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:00:00.743826', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '00d85de2-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.93222141, 'message_signature': '0e270cb9e1b677cb9b23501ead7041043e334f0137ba5a7275dac5f6643f413b'}]}, 'timestamp': '2025-11-28 10:00:00.760713', '_unique_id': '090eb78e0a7246d697f45470b28473ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py",
line 653, in _send Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:00:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.761 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.762 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.762 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1439344424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.762 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 63315481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '15273d9c-5284-49c8-a2c0-477cb228c576', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1439344424, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:00:00.762333', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '00d8a7f2-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.860666345, 'message_signature': '8f19f6dc0e41c8cd3067757600006ae252a26c97652b05421b34685a0c4d783e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63315481, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:00:00.762333', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '00d8b1e8-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.860666345, 'message_signature': 'b728341f1fbeb73599aabbf3b435f30b6fdd90e4a4150b7a2e4221245c31025c'}]}, 'timestamp': '2025-11-28 10:00:00.762849', '_unique_id': '7bdc8324860144f58489cda2495912dd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:00:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:00:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.763 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.764 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.764 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4cddb6e6-f009-4c9a-985c-584a9a38598c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:00:00.764204', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '00d8f126-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.847788181, 'message_signature': '9df5794e59309d7d5e9ae54fff7ca228df719f84f4bf88662048f218b843d716'}]}, 'timestamp': '2025-11-28 10:00:00.764486', '_unique_id': 'd5d94baea96041cb935fbba9635dab24'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:00:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.765 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.766 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.766 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.766 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.766 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b986547f-d25f-4f77-8b07-37d5abe9e0a0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:00:00.766310', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '00d946a8-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.902092836, 'message_signature': 'a77d918f7587cd0df00a79938165e6e2f9a1a1d2bff12dcb287de095c6e2e188'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:00:00.766310', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '00d95670-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.902092836, 'message_signature': 'b20cb7fce8b6dba1f1a85b53fd4fc721f1feb44162ef30be8a6fc8651ec66517'}]}, 'timestamp': '2025-11-28 10:00:00.767170', '_unique_id': 'f8d1380af4074493b9168f4437c97bc8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.768 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.769 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.769 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 15130000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ffcee6b2-04b8-4af2-9f1f-c4b4e4aa60fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15130000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:00:00.769336', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '00d9bcd2-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.93222141, 'message_signature': '370b3991f8fdcf573bb36b492ef3be1eec26609fc64cc38ba57270d122c332a3'}]}, 'timestamp': '2025-11-28 10:00:00.769777', '_unique_id': '58d4b92144fa411cb5d461ab909ad691'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.770 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.771 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.771 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45d1300a-a5db-4347-b64b-304e344d99e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:00:00.771806', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '00da1d76-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.847788181, 'message_signature': 'c6beceeb62c9d5c6d1d5909ceadad01bd075f14f3b7c949158ed1cddd741f77b'}]}, 'timestamp': '2025-11-28 10:00:00.772289', '_unique_id': 'd5b93b46000f441682835e0386df27d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py",
line 653, in _send Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:00:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.773 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.774 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.774 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'cf02dfc2-b545-42de-b9a6-66cbb4bc70ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:00:00.774346', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '00da804a-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.847788181, 'message_signature': '40bd15e643a7b8748428b3be916bb50c9f606f37b2d19a4f72c01752b7b47747'}]}, 'timestamp': '2025-11-28 10:00:00.774790', '_unique_id': '189d8bc186b14b1eb2c0746c6c7713ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:00:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:00:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.775 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.776 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.776 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b45daaf-e3a1-4330-9738-9b4e59b3c26e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:00:00.776892', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '00dae666-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.847788181, 'message_signature': '67409d38b69557410dba1f828d65f3c3549510fa5a3df2f39350b9a022de6546'}]}, 'timestamp': '2025-11-28 10:00:00.777405', '_unique_id': '17456ed79a3d4e98814153b4b6bfe5db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:00:00.778 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:00:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:00:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.778 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.779 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.779 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.780 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '341b4372-b4c1-47bb-aad7-1b5d14e16e5f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:00:00.779664', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '00db5128-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.902092836, 'message_signature': '39ff792a604c6202ad12f5b31e887655572dfe637dea049a386cb92f4302bb7a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:00:00.779664', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 
'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '00db6348-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.902092836, 'message_signature': '2e878b2b527c5396637327ee22547c0f847ad23018c46a97af7f8afa69e5bc10'}]}, 'timestamp': '2025-11-28 10:00:00.780576', '_unique_id': '4f2eb4cb9585482b93b6341cd9e006e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:00:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:00:00.781 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:00:00 localhost ceph-mon[292954]: overall HEALTH_OK Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.781 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.782 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.782 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.783 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 2154496 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3275c8c5-58c8-4e64-b1ab-9e2f06161c2b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:00:00.782719', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '00dbc7fc-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.860666345, 'message_signature': '0c2fea60a4ca41b17b7e62e7f9a21fec10a664ac0b9a885370f6cdb25cdb611b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': 
'9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:00:00.782719', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '00dbdaa8-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.860666345, 'message_signature': '798d7fcf033419e489e17666414e7707f6962a8e749736ba873007f94a622b8e'}]}, 'timestamp': '2025-11-28 10:00:00.783631', '_unique_id': 'd3429e20496f40e68713897bddcc9ec0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:00:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:00:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:00:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.784 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.785 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.785 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 05:00:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.785 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.786 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.786 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c86b5669-f002-4b7c-a039-649199b4359a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:00:00.786068', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '00dc4b00-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.860666345, 'message_signature': 'a12d5b7eb0a9d011d1c0ed158139a239e759326ce79eeacf6cc344765d3cba12'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:00:00.786068', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '00dc5ad2-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11834.860666345, 'message_signature': 'c4294288779ed8822dbf4e69364855c691b1e6b50f78c36be591c1fa6a0a204b'}]}, 'timestamp': '2025-11-28 10:00:00.786910', '_unique_id': 'ccdb52d9ce8443a5b78b87bc4f3b64e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:00:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:00:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:00:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:00:00.787 12 ERROR oslo_messaging.notify.messaging Nov 28 05:00:01 localhost nova_compute[279673]: 2025-11-28 10:00:01.470 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 05:00:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 05:00:01 localhost systemd[1]: tmp-crun.48VnJ6.mount: Deactivated successfully. Nov 28 05:00:01 localhost podman[307462]: 2025-11-28 10:00:01.875246819 +0000 UTC m=+0.104908729 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': 
['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 05:00:01 localhost podman[307462]: 2025-11-28 10:00:01.956404117 +0000 UTC m=+0.186066047 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 28 05:00:01 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 05:00:01 localhost podman[307463]: 2025-11-28 10:00:01.92878322 +0000 UTC m=+0.155120368 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, managed_by=edpm_ansible) Nov 28 05:00:02 localhost podman[307463]: 2025-11-28 10:00:02.012545089 +0000 UTC m=+0.238882237 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.build-date=20251125) Nov 28 05:00:02 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. Nov 28 05:00:02 localhost systemd[1]: tmp-crun.MC0ZQS.mount: Deactivated successfully. 
Nov 28 05:00:04 localhost nova_compute[279673]: 2025-11-28 10:00:04.670 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:04 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:00:06 localhost nova_compute[279673]: 2025-11-28 10:00:06.501 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:09 localhost nova_compute[279673]: 2025-11-28 10:00:09.674 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:09 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:00:10 localhost podman[238687]: time="2025-11-28T10:00:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:00:10 localhost podman[238687]: @ - - [28/Nov/2025:10:00:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1" Nov 28 05:00:10 localhost podman[238687]: @ - - [28/Nov/2025:10:00:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19225 "" "Go-http-client/1.1" Nov 28 05:00:11 localhost nova_compute[279673]: 2025-11-28 10:00:11.532 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. 
Nov 28 05:00:11 localhost podman[307505]: 2025-11-28 10:00:11.834933374 +0000 UTC m=+0.069849723 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, managed_by=edpm_ansible, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9) Nov 28 05:00:11 localhost podman[307505]: 2025-11-28 10:00:11.849335696 +0000 UTC m=+0.084252015 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, maintainer=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Nov 28 05:00:11 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 05:00:14 localhost nova_compute[279673]: 2025-11-28 10:00:14.718 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:14 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0. Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.793650) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37 Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324014793680, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 783, "num_deletes": 256, "total_data_size": 828956, "memory_usage": 844576, "flush_reason": "Manual Compaction"} Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324014799442, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 816319, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 22883, "largest_seqno": 23665, "table_properties": {"data_size": 812648, "index_size": 1462, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 8362, "raw_average_key_size": 18, 
"raw_value_size": 805113, "raw_average_value_size": 1821, "num_data_blocks": 65, "num_entries": 442, "num_filter_entries": 442, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323963, "oldest_key_time": 1764323963, "file_creation_time": 1764324014, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}} Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 5841 microseconds, and 2280 cpu microseconds. Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.799490) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 816319 bytes OK Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.799507) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.802496) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.802512) EVENT_LOG_v1 {"time_micros": 1764324014802507, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.802530) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 825031, prev total WAL file size 825355, number of live WAL files 2. Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.803907) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033373730' seq:72057594037927935, type:22 .. 
'6C6F676D0034303232' seq:0, type:0; will stop at (end) Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(797KB)], [36(17MB)] Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324014803962, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 19554091, "oldest_snapshot_seqno": -1} Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 12064 keys, 19455161 bytes, temperature: kUnknown Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324014897737, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 19455161, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19383514, "index_size": 40378, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30213, "raw_key_size": 323368, "raw_average_key_size": 26, "raw_value_size": 19175323, "raw_average_value_size": 1589, "num_data_blocks": 1549, "num_entries": 12064, "num_filter_entries": 12064, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764324014, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}} Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.898187) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 19455161 bytes Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.900220) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 208.2 rd, 207.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 17.9 +0.0 blob) out(18.6 +0.0 blob), read-write-amplify(47.8) write-amplify(23.8) OK, records in: 12596, records dropped: 532 output_compression: NoCompression Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.900253) EVENT_LOG_v1 {"time_micros": 1764324014900237, "job": 20, "event": "compaction_finished", "compaction_time_micros": 93920, "compaction_time_cpu_micros": 37695, "output_level": 6, "num_output_files": 1, "total_output_size": 19455161, "num_input_records": 12596, "num_output_records": 12064, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005538513/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324014900593, "job": 20, "event": "table_file_deletion", "file_number": 38} Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324014903431, "job": 20, "event": "table_file_deletion", "file_number": 36} Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.803853) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.903691) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.903703) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.903707) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.903712) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:00:14 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:00:14.903717) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:00:14 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Nov 28 05:00:14 localhost 
ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:00:14 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 05:00:14 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:00:15 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Nov 28 05:00:15 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:00:15 localhost nova_compute[279673]: 2025-11-28 10:00:15.910 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:15 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:00:16 localhost nova_compute[279673]: 2025-11-28 10:00:16.533 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. 
Nov 28 05:00:16 localhost podman[307610]: 2025-11-28 10:00:16.852407088 +0000 UTC m=+0.084648406 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 05:00:16 localhost podman[307610]: 2025-11-28 10:00:16.866450029 +0000 UTC m=+0.098691387 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 05:00:16 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 05:00:18 localhost openstack_network_exporter[240658]: ERROR 10:00:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:00:18 localhost openstack_network_exporter[240658]: ERROR 10:00:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:00:18 localhost openstack_network_exporter[240658]: ERROR 10:00:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 05:00:18 localhost openstack_network_exporter[240658]: ERROR 10:00:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 05:00:18 localhost openstack_network_exporter[240658]: Nov 28 05:00:18 localhost openstack_network_exporter[240658]: ERROR 10:00:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 05:00:18 localhost openstack_network_exporter[240658]: Nov 28 05:00:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 05:00:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. 
Nov 28 05:00:18 localhost podman[307634]: 2025-11-28 10:00:18.856328039 +0000 UTC m=+0.087387650 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Nov 28 05:00:18 localhost podman[307634]: 2025-11-28 10:00:18.866724378 +0000 UTC 
m=+0.097783959 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Nov 28 05:00:18 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 05:00:18 localhost systemd[1]: tmp-crun.TCxsar.mount: Deactivated successfully. Nov 28 05:00:18 localhost podman[307633]: 2025-11-28 10:00:18.959292547 +0000 UTC m=+0.193576087 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125) Nov 28 05:00:19 localhost podman[307633]: 2025-11-28 10:00:19.021009979 +0000 UTC m=+0.255293489 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS 
Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3) Nov 28 05:00:19 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 05:00:19 localhost nova_compute[279673]: 2025-11-28 10:00:19.471 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:19 localhost nova_compute[279673]: 2025-11-28 10:00:19.722 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:19 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:00:21 localhost nova_compute[279673]: 2025-11-28 10:00:21.563 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 05:00:22 localhost systemd[299031]: Created slice User Background Tasks Slice. 
Nov 28 05:00:22 localhost podman[307676]: 2025-11-28 10:00:22.84871229 +0000 UTC m=+0.086729261 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 28 05:00:22 localhost systemd[299031]: Starting Cleanup of User's Temporary Files and Directories... 
Nov 28 05:00:22 localhost systemd[299031]: Finished Cleanup of User's Temporary Files and Directories. Nov 28 05:00:22 localhost podman[307676]: 2025-11-28 10:00:22.915750024 +0000 UTC m=+0.153766945 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 28 05:00:22 localhost systemd[1]: 
1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. Nov 28 05:00:22 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:00:22.950 261084 INFO neutron.agent.linux.ip_lib [None req-6cf535a6-0ca9-496a-b979-7b405dfe1f6c - - - - - -] Device tap2b1e8904-1c cannot be used as it has no MAC address#033[00m Nov 28 05:00:22 localhost nova_compute[279673]: 2025-11-28 10:00:22.970 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:22 localhost kernel: device tap2b1e8904-1c entered promiscuous mode Nov 28 05:00:22 localhost ovn_controller[152322]: 2025-11-28T10:00:22Z|00075|binding|INFO|Claiming lport 2b1e8904-1c88-4828-a7bc-9f34a2930819 for this chassis. Nov 28 05:00:22 localhost ovn_controller[152322]: 2025-11-28T10:00:22Z|00076|binding|INFO|2b1e8904-1c88-4828-a7bc-9f34a2930819: Claiming unknown Nov 28 05:00:22 localhost nova_compute[279673]: 2025-11-28 10:00:22.978 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:22 localhost NetworkManager[5967]: [1764324022.9823] manager: (tap2b1e8904-1c): new Generic device (/org/freedesktop/NetworkManager/Devices/18) Nov 28 05:00:22 localhost ovn_metadata_agent[158125]: 2025-11-28 10:00:22.990 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-0303a35a-aae2-4e58-b0e5-9091112c9857', 'neutron:device_owner': 
'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0303a35a-aae2-4e58-b0e5-9091112c9857', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2159235bf1c5407eac7a3e3826561913', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=20ff3eab-119a-4740-918d-4005c52a4e27, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=2b1e8904-1c88-4828-a7bc-9f34a2930819) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:00:22 localhost ovn_metadata_agent[158125]: 2025-11-28 10:00:22.992 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 2b1e8904-1c88-4828-a7bc-9f34a2930819 in datapath 0303a35a-aae2-4e58-b0e5-9091112c9857 bound to our chassis#033[00m Nov 28 05:00:22 localhost ovn_metadata_agent[158125]: 2025-11-28 10:00:22.995 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 92b0a1b7-94a7-4946-a022-10c4e44505bf IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 28 05:00:22 localhost ovn_metadata_agent[158125]: 2025-11-28 10:00:22.995 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0303a35a-aae2-4e58-b0e5-9091112c9857, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:00:22 localhost ovn_metadata_agent[158125]: 2025-11-28 10:00:22.997 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[4afd2102-e1c4-4737-980d-d89b8a978b80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:00:23 localhost 
ovn_controller[152322]: 2025-11-28T10:00:23Z|00077|binding|INFO|Setting lport 2b1e8904-1c88-4828-a7bc-9f34a2930819 ovn-installed in OVS Nov 28 05:00:23 localhost ovn_controller[152322]: 2025-11-28T10:00:23Z|00078|binding|INFO|Setting lport 2b1e8904-1c88-4828-a7bc-9f34a2930819 up in Southbound Nov 28 05:00:23 localhost nova_compute[279673]: 2025-11-28 10:00:23.019 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:23 localhost nova_compute[279673]: 2025-11-28 10:00:23.042 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:23 localhost nova_compute[279673]: 2025-11-28 10:00:23.064 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:23 localhost nova_compute[279673]: 2025-11-28 10:00:23.507 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:23 localhost podman[307761]: Nov 28 05:00:23 localhost podman[307761]: 2025-11-28 10:00:23.865182249 +0000 UTC m=+0.091772594 container create c46654d3d04fcd252763426cc06e1231bb8336d92b2f305e57fbef13140f2d5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0303a35a-aae2-4e58-b0e5-9091112c9857, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:00:23 localhost systemd[1]: Started libpod-conmon-c46654d3d04fcd252763426cc06e1231bb8336d92b2f305e57fbef13140f2d5b.scope. 
Nov 28 05:00:23 localhost systemd[1]: tmp-crun.06UuwW.mount: Deactivated successfully. Nov 28 05:00:23 localhost podman[307761]: 2025-11-28 10:00:23.818683544 +0000 UTC m=+0.045273939 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:00:23 localhost systemd[1]: Started libcrun container. Nov 28 05:00:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/776ee30c8f4b1d3b8a5504203661447b2ee50a3f6f2aafa660ced48066eed543/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:00:23 localhost podman[307761]: 2025-11-28 10:00:23.95715074 +0000 UTC m=+0.183741085 container init c46654d3d04fcd252763426cc06e1231bb8336d92b2f305e57fbef13140f2d5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0303a35a-aae2-4e58-b0e5-9091112c9857, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS) Nov 28 05:00:23 localhost podman[307761]: 2025-11-28 10:00:23.966447545 +0000 UTC m=+0.193037900 container start c46654d3d04fcd252763426cc06e1231bb8336d92b2f305e57fbef13140f2d5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0303a35a-aae2-4e58-b0e5-9091112c9857, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Nov 28 05:00:23 localhost dnsmasq[307779]: started, version 2.85 cachesize 150 Nov 28 05:00:23 
localhost dnsmasq[307779]: DNS service limited to local subnets Nov 28 05:00:23 localhost dnsmasq[307779]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:00:23 localhost dnsmasq[307779]: warning: no upstream servers configured Nov 28 05:00:23 localhost dnsmasq-dhcp[307779]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 28 05:00:23 localhost dnsmasq[307779]: read /var/lib/neutron/dhcp/0303a35a-aae2-4e58-b0e5-9091112c9857/addn_hosts - 0 addresses Nov 28 05:00:23 localhost dnsmasq-dhcp[307779]: read /var/lib/neutron/dhcp/0303a35a-aae2-4e58-b0e5-9091112c9857/host Nov 28 05:00:23 localhost dnsmasq-dhcp[307779]: read /var/lib/neutron/dhcp/0303a35a-aae2-4e58-b0e5-9091112c9857/opts Nov 28 05:00:24 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:00:24.167 261084 INFO neutron.agent.dhcp.agent [None req-ea06895c-1efe-4171-8a6c-fc71b43405d2 - - - - - -] DHCP configuration for ports {'37eaaacf-ed90-43ad-bf8b-bb907591b4ec'} is completed#033[00m Nov 28 05:00:24 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:00:24.343 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:00:23Z, description=, device_id=76efbe69-508f-4a6c-bc6c-575aca933da7, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=8f481bfb-0f63-4459-a61b-a544bb537944, ip_allocation=immediate, mac_address=fa:16:3e:a4:82:17, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:00:20Z, description=, dns_domain=, id=0303a35a-aae2-4e58-b0e5-9091112c9857, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, 
name=tempest-LiveAutoBlockMigrationV225Test-667497220-network, port_security_enabled=True, project_id=2159235bf1c5407eac7a3e3826561913, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42725, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=176, status=ACTIVE, subnets=['ce2bee42-bca3-4580-8785-d86292bed448'], tags=[], tenant_id=2159235bf1c5407eac7a3e3826561913, updated_at=2025-11-28T10:00:21Z, vlan_transparent=None, network_id=0303a35a-aae2-4e58-b0e5-9091112c9857, port_security_enabled=False, project_id=2159235bf1c5407eac7a3e3826561913, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=210, status=DOWN, tags=[], tenant_id=2159235bf1c5407eac7a3e3826561913, updated_at=2025-11-28T10:00:24Z on network 0303a35a-aae2-4e58-b0e5-9091112c9857#033[00m Nov 28 05:00:24 localhost dnsmasq[307779]: read /var/lib/neutron/dhcp/0303a35a-aae2-4e58-b0e5-9091112c9857/addn_hosts - 1 addresses Nov 28 05:00:24 localhost dnsmasq-dhcp[307779]: read /var/lib/neutron/dhcp/0303a35a-aae2-4e58-b0e5-9091112c9857/host Nov 28 05:00:24 localhost dnsmasq-dhcp[307779]: read /var/lib/neutron/dhcp/0303a35a-aae2-4e58-b0e5-9091112c9857/opts Nov 28 05:00:24 localhost podman[307797]: 2025-11-28 10:00:24.550814044 +0000 UTC m=+0.059017601 container kill c46654d3d04fcd252763426cc06e1231bb8336d92b2f305e57fbef13140f2d5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0303a35a-aae2-4e58-b0e5-9091112c9857, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 28 05:00:24 localhost neutron_dhcp_agent[261080]: 
2025-11-28 10:00:24.740 261084 INFO neutron.agent.dhcp.agent [None req-d61f26ab-2d2b-40f2-a922-2384a36589e2 - - - - - -] DHCP configuration for ports {'8f481bfb-0f63-4459-a61b-a544bb537944'} is completed#033[00m Nov 28 05:00:24 localhost nova_compute[279673]: 2025-11-28 10:00:24.745 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:24 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:00:25 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:00:25.293 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:00:23Z, description=, device_id=76efbe69-508f-4a6c-bc6c-575aca933da7, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=8f481bfb-0f63-4459-a61b-a544bb537944, ip_allocation=immediate, mac_address=fa:16:3e:a4:82:17, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:00:20Z, description=, dns_domain=, id=0303a35a-aae2-4e58-b0e5-9091112c9857, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveAutoBlockMigrationV225Test-667497220-network, port_security_enabled=True, project_id=2159235bf1c5407eac7a3e3826561913, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42725, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=176, status=ACTIVE, subnets=['ce2bee42-bca3-4580-8785-d86292bed448'], tags=[], tenant_id=2159235bf1c5407eac7a3e3826561913, updated_at=2025-11-28T10:00:21Z, vlan_transparent=None, 
network_id=0303a35a-aae2-4e58-b0e5-9091112c9857, port_security_enabled=False, project_id=2159235bf1c5407eac7a3e3826561913, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=210, status=DOWN, tags=[], tenant_id=2159235bf1c5407eac7a3e3826561913, updated_at=2025-11-28T10:00:24Z on network 0303a35a-aae2-4e58-b0e5-9091112c9857#033[00m Nov 28 05:00:25 localhost dnsmasq[307779]: read /var/lib/neutron/dhcp/0303a35a-aae2-4e58-b0e5-9091112c9857/addn_hosts - 1 addresses Nov 28 05:00:25 localhost podman[307836]: 2025-11-28 10:00:25.552313826 +0000 UTC m=+0.061887149 container kill c46654d3d04fcd252763426cc06e1231bb8336d92b2f305e57fbef13140f2d5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0303a35a-aae2-4e58-b0e5-9091112c9857, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:00:25 localhost dnsmasq-dhcp[307779]: read /var/lib/neutron/dhcp/0303a35a-aae2-4e58-b0e5-9091112c9857/host Nov 28 05:00:25 localhost dnsmasq-dhcp[307779]: read /var/lib/neutron/dhcp/0303a35a-aae2-4e58-b0e5-9091112c9857/opts Nov 28 05:00:25 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:00:25.701 261084 INFO neutron.agent.linux.ip_lib [None req-d599649c-2a5b-4443-a91a-077aaa3d06fe - - - - - -] Device tap7ac539be-36 cannot be used as it has no MAC address#033[00m Nov 28 05:00:25 localhost nova_compute[279673]: 2025-11-28 10:00:25.729 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:25 localhost kernel: device tap7ac539be-36 entered promiscuous mode Nov 28 05:00:25 
localhost NetworkManager[5967]: [1764324025.7352] manager: (tap7ac539be-36): new Generic device (/org/freedesktop/NetworkManager/Devices/19) Nov 28 05:00:25 localhost systemd-udevd[307711]: Network interface NamePolicy= disabled on kernel command line. Nov 28 05:00:25 localhost ovn_controller[152322]: 2025-11-28T10:00:25Z|00079|binding|INFO|Claiming lport 7ac539be-3605-4c5e-bb0c-6fbaaf95259b for this chassis. Nov 28 05:00:25 localhost ovn_controller[152322]: 2025-11-28T10:00:25Z|00080|binding|INFO|7ac539be-3605-4c5e-bb0c-6fbaaf95259b: Claiming unknown Nov 28 05:00:25 localhost nova_compute[279673]: 2025-11-28 10:00:25.742 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:00:25.748 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-eaf25156-4f94-45e1-8ecf-348de157355a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eaf25156-4f94-45e1-8ecf-348de157355a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e759105895c542a0bccbe08b81ef5fde', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d8cbd739-26de-4c8a-898b-6e16313588b2, chassis=[], tunnel_key=2, 
gateway_chassis=[], requested_chassis=[], logical_port=7ac539be-3605-4c5e-bb0c-6fbaaf95259b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:00:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:00:25.753 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 7ac539be-3605-4c5e-bb0c-6fbaaf95259b in datapath eaf25156-4f94-45e1-8ecf-348de157355a bound to our chassis#033[00m Nov 28 05:00:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:00:25.754 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network eaf25156-4f94-45e1-8ecf-348de157355a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:00:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:00:25.755 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[04757c39-4a8b-47f4-8067-24715abc0c96]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:00:25 localhost ovn_controller[152322]: 2025-11-28T10:00:25Z|00081|binding|INFO|Setting lport 7ac539be-3605-4c5e-bb0c-6fbaaf95259b ovn-installed in OVS Nov 28 05:00:25 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:00:25.778 261084 INFO neutron.agent.dhcp.agent [None req-03019dd2-66de-4dbb-a795-65dc3cf488ff - - - - - -] DHCP configuration for ports {'8f481bfb-0f63-4459-a61b-a544bb537944'} is completed#033[00m Nov 28 05:00:25 localhost ovn_controller[152322]: 2025-11-28T10:00:25Z|00082|binding|INFO|Setting lport 7ac539be-3605-4c5e-bb0c-6fbaaf95259b up in Southbound Nov 28 05:00:25 localhost nova_compute[279673]: 2025-11-28 10:00:25.818 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:25 localhost nova_compute[279673]: 2025-11-28 10:00:25.837 279685 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:25 localhost nova_compute[279673]: 2025-11-28 10:00:25.860 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:26 localhost nova_compute[279673]: 2025-11-28 10:00:26.565 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:26 localhost podman[307923]: Nov 28 05:00:26 localhost podman[307923]: 2025-11-28 10:00:26.936228134 +0000 UTC m=+0.093126867 container create 660b7142be459deaae037bbef89739d51791c4171c3ba2cdada588eae4d46481 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eaf25156-4f94-45e1-8ecf-348de157355a, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) Nov 28 05:00:26 localhost systemd[1]: Started libpod-conmon-660b7142be459deaae037bbef89739d51791c4171c3ba2cdada588eae4d46481.scope. Nov 28 05:00:26 localhost systemd[1]: Started libcrun container. 
Nov 28 05:00:26 localhost podman[307923]: 2025-11-28 10:00:26.89043779 +0000 UTC m=+0.047336563 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:00:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2df9b00178a60559ff4adec6a755d20e525233a80e04a1bc351385bca38e5adf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:00:27 localhost podman[307923]: 2025-11-28 10:00:27.002461215 +0000 UTC m=+0.159359908 container init 660b7142be459deaae037bbef89739d51791c4171c3ba2cdada588eae4d46481 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eaf25156-4f94-45e1-8ecf-348de157355a, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Nov 28 05:00:27 localhost podman[307923]: 2025-11-28 10:00:27.008692987 +0000 UTC m=+0.165591680 container start 660b7142be459deaae037bbef89739d51791c4171c3ba2cdada588eae4d46481 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eaf25156-4f94-45e1-8ecf-348de157355a, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125) Nov 28 05:00:27 localhost dnsmasq[307941]: started, version 2.85 cachesize 150 Nov 28 05:00:27 localhost dnsmasq[307941]: DNS service limited to local subnets Nov 28 05:00:27 localhost dnsmasq[307941]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:00:27 localhost dnsmasq[307941]: warning: no upstream servers configured Nov 28 05:00:27 localhost dnsmasq-dhcp[307941]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 28 05:00:27 localhost dnsmasq[307941]: read /var/lib/neutron/dhcp/eaf25156-4f94-45e1-8ecf-348de157355a/addn_hosts - 0 addresses Nov 28 05:00:27 localhost dnsmasq-dhcp[307941]: read /var/lib/neutron/dhcp/eaf25156-4f94-45e1-8ecf-348de157355a/host Nov 28 05:00:27 localhost dnsmasq-dhcp[307941]: read /var/lib/neutron/dhcp/eaf25156-4f94-45e1-8ecf-348de157355a/opts Nov 28 05:00:27 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:00:27.129 261084 INFO neutron.agent.dhcp.agent [None req-b7f67ea2-1058-43a1-bf19-9b6b0df4439f - - - - - -] DHCP configuration for ports {'2dbf24aa-19c5-4e6e-b9d9-1bcffbe746db'} is completed#033[00m Nov 28 05:00:28 localhost neutron_sriov_agent[254147]: 2025-11-28 10:00:28.457 2 INFO neutron.agent.securitygroups_rpc [None req-cdabab7f-6f0b-439c-ae1d-dd4a3208cd11 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Security group member updated ['dc0a6e12-205a-4d7d-adb2-6545f08f7990']#033[00m Nov 28 05:00:28 localhost nova_compute[279673]: 2025-11-28 10:00:28.896 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:29 localhost nova_compute[279673]: 2025-11-28 10:00:29.791 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:29 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:00:29 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:00:29.876 261084 INFO neutron.agent.dhcp.agent [-] Trigger 
reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:00:29Z, description=, device_id=cd2a6086-9326-4cdd-a015-4768e2092068, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=02232087-1968-4362-9c1a-bb2acd27c4fd, ip_allocation=immediate, mac_address=fa:16:3e:1e:27:3b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:00:23Z, description=, dns_domain=, id=eaf25156-4f94-45e1-8ecf-348de157355a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ImagesNegativeTestJSON-1382899154-network, port_security_enabled=True, project_id=e759105895c542a0bccbe08b81ef5fde, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=30925, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=209, status=ACTIVE, subnets=['832963b0-c873-40fe-a644-0392c51c7abc'], tags=[], tenant_id=e759105895c542a0bccbe08b81ef5fde, updated_at=2025-11-28T10:00:24Z, vlan_transparent=None, network_id=eaf25156-4f94-45e1-8ecf-348de157355a, port_security_enabled=False, project_id=e759105895c542a0bccbe08b81ef5fde, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=266, status=DOWN, tags=[], tenant_id=e759105895c542a0bccbe08b81ef5fde, updated_at=2025-11-28T10:00:29Z on network eaf25156-4f94-45e1-8ecf-348de157355a#033[00m Nov 28 05:00:30 localhost systemd[1]: tmp-crun.5zwxIR.mount: Deactivated successfully. 
Nov 28 05:00:30 localhost dnsmasq[307941]: read /var/lib/neutron/dhcp/eaf25156-4f94-45e1-8ecf-348de157355a/addn_hosts - 1 addresses Nov 28 05:00:30 localhost podman[307959]: 2025-11-28 10:00:30.081305568 +0000 UTC m=+0.073540346 container kill 660b7142be459deaae037bbef89739d51791c4171c3ba2cdada588eae4d46481 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eaf25156-4f94-45e1-8ecf-348de157355a, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Nov 28 05:00:30 localhost dnsmasq-dhcp[307941]: read /var/lib/neutron/dhcp/eaf25156-4f94-45e1-8ecf-348de157355a/host Nov 28 05:00:30 localhost dnsmasq-dhcp[307941]: read /var/lib/neutron/dhcp/eaf25156-4f94-45e1-8ecf-348de157355a/opts Nov 28 05:00:30 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:00:30.291 261084 INFO neutron.agent.dhcp.agent [None req-234ca4e1-bb35-4f86-be5b-19c875954a0b - - - - - -] DHCP configuration for ports {'02232087-1968-4362-9c1a-bb2acd27c4fd'} is completed#033[00m Nov 28 05:00:31 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:00:31.434 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:00:29Z, description=, device_id=cd2a6086-9326-4cdd-a015-4768e2092068, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=02232087-1968-4362-9c1a-bb2acd27c4fd, ip_allocation=immediate, mac_address=fa:16:3e:1e:27:3b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], 
created_at=2025-11-28T10:00:23Z, description=, dns_domain=, id=eaf25156-4f94-45e1-8ecf-348de157355a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ImagesNegativeTestJSON-1382899154-network, port_security_enabled=True, project_id=e759105895c542a0bccbe08b81ef5fde, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=30925, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=209, status=ACTIVE, subnets=['832963b0-c873-40fe-a644-0392c51c7abc'], tags=[], tenant_id=e759105895c542a0bccbe08b81ef5fde, updated_at=2025-11-28T10:00:24Z, vlan_transparent=None, network_id=eaf25156-4f94-45e1-8ecf-348de157355a, port_security_enabled=False, project_id=e759105895c542a0bccbe08b81ef5fde, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=266, status=DOWN, tags=[], tenant_id=e759105895c542a0bccbe08b81ef5fde, updated_at=2025-11-28T10:00:29Z on network eaf25156-4f94-45e1-8ecf-348de157355a#033[00m Nov 28 05:00:31 localhost nova_compute[279673]: 2025-11-28 10:00:31.566 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:31 localhost systemd[1]: tmp-crun.xcSOIe.mount: Deactivated successfully. 
Nov 28 05:00:31 localhost dnsmasq[307941]: read /var/lib/neutron/dhcp/eaf25156-4f94-45e1-8ecf-348de157355a/addn_hosts - 1 addresses Nov 28 05:00:31 localhost dnsmasq-dhcp[307941]: read /var/lib/neutron/dhcp/eaf25156-4f94-45e1-8ecf-348de157355a/host Nov 28 05:00:31 localhost dnsmasq-dhcp[307941]: read /var/lib/neutron/dhcp/eaf25156-4f94-45e1-8ecf-348de157355a/opts Nov 28 05:00:31 localhost podman[307997]: 2025-11-28 10:00:31.676661541 +0000 UTC m=+0.067476650 container kill 660b7142be459deaae037bbef89739d51791c4171c3ba2cdada588eae4d46481 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eaf25156-4f94-45e1-8ecf-348de157355a, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Nov 28 05:00:31 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:00:31.947 261084 INFO neutron.agent.dhcp.agent [None req-d9267bb8-5b10-4d33-a5f9-fc4f94d88822 - - - - - -] DHCP configuration for ports {'02232087-1968-4362-9c1a-bb2acd27c4fd'} is completed#033[00m Nov 28 05:00:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 05:00:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. 
Nov 28 05:00:32 localhost nova_compute[279673]: 2025-11-28 10:00:32.768 279685 DEBUG oslo_concurrency.processutils [None req-52762099-22fc-4981-b6d6-4df819853c4e 8ea6e2aec9474e6594c08987b8c79204 ea61a5236ad2407485482a6f7462d550 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:00:32 localhost nova_compute[279673]: 2025-11-28 10:00:32.790 279685 DEBUG oslo_concurrency.processutils [None req-52762099-22fc-4981-b6d6-4df819853c4e 8ea6e2aec9474e6594c08987b8c79204 ea61a5236ad2407485482a6f7462d550 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:00:32 localhost systemd[1]: tmp-crun.7r4fWd.mount: Deactivated successfully. Nov 28 05:00:32 localhost podman[308018]: 2025-11-28 10:00:32.870958284 +0000 UTC m=+0.103367111 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:00:32 localhost neutron_sriov_agent[254147]: 2025-11-28 10:00:32.894 2 INFO neutron.agent.securitygroups_rpc [None req-d73a2eae-adce-4d72-91ae-34d59db28d8a c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Security group member updated ['dc0a6e12-205a-4d7d-adb2-6545f08f7990']#033[00m Nov 28 05:00:32 localhost podman[308017]: 2025-11-28 10:00:32.907618738 +0000 UTC m=+0.141743007 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', 
'--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 05:00:32 localhost podman[308017]: 2025-11-28 10:00:32.92138581 +0000 UTC m=+0.155510109 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, 
maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 05:00:32 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. Nov 28 05:00:32 localhost podman[308018]: 2025-11-28 10:00:32.937220306 +0000 UTC m=+0.169629193 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0) Nov 28 05:00:32 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. Nov 28 05:00:33 localhost nova_compute[279673]: 2025-11-28 10:00:33.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:00:33 localhost nova_compute[279673]: 2025-11-28 10:00:33.770 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Nov 28 05:00:33 localhost nova_compute[279673]: 2025-11-28 10:00:33.789 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Nov 28 05:00:34 localhost podman[308076]: 2025-11-28 10:00:34.603358189 +0000 UTC m=+0.121603361 container kill 660b7142be459deaae037bbef89739d51791c4171c3ba2cdada588eae4d46481 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eaf25156-4f94-45e1-8ecf-348de157355a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125) Nov 28 05:00:34 localhost systemd[1]: tmp-crun.5kjoZO.mount: Deactivated successfully. 
Nov 28 05:00:34 localhost dnsmasq[307941]: read /var/lib/neutron/dhcp/eaf25156-4f94-45e1-8ecf-348de157355a/addn_hosts - 0 addresses Nov 28 05:00:34 localhost dnsmasq-dhcp[307941]: read /var/lib/neutron/dhcp/eaf25156-4f94-45e1-8ecf-348de157355a/host Nov 28 05:00:34 localhost dnsmasq-dhcp[307941]: read /var/lib/neutron/dhcp/eaf25156-4f94-45e1-8ecf-348de157355a/opts Nov 28 05:00:34 localhost nova_compute[279673]: 2025-11-28 10:00:34.789 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:00:34 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:00:34 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:00:34.798 261084 INFO neutron.agent.linux.ip_lib [None req-0f46b05d-c1aa-41b7-8cf8-ed5d8b68a147 - - - - - -] Device tap74cdc895-82 cannot be used as it has no MAC address#033[00m Nov 28 05:00:34 localhost nova_compute[279673]: 2025-11-28 10:00:34.822 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:34 localhost kernel: device tap7ac539be-36 left promiscuous mode Nov 28 05:00:34 localhost ovn_controller[152322]: 2025-11-28T10:00:34Z|00083|binding|INFO|Releasing lport 7ac539be-3605-4c5e-bb0c-6fbaaf95259b from this chassis (sb_readonly=0) Nov 28 05:00:34 localhost ovn_controller[152322]: 2025-11-28T10:00:34Z|00084|binding|INFO|Setting lport 7ac539be-3605-4c5e-bb0c-6fbaaf95259b down in Southbound Nov 28 05:00:34 localhost ovn_metadata_agent[158125]: 2025-11-28 10:00:34.833 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), 
priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-eaf25156-4f94-45e1-8ecf-348de157355a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eaf25156-4f94-45e1-8ecf-348de157355a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e759105895c542a0bccbe08b81ef5fde', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d8cbd739-26de-4c8a-898b-6e16313588b2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7ac539be-3605-4c5e-bb0c-6fbaaf95259b) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:00:34 localhost ovn_metadata_agent[158125]: 2025-11-28 10:00:34.835 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 7ac539be-3605-4c5e-bb0c-6fbaaf95259b in datapath eaf25156-4f94-45e1-8ecf-348de157355a unbound from our chassis#033[00m Nov 28 05:00:34 localhost ovn_metadata_agent[158125]: 2025-11-28 10:00:34.838 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network eaf25156-4f94-45e1-8ecf-348de157355a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:00:34 localhost ovn_metadata_agent[158125]: 2025-11-28 10:00:34.839 158233 DEBUG oslo.privsep.daemon [-] privsep: 
reply[f717e42d-5471-4b99-a0a0-7ac189dbda8f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:00:34 localhost nova_compute[279673]: 2025-11-28 10:00:34.843 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:34 localhost nova_compute[279673]: 2025-11-28 10:00:34.848 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:34 localhost kernel: device tap74cdc895-82 entered promiscuous mode Nov 28 05:00:34 localhost NetworkManager[5967]: [1764324034.8563] manager: (tap74cdc895-82): new Generic device (/org/freedesktop/NetworkManager/Devices/20) Nov 28 05:00:34 localhost systemd-udevd[308110]: Network interface NamePolicy= disabled on kernel command line. Nov 28 05:00:34 localhost ovn_controller[152322]: 2025-11-28T10:00:34Z|00085|binding|INFO|Claiming lport 74cdc895-82eb-45db-a408-03b43d3fc10f for this chassis. 
Nov 28 05:00:34 localhost ovn_controller[152322]: 2025-11-28T10:00:34Z|00086|binding|INFO|74cdc895-82eb-45db-a408-03b43d3fc10f: Claiming unknown Nov 28 05:00:34 localhost nova_compute[279673]: 2025-11-28 10:00:34.863 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:34 localhost ovn_metadata_agent[158125]: 2025-11-28 10:00:34.878 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-991320b8-b994-4199-922f-5c3428b3e7ba', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-991320b8-b994-4199-922f-5c3428b3e7ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '40693e6dadaf448a8cb4caeb6899effc', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=16db044e-e796-4ecb-9ada-84075a99aa73, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=74cdc895-82eb-45db-a408-03b43d3fc10f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:00:34 localhost ovn_metadata_agent[158125]: 2025-11-28 10:00:34.880 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 74cdc895-82eb-45db-a408-03b43d3fc10f in datapath 
991320b8-b994-4199-922f-5c3428b3e7ba bound to our chassis#033[00m Nov 28 05:00:34 localhost ovn_metadata_agent[158125]: 2025-11-28 10:00:34.883 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port e3d22942-ad97-4dc8-965d-3e878722ec78 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 28 05:00:34 localhost ovn_metadata_agent[158125]: 2025-11-28 10:00:34.883 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 991320b8-b994-4199-922f-5c3428b3e7ba, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:00:34 localhost ovn_metadata_agent[158125]: 2025-11-28 10:00:34.884 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[4450e3dc-aa82-4c6b-923a-eba5d97ace38]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:00:34 localhost journal[227875]: ethtool ioctl error on tap74cdc895-82: No such device Nov 28 05:00:34 localhost journal[227875]: ethtool ioctl error on tap74cdc895-82: No such device Nov 28 05:00:34 localhost ovn_controller[152322]: 2025-11-28T10:00:34Z|00087|binding|INFO|Setting lport 74cdc895-82eb-45db-a408-03b43d3fc10f ovn-installed in OVS Nov 28 05:00:34 localhost ovn_controller[152322]: 2025-11-28T10:00:34Z|00088|binding|INFO|Setting lport 74cdc895-82eb-45db-a408-03b43d3fc10f up in Southbound Nov 28 05:00:34 localhost nova_compute[279673]: 2025-11-28 10:00:34.906 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:34 localhost journal[227875]: ethtool ioctl error on tap74cdc895-82: No such device Nov 28 05:00:34 localhost journal[227875]: ethtool ioctl error on tap74cdc895-82: No such device Nov 28 05:00:34 localhost journal[227875]: ethtool ioctl error on 
tap74cdc895-82: No such device Nov 28 05:00:34 localhost journal[227875]: ethtool ioctl error on tap74cdc895-82: No such device Nov 28 05:00:34 localhost journal[227875]: ethtool ioctl error on tap74cdc895-82: No such device Nov 28 05:00:34 localhost journal[227875]: ethtool ioctl error on tap74cdc895-82: No such device Nov 28 05:00:34 localhost nova_compute[279673]: 2025-11-28 10:00:34.944 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:34 localhost nova_compute[279673]: 2025-11-28 10:00:34.975 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:35 localhost nova_compute[279673]: 2025-11-28 10:00:35.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:00:35 localhost nova_compute[279673]: 2025-11-28 10:00:35.770 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 28 05:00:35 localhost podman[308181]: Nov 28 05:00:35 localhost podman[308181]: 2025-11-28 10:00:35.984321847 +0000 UTC m=+0.090583210 container create 65e7e9e94b6e4e3d79589b3097ce670592d095ffe1bf0a8d36ad0968dd5a93b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-991320b8-b994-4199-922f-5c3428b3e7ba, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Nov 28 05:00:36 localhost systemd[1]: Started libpod-conmon-65e7e9e94b6e4e3d79589b3097ce670592d095ffe1bf0a8d36ad0968dd5a93b3.scope. Nov 28 05:00:36 localhost podman[308181]: 2025-11-28 10:00:35.939721069 +0000 UTC m=+0.045982482 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:00:36 localhost systemd[1]: tmp-crun.nxU2n8.mount: Deactivated successfully. Nov 28 05:00:36 localhost systemd[1]: Started libcrun container. 
Nov 28 05:00:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e7a176f0a1d7dcae07345c04d77b729e8101196fbb4e6b2ed6c12ecbe6a4c6d1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:00:36 localhost podman[308181]: 2025-11-28 10:00:36.065870767 +0000 UTC m=+0.172132130 container init 65e7e9e94b6e4e3d79589b3097ce670592d095ffe1bf0a8d36ad0968dd5a93b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-991320b8-b994-4199-922f-5c3428b3e7ba, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:00:36 localhost podman[308181]: 2025-11-28 10:00:36.075396259 +0000 UTC m=+0.181657622 container start 65e7e9e94b6e4e3d79589b3097ce670592d095ffe1bf0a8d36ad0968dd5a93b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-991320b8-b994-4199-922f-5c3428b3e7ba, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Nov 28 05:00:36 localhost dnsmasq[308200]: started, version 2.85 cachesize 150 Nov 28 05:00:36 localhost dnsmasq[308200]: DNS service limited to local subnets Nov 28 05:00:36 localhost dnsmasq[308200]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:00:36 localhost dnsmasq[308200]: warning: no upstream servers 
configured Nov 28 05:00:36 localhost dnsmasq-dhcp[308200]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 28 05:00:36 localhost dnsmasq[308200]: read /var/lib/neutron/dhcp/991320b8-b994-4199-922f-5c3428b3e7ba/addn_hosts - 0 addresses Nov 28 05:00:36 localhost dnsmasq-dhcp[308200]: read /var/lib/neutron/dhcp/991320b8-b994-4199-922f-5c3428b3e7ba/host Nov 28 05:00:36 localhost dnsmasq-dhcp[308200]: read /var/lib/neutron/dhcp/991320b8-b994-4199-922f-5c3428b3e7ba/opts Nov 28 05:00:36 localhost nova_compute[279673]: 2025-11-28 10:00:36.195 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:36 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:00:36.235 261084 INFO neutron.agent.dhcp.agent [None req-2958e5aa-ff2b-4bc3-ba12-f371f4b2838e - - - - - -] DHCP configuration for ports {'a2cf2768-78e2-4b7f-a3d6-298ef981fb5a'} is completed#033[00m Nov 28 05:00:36 localhost nova_compute[279673]: 2025-11-28 10:00:36.567 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:36 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:00:36.688 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:00:36Z, description=, device_id=5e2bdb5c-9386-4f23-88bb-b64884bb41d1, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=198dd71f-c3e0-4377-9719-f2a42158bba7, ip_allocation=immediate, mac_address=fa:16:3e:0d:77:d5, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:00:32Z, description=, dns_domain=, id=991320b8-b994-4199-922f-5c3428b3e7ba, ipv4_address_scope=None, 
ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupRulesTestJSON-474917441-network, port_security_enabled=True, project_id=40693e6dadaf448a8cb4caeb6899effc, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=43850, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=289, status=ACTIVE, subnets=['ea71b8b0-31a5-4591-a369-92ec92968c41'], tags=[], tenant_id=40693e6dadaf448a8cb4caeb6899effc, updated_at=2025-11-28T10:00:33Z, vlan_transparent=None, network_id=991320b8-b994-4199-922f-5c3428b3e7ba, port_security_enabled=False, project_id=40693e6dadaf448a8cb4caeb6899effc, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=329, status=DOWN, tags=[], tenant_id=40693e6dadaf448a8cb4caeb6899effc, updated_at=2025-11-28T10:00:36Z on network 991320b8-b994-4199-922f-5c3428b3e7ba#033[00m Nov 28 05:00:36 localhost dnsmasq[308200]: read /var/lib/neutron/dhcp/991320b8-b994-4199-922f-5c3428b3e7ba/addn_hosts - 1 addresses Nov 28 05:00:36 localhost podman[308218]: 2025-11-28 10:00:36.901949286 +0000 UTC m=+0.060676772 container kill 65e7e9e94b6e4e3d79589b3097ce670592d095ffe1bf0a8d36ad0968dd5a93b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-991320b8-b994-4199-922f-5c3428b3e7ba, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3) Nov 28 05:00:36 localhost dnsmasq-dhcp[308200]: read /var/lib/neutron/dhcp/991320b8-b994-4199-922f-5c3428b3e7ba/host Nov 28 05:00:36 localhost dnsmasq-dhcp[308200]: read /var/lib/neutron/dhcp/991320b8-b994-4199-922f-5c3428b3e7ba/opts Nov 28 
05:00:37 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:00:37.097 261084 INFO neutron.agent.dhcp.agent [None req-700b54e0-0430-4124-8e02-376dd1f082ac - - - - - -] DHCP configuration for ports {'198dd71f-c3e0-4377-9719-f2a42158bba7'} is completed#033[00m Nov 28 05:00:37 localhost ovn_controller[152322]: 2025-11-28T10:00:37Z|00089|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:00:37 localhost nova_compute[279673]: 2025-11-28 10:00:37.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:00:37 localhost nova_compute[279673]: 2025-11-28 10:00:37.843 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:37 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:00:37.906 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:00:36Z, description=, device_id=5e2bdb5c-9386-4f23-88bb-b64884bb41d1, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=198dd71f-c3e0-4377-9719-f2a42158bba7, ip_allocation=immediate, mac_address=fa:16:3e:0d:77:d5, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:00:32Z, description=, dns_domain=, id=991320b8-b994-4199-922f-5c3428b3e7ba, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupRulesTestJSON-474917441-network, port_security_enabled=True, project_id=40693e6dadaf448a8cb4caeb6899effc, 
provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=43850, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=289, status=ACTIVE, subnets=['ea71b8b0-31a5-4591-a369-92ec92968c41'], tags=[], tenant_id=40693e6dadaf448a8cb4caeb6899effc, updated_at=2025-11-28T10:00:33Z, vlan_transparent=None, network_id=991320b8-b994-4199-922f-5c3428b3e7ba, port_security_enabled=False, project_id=40693e6dadaf448a8cb4caeb6899effc, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=329, status=DOWN, tags=[], tenant_id=40693e6dadaf448a8cb4caeb6899effc, updated_at=2025-11-28T10:00:36Z on network 991320b8-b994-4199-922f-5c3428b3e7ba#033[00m Nov 28 05:00:38 localhost dnsmasq[308200]: read /var/lib/neutron/dhcp/991320b8-b994-4199-922f-5c3428b3e7ba/addn_hosts - 1 addresses Nov 28 05:00:38 localhost dnsmasq-dhcp[308200]: read /var/lib/neutron/dhcp/991320b8-b994-4199-922f-5c3428b3e7ba/host Nov 28 05:00:38 localhost dnsmasq-dhcp[308200]: read /var/lib/neutron/dhcp/991320b8-b994-4199-922f-5c3428b3e7ba/opts Nov 28 05:00:38 localhost podman[308255]: 2025-11-28 10:00:38.128118878 +0000 UTC m=+0.069828113 container kill 65e7e9e94b6e4e3d79589b3097ce670592d095ffe1bf0a8d36ad0968dd5a93b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-991320b8-b994-4199-922f-5c3428b3e7ba, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Nov 28 05:00:38 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:00:38.373 261084 INFO neutron.agent.dhcp.agent [None req-02a85b7f-f781-4cc1-8f41-70ded975ff74 - - - - - -] DHCP configuration for 
ports {'198dd71f-c3e0-4377-9719-f2a42158bba7'} is completed#033[00m Nov 28 05:00:38 localhost dnsmasq[307941]: exiting on receipt of SIGTERM Nov 28 05:00:38 localhost podman[308292]: 2025-11-28 10:00:38.478465621 +0000 UTC m=+0.058999130 container kill 660b7142be459deaae037bbef89739d51791c4171c3ba2cdada588eae4d46481 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eaf25156-4f94-45e1-8ecf-348de157355a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:00:38 localhost systemd[1]: libpod-660b7142be459deaae037bbef89739d51791c4171c3ba2cdada588eae4d46481.scope: Deactivated successfully. Nov 28 05:00:38 localhost podman[308306]: 2025-11-28 10:00:38.543916078 +0000 UTC m=+0.055209135 container died 660b7142be459deaae037bbef89739d51791c4171c3ba2cdada588eae4d46481 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eaf25156-4f94-45e1-8ecf-348de157355a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:00:38 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-660b7142be459deaae037bbef89739d51791c4171c3ba2cdada588eae4d46481-userdata-shm.mount: Deactivated successfully. 
Nov 28 05:00:38 localhost podman[308306]: 2025-11-28 10:00:38.592334822 +0000 UTC m=+0.103627839 container cleanup 660b7142be459deaae037bbef89739d51791c4171c3ba2cdada588eae4d46481 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eaf25156-4f94-45e1-8ecf-348de157355a, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Nov 28 05:00:38 localhost systemd[1]: libpod-conmon-660b7142be459deaae037bbef89739d51791c4171c3ba2cdada588eae4d46481.scope: Deactivated successfully. Nov 28 05:00:38 localhost podman[308313]: 2025-11-28 10:00:38.618486255 +0000 UTC m=+0.115216764 container remove 660b7142be459deaae037bbef89739d51791c4171c3ba2cdada588eae4d46481 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eaf25156-4f94-45e1-8ecf-348de157355a, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125) Nov 28 05:00:38 localhost nova_compute[279673]: 2025-11-28 10:00:38.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:00:38 localhost nova_compute[279673]: 2025-11-28 10:00:38.772 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task 
ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:00:38 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:00:38.806 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:00:39 localhost systemd[1]: var-lib-containers-storage-overlay-2df9b00178a60559ff4adec6a755d20e525233a80e04a1bc351385bca38e5adf-merged.mount: Deactivated successfully. Nov 28 05:00:39 localhost systemd[1]: run-netns-qdhcp\x2deaf25156\x2d4f94\x2d45e1\x2d8ecf\x2d348de157355a.mount: Deactivated successfully. Nov 28 05:00:39 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:00:39.188 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:00:39 localhost nova_compute[279673]: 2025-11-28 10:00:39.768 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:00:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:00:39 localhost nova_compute[279673]: 2025-11-28 10:00:39.826 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:40 localhost podman[238687]: time="2025-11-28T10:00:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:00:40 localhost podman[238687]: @ - - [28/Nov/2025:10:00:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159336 "" "Go-http-client/1.1" Nov 28 05:00:40 localhost podman[238687]: @ - - 
[28/Nov/2025:10:00:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20195 "" "Go-http-client/1.1" Nov 28 05:00:40 localhost nova_compute[279673]: 2025-11-28 10:00:40.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:00:41 localhost neutron_sriov_agent[254147]: 2025-11-28 10:00:41.117 2 INFO neutron.agent.securitygroups_rpc [req-a368a1e6-562c-4526-ae3a-0e69b2a15bca req-d778405e-6e70-4ce8-a2a0-d781e7c8b4de 67aa4e1531db4854a2788f0bbeaba314 40693e6dadaf448a8cb4caeb6899effc - - default default] Security group rule updated ['f59fc346-b907-4d25-9b54-7ce550f4338f']#033[00m Nov 28 05:00:41 localhost nova_compute[279673]: 2025-11-28 10:00:41.147 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:00:41.147 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:00:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:00:41.149 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 28 05:00:41 localhost nova_compute[279673]: 2025-11-28 10:00:41.597 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:41 localhost nova_compute[279673]: 2025-11-28 10:00:41.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:00:41 localhost nova_compute[279673]: 2025-11-28 10:00:41.796 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:00:41 localhost nova_compute[279673]: 2025-11-28 10:00:41.796 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:00:41 localhost nova_compute[279673]: 2025-11-28 10:00:41.797 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:00:41 localhost nova_compute[279673]: 2025-11-28 10:00:41.797 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain 
(node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 05:00:41 localhost nova_compute[279673]: 2025-11-28 10:00:41.797 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:00:42 localhost neutron_sriov_agent[254147]: 2025-11-28 10:00:42.140 2 INFO neutron.agent.securitygroups_rpc [req-91a233d2-d128-4e77-ba4a-bffdfb538e2c req-2dda1abe-8246-445d-82ca-7050072f5ab7 67aa4e1531db4854a2788f0bbeaba314 40693e6dadaf448a8cb4caeb6899effc - - default default] Security group rule updated ['793a871e-42bd-4871-9764-ed4c16f282ee']#033[00m Nov 28 05:00:42 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:00:42 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/2613041327' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:00:42 localhost nova_compute[279673]: 2025-11-28 10:00:42.250 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:00:42 localhost nova_compute[279673]: 2025-11-28 10:00:42.319 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 05:00:42 localhost nova_compute[279673]: 2025-11-28 10:00:42.320 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 05:00:42 localhost nova_compute[279673]: 2025-11-28 10:00:42.553 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 05:00:42 localhost nova_compute[279673]: 2025-11-28 10:00:42.555 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11361MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": 
"7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 05:00:42 localhost nova_compute[279673]: 2025-11-28 10:00:42.556 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:00:42 localhost nova_compute[279673]: 2025-11-28 10:00:42.556 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:00:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. 
Nov 28 05:00:42 localhost podman[308359]: 2025-11-28 10:00:42.856065581 +0000 UTC m=+0.080948432 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350) Nov 28 05:00:42 localhost podman[308359]: 2025-11-28 10:00:42.869798603 +0000 UTC m=+0.094681444 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, 
io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Nov 28 05:00:42 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 05:00:43 localhost nova_compute[279673]: 2025-11-28 10:00:43.023 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 05:00:43 localhost nova_compute[279673]: 2025-11-28 10:00:43.023 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 05:00:43 localhost nova_compute[279673]: 2025-11-28 10:00:43.024 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 05:00:43 localhost neutron_sriov_agent[254147]: 2025-11-28 10:00:43.145 2 INFO neutron.agent.securitygroups_rpc [req-9f8c831a-ecf0-4e4e-9595-719ef0ba964a req-7a744279-94cd-4894-9a01-f869bc00b409 67aa4e1531db4854a2788f0bbeaba314 40693e6dadaf448a8cb4caeb6899effc - - default default] Security group rule updated ['03578922-528e-499a-8e7e-7a5c262d5e64']#033[00m Nov 28 05:00:43 localhost ovn_metadata_agent[158125]: 2025-11-28 10:00:43.151 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:00:43 localhost nova_compute[279673]: 2025-11-28 10:00:43.313 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing inventories for resource provider 35fead26-0bad-4950-b646-987079d58a17 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Nov 28 05:00:43 localhost nova_compute[279673]: 2025-11-28 10:00:43.775 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Updating ProviderTree inventory for provider 35fead26-0bad-4950-b646-987079d58a17 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Nov 28 05:00:43 localhost nova_compute[279673]: 2025-11-28 10:00:43.775 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Updating inventory in ProviderTree for provider 35fead26-0bad-4950-b646-987079d58a17 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 28 05:00:43 localhost nova_compute[279673]: 2025-11-28 10:00:43.797 279685 DEBUG 
nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing aggregate associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Nov 28 05:00:43 localhost nova_compute[279673]: 2025-11-28 10:00:43.834 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing trait associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, traits: COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_FMA3,HW_CPU_X86_BMI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_ABM,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AMD_SVM,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_CLMUL,HW_CPU_X86_BMI2,HW_CPU_X86_SSE42 
_refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Nov 28 05:00:43 localhost nova_compute[279673]: 2025-11-28 10:00:43.875 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:00:44 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:00:44 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1539368950' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:00:44 localhost nova_compute[279673]: 2025-11-28 10:00:44.335 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:00:44 localhost nova_compute[279673]: 2025-11-28 10:00:44.341 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 05:00:44 localhost nova_compute[279673]: 2025-11-28 10:00:44.359 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 
1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 05:00:44 localhost nova_compute[279673]: 2025-11-28 10:00:44.362 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 05:00:44 localhost nova_compute[279673]: 2025-11-28 10:00:44.362 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.806s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:00:44 localhost nova_compute[279673]: 2025-11-28 10:00:44.363 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:00:44 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:00:44 localhost neutron_sriov_agent[254147]: 2025-11-28 10:00:44.822 2 INFO neutron.agent.securitygroups_rpc [req-cee9d54d-2061-4e18-9dbb-7978cc78c723 req-de7c358c-da89-4568-8167-d43d2e6c2b50 67aa4e1531db4854a2788f0bbeaba314 40693e6dadaf448a8cb4caeb6899effc - - default default] Security group rule updated ['0b748e56-a20d-4a74-8688-d245ea875072']#033[00m Nov 28 05:00:44 localhost nova_compute[279673]: 2025-11-28 10:00:44.867 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:46 localhost neutron_sriov_agent[254147]: 2025-11-28 10:00:46.158 2 INFO neutron.agent.securitygroups_rpc [req-fb04029d-a55f-44ca-91cb-e1804a48ba9f req-358dd77e-4f38-41f7-8d87-d246c9bdd01f 67aa4e1531db4854a2788f0bbeaba314 40693e6dadaf448a8cb4caeb6899effc - - default default] Security group rule updated ['e4169f5e-6c1f-4c53-9aba-f1f5fa41bfad']#033[00m Nov 28 05:00:46 localhost nova_compute[279673]: 2025-11-28 10:00:46.630 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:46 localhost neutron_sriov_agent[254147]: 2025-11-28 10:00:46.815 2 INFO neutron.agent.securitygroups_rpc [req-587359a2-3e51-4204-a146-11784139df1a req-865ecdad-325d-46ee-b040-9c64055b894f 67aa4e1531db4854a2788f0bbeaba314 40693e6dadaf448a8cb4caeb6899effc - - default default] Security group rule updated ['e4169f5e-6c1f-4c53-9aba-f1f5fa41bfad']#033[00m Nov 28 05:00:47 localhost neutron_sriov_agent[254147]: 2025-11-28 10:00:47.103 2 INFO neutron.agent.securitygroups_rpc [req-95a69005-5e60-4e46-b43c-b0ec65f622be req-ea491abe-a44c-4d6d-8b8a-511053fe8f93 67aa4e1531db4854a2788f0bbeaba314 40693e6dadaf448a8cb4caeb6899effc - - default default] Security group rule updated ['e4169f5e-6c1f-4c53-9aba-f1f5fa41bfad']#033[00m Nov 28 05:00:47 localhost nova_compute[279673]: 2025-11-28 10:00:47.380 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:00:47 localhost nova_compute[279673]: 2025-11-28 10:00:47.380 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 28 05:00:47 localhost nova_compute[279673]: 2025-11-28 10:00:47.381 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 28 05:00:47 localhost nova_compute[279673]: 2025-11-28 10:00:47.463 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 05:00:47 localhost nova_compute[279673]: 2025-11-28 10:00:47.463 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 05:00:47 localhost nova_compute[279673]: 2025-11-28 10:00:47.464 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 28 05:00:47 localhost nova_compute[279673]: 2025-11-28 10:00:47.464 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 05:00:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. 
Nov 28 05:00:47 localhost podman[308403]: 2025-11-28 10:00:47.85153773 +0000 UTC m=+0.083238574 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 05:00:47 localhost podman[308403]: 2025-11-28 10:00:47.888542775 +0000 UTC m=+0.120243639 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 28 05:00:47 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 05:00:48 localhost nova_compute[279673]: 2025-11-28 10:00:48.062 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 05:00:48 localhost nova_compute[279673]: 2025-11-28 10:00:48.078 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 05:00:48 localhost nova_compute[279673]: 2025-11-28 10:00:48.079 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 28 05:00:48 localhost nova_compute[279673]: 2025-11-28 10:00:48.079 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:00:48 localhost nova_compute[279673]: 2025-11-28 10:00:48.080 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Nov 28 05:00:48 localhost openstack_network_exporter[240658]: ERROR 10:00:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:00:48 localhost openstack_network_exporter[240658]: ERROR 10:00:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:00:48 localhost openstack_network_exporter[240658]: ERROR 10:00:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 05:00:48 localhost openstack_network_exporter[240658]: ERROR 10:00:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 05:00:48 localhost openstack_network_exporter[240658]: Nov 28 05:00:48 localhost openstack_network_exporter[240658]: ERROR 10:00:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath 
Nov 28 05:00:48 localhost openstack_network_exporter[240658]: Nov 28 05:00:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 05:00:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 05:00:49 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:00:49 localhost podman[308426]: 2025-11-28 10:00:49.857265366 +0000 UTC m=+0.084881585 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, 
org.label-schema.license=GPLv2) Nov 28 05:00:49 localhost nova_compute[279673]: 2025-11-28 10:00:49.908 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:49 localhost podman[308427]: 2025-11-28 10:00:49.951232217 +0000 UTC m=+0.175395279 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, 
tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:00:49 localhost podman[308427]: 2025-11-28 10:00:49.955990803 +0000 UTC m=+0.180153875 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, 
tcib_managed=true, container_name=ovn_metadata_agent) Nov 28 05:00:49 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 05:00:50 localhost podman[308426]: 2025-11-28 10:00:50.004873751 +0000 UTC m=+0.232489970 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:00:50 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 05:00:50 localhost dnsmasq[308200]: read /var/lib/neutron/dhcp/991320b8-b994-4199-922f-5c3428b3e7ba/addn_hosts - 0 addresses Nov 28 05:00:50 localhost dnsmasq-dhcp[308200]: read /var/lib/neutron/dhcp/991320b8-b994-4199-922f-5c3428b3e7ba/host Nov 28 05:00:50 localhost dnsmasq-dhcp[308200]: read /var/lib/neutron/dhcp/991320b8-b994-4199-922f-5c3428b3e7ba/opts Nov 28 05:00:50 localhost podman[308482]: 2025-11-28 10:00:50.294435962 +0000 UTC m=+0.065541462 container kill 65e7e9e94b6e4e3d79589b3097ce670592d095ffe1bf0a8d36ad0968dd5a93b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-991320b8-b994-4199-922f-5c3428b3e7ba, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Nov 28 05:00:50 localhost ovn_controller[152322]: 2025-11-28T10:00:50Z|00090|binding|INFO|Releasing lport 74cdc895-82eb-45db-a408-03b43d3fc10f from this chassis (sb_readonly=0) Nov 28 05:00:50 localhost kernel: device tap74cdc895-82 left promiscuous mode Nov 28 05:00:50 localhost ovn_controller[152322]: 2025-11-28T10:00:50Z|00091|binding|INFO|Setting lport 74cdc895-82eb-45db-a408-03b43d3fc10f down in Southbound Nov 28 05:00:50 localhost nova_compute[279673]: 2025-11-28 10:00:50.482 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:50 localhost nova_compute[279673]: 2025-11-28 10:00:50.503 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:00:50.504 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched 
UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-991320b8-b994-4199-922f-5c3428b3e7ba', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-991320b8-b994-4199-922f-5c3428b3e7ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '40693e6dadaf448a8cb4caeb6899effc', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=16db044e-e796-4ecb-9ada-84075a99aa73, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=74cdc895-82eb-45db-a408-03b43d3fc10f) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:00:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:00:50.507 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 74cdc895-82eb-45db-a408-03b43d3fc10f in datapath 991320b8-b994-4199-922f-5c3428b3e7ba unbound from our chassis#033[00m Nov 28 05:00:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:00:50.510 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 991320b8-b994-4199-922f-5c3428b3e7ba, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:00:50 localhost 
ovn_metadata_agent[158125]: 2025-11-28 10:00:50.512 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[093e3915-8951-4838-9881-5ab3c6a44ba6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:00:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:00:50.840 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:00:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:00:50.841 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:00:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:00:50.841 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:00:51 localhost nova_compute[279673]: 2025-11-28 10:00:51.633 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:52 localhost nova_compute[279673]: 2025-11-28 10:00:52.049 279685 DEBUG nova.virt.libvirt.driver [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Creating tmpfile /var/lib/nova/instances/tmpkz_uaqrw to notify to other compute nodes that they should mount the same storage. 
_create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m Nov 28 05:00:52 localhost nova_compute[279673]: 2025-11-28 10:00:52.062 279685 DEBUG nova.compute.manager [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] destination check data is LibvirtLiveMigrateData(bdms=,block_migration=,disk_available_mb=13312,disk_over_commit=,dst_numa_info=,dst_supports_numa_live_migration=,dst_wants_file_backed_memory=False,file_backed_memory_discard=,filename='tmpkz_uaqrw',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=,is_shared_block_storage=,is_shared_instance_path=,is_volume_backed=,migration=,old_vol_attachment_ids=,serial_listen_addr=None,serial_listen_ports=,src_supports_native_luks=,src_supports_numa_live_migration=,supported_perf_events=,target_connect_addr=,vifs=[VIFMigrateData],wait_for_vif_plugged=) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m Nov 28 05:00:52 localhost nova_compute[279673]: 2025-11-28 10:00:52.095 279685 DEBUG oslo_concurrency.lockutils [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 05:00:52 localhost nova_compute[279673]: 2025-11-28 10:00:52.096 279685 DEBUG oslo_concurrency.lockutils [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 05:00:52 localhost nova_compute[279673]: 2025-11-28 10:00:52.110 279685 INFO nova.compute.rpcapi [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 
b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66#033[00m Nov 28 05:00:52 localhost nova_compute[279673]: 2025-11-28 10:00:52.111 279685 DEBUG oslo_concurrency.lockutils [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 05:00:52 localhost ovn_controller[152322]: 2025-11-28T10:00:52Z|00092|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:00:52 localhost nova_compute[279673]: 2025-11-28 10:00:52.487 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:53 localhost dnsmasq[308200]: exiting on receipt of SIGTERM Nov 28 05:00:53 localhost podman[308519]: 2025-11-28 10:00:53.013817092 +0000 UTC m=+0.060743923 container kill 65e7e9e94b6e4e3d79589b3097ce670592d095ffe1bf0a8d36ad0968dd5a93b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-991320b8-b994-4199-922f-5c3428b3e7ba, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125) Nov 28 05:00:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 05:00:53 localhost systemd[1]: libpod-65e7e9e94b6e4e3d79589b3097ce670592d095ffe1bf0a8d36ad0968dd5a93b3.scope: Deactivated successfully. 
Nov 28 05:00:53 localhost podman[308533]: 2025-11-28 10:00:53.083911141 +0000 UTC m=+0.059724502 container died 65e7e9e94b6e4e3d79589b3097ce670592d095ffe1bf0a8d36ad0968dd5a93b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-991320b8-b994-4199-922f-5c3428b3e7ba, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Nov 28 05:00:53 localhost podman[308540]: 2025-11-28 10:00:53.129607053 +0000 UTC m=+0.088405992 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 28 05:00:53 localhost podman[308533]: 2025-11-28 10:00:53.172169788 +0000 UTC m=+0.147983099 container cleanup 65e7e9e94b6e4e3d79589b3097ce670592d095ffe1bf0a8d36ad0968dd5a93b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-991320b8-b994-4199-922f-5c3428b3e7ba, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true) Nov 28 05:00:53 localhost systemd[1]: libpod-conmon-65e7e9e94b6e4e3d79589b3097ce670592d095ffe1bf0a8d36ad0968dd5a93b3.scope: Deactivated successfully. 
Nov 28 05:00:53 localhost podman[308535]: 2025-11-28 10:00:53.190685746 +0000 UTC m=+0.155783468 container remove 65e7e9e94b6e4e3d79589b3097ce670592d095ffe1bf0a8d36ad0968dd5a93b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-991320b8-b994-4199-922f-5c3428b3e7ba, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 28 05:00:53 localhost podman[308540]: 2025-11-28 10:00:53.192263994 +0000 UTC m=+0.151062933 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:00:53 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. Nov 28 05:00:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:00:53.408 261084 INFO neutron.agent.dhcp.agent [None req-6782fca0-efd2-40a9-82ea-c9eac2ed2582 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:00:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:00:53.409 261084 INFO neutron.agent.dhcp.agent [None req-6782fca0-efd2-40a9-82ea-c9eac2ed2582 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:00:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:00:53.661 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:00:54 localhost systemd[1]: var-lib-containers-storage-overlay-e7a176f0a1d7dcae07345c04d77b729e8101196fbb4e6b2ed6c12ecbe6a4c6d1-merged.mount: Deactivated successfully. Nov 28 05:00:54 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-65e7e9e94b6e4e3d79589b3097ce670592d095ffe1bf0a8d36ad0968dd5a93b3-userdata-shm.mount: Deactivated successfully. Nov 28 05:00:54 localhost systemd[1]: run-netns-qdhcp\x2d991320b8\x2db994\x2d4199\x2d922f\x2d5c3428b3e7ba.mount: Deactivated successfully. 
Nov 28 05:00:54 localhost nova_compute[279673]: 2025-11-28 10:00:54.069 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:54 localhost nova_compute[279673]: 2025-11-28 10:00:54.533 279685 DEBUG nova.compute.manager [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=,block_migration=False,disk_available_mb=13312,disk_over_commit=,dst_numa_info=,dst_supports_numa_live_migration=,dst_wants_file_backed_memory=False,file_backed_memory_discard=,filename='tmpkz_uaqrw',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='d716674a-ba14-466a-956f-5bca9404174f',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=,old_vol_attachment_ids=,serial_listen_addr=None,serial_listen_ports=,src_supports_native_luks=,src_supports_numa_live_migration=,supported_perf_events=,target_connect_addr=,vifs=[VIFMigrateData],wait_for_vif_plugged=) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m Nov 28 05:00:54 localhost nova_compute[279673]: 2025-11-28 10:00:54.576 279685 DEBUG oslo_concurrency.lockutils [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Acquiring lock "refresh_cache-d716674a-ba14-466a-956f-5bca9404174f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 05:00:54 localhost nova_compute[279673]: 2025-11-28 10:00:54.577 279685 DEBUG oslo_concurrency.lockutils [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Acquired lock "refresh_cache-d716674a-ba14-466a-956f-5bca9404174f" lock 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 05:00:54 localhost nova_compute[279673]: 2025-11-28 10:00:54.577 279685 DEBUG nova.network.neutron [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m Nov 28 05:00:54 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:00:54 localhost nova_compute[279673]: 2025-11-28 10:00:54.911 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:55 localhost nova_compute[279673]: 2025-11-28 10:00:55.170 279685 DEBUG nova.network.neutron [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Updating instance_info_cache with network_info: [{"id": "d4b2e0ba-de4a-4cfb-af66-1ed3abdde376", "address": "fa:16:3e:a9:a4:65", "network": {"id": "b1cd9c9c-949c-46cf-bb45-dc659f651fc3", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-91322750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "3e4b394501d24dc7954ec5d2f27b8081", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": 
"br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4b2e0ba-de", "ovs_interfaceid": "d4b2e0ba-de4a-4cfb-af66-1ed3abdde376", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 05:00:55 localhost nova_compute[279673]: 2025-11-28 10:00:55.217 279685 DEBUG oslo_concurrency.lockutils [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Releasing lock "refresh_cache-d716674a-ba14-466a-956f-5bca9404174f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 05:00:55 localhost nova_compute[279673]: 2025-11-28 10:00:55.220 279685 DEBUG nova.virt.libvirt.driver [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=,block_migration=False,disk_available_mb=13312,disk_over_commit=,dst_numa_info=,dst_supports_numa_live_migration=,dst_wants_file_backed_memory=False,file_backed_memory_discard=,filename='tmpkz_uaqrw',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='d716674a-ba14-466a-956f-5bca9404174f',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=,src_supports_native_luks=,src_supports_numa_live_migration=,supported_perf_events=,target_connect_addr=,vifs=[VIFMigrateData],wait_for_vif_plugged=) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m Nov 28 05:00:55 localhost nova_compute[279673]: 2025-11-28 10:00:55.221 
279685 DEBUG nova.virt.libvirt.driver [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Creating instance directory: /var/lib/nova/instances/d716674a-ba14-466a-956f-5bca9404174f pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m Nov 28 05:00:55 localhost nova_compute[279673]: 2025-11-28 10:00:55.222 279685 DEBUG nova.virt.libvirt.driver [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Ensure instance console log exists: /var/lib/nova/instances/d716674a-ba14-466a-956f-5bca9404174f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m Nov 28 05:00:55 localhost nova_compute[279673]: 2025-11-28 10:00:55.222 279685 DEBUG nova.virt.libvirt.driver [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Plugging VIFs using destination host port bindings before live migration. 
_pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m Nov 28 05:00:55 localhost nova_compute[279673]: 2025-11-28 10:00:55.224 279685 DEBUG nova.virt.libvirt.vif [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T10:00:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1337177779',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(5),hidden=False,host='np0005538514.localdomain',hostname='tempest-liveautoblockmigrationv225test-server-1337177779',id=6,image_ref='85968a96-5a0e-43a4-9c04-3954f640a7ed',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-11-28T10:00:49Z,launched_on='np0005538514.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='np0005538514.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3e4b394501d24dc7954ec5d2f27b8081',ramdisk_id='',reservation_id='r-mfjalp0i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='85968a96-5a0e-43a4-9c04-3954f640a7ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_vide
o_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1153414438',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1153414438-project-member'},tags=,task_state='migrating',terminated_at=None,trusted_certs=,updated_at=2025-11-28T10:00:49Z,user_data=None,user_id='c64867c2bac34a819c0995d0b72ee9a7',uuid=d716674a-ba14-466a-956f-5bca9404174f,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d4b2e0ba-de4a-4cfb-af66-1ed3abdde376", "address": "fa:16:3e:a9:a4:65", "network": {"id": "b1cd9c9c-949c-46cf-bb45-dc659f651fc3", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-91322750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "3e4b394501d24dc7954ec5d2f27b8081", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd4b2e0ba-de", "ovs_interfaceid": "d4b2e0ba-de4a-4cfb-af66-1ed3abdde376", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Nov 28 05:00:55 localhost nova_compute[279673]: 2025-11-28 10:00:55.225 279685 DEBUG nova.network.os_vif_util [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Converting VIF {"id": "d4b2e0ba-de4a-4cfb-af66-1ed3abdde376", "address": "fa:16:3e:a9:a4:65", "network": {"id": 
"b1cd9c9c-949c-46cf-bb45-dc659f651fc3", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-91322750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "3e4b394501d24dc7954ec5d2f27b8081", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd4b2e0ba-de", "ovs_interfaceid": "d4b2e0ba-de4a-4cfb-af66-1ed3abdde376", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Nov 28 05:00:55 localhost nova_compute[279673]: 2025-11-28 10:00:55.226 279685 DEBUG nova.network.os_vif_util [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a9:a4:65,bridge_name='br-int',has_traffic_filtering=True,id=d4b2e0ba-de4a-4cfb-af66-1ed3abdde376,network=Network(b1cd9c9c-949c-46cf-bb45-dc659f651fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd4b2e0ba-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Nov 28 05:00:55 localhost nova_compute[279673]: 2025-11-28 10:00:55.226 279685 DEBUG os_vif [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Plugging vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:a9:a4:65,bridge_name='br-int',has_traffic_filtering=True,id=d4b2e0ba-de4a-4cfb-af66-1ed3abdde376,network=Network(b1cd9c9c-949c-46cf-bb45-dc659f651fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd4b2e0ba-de') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Nov 28 05:00:55 localhost nova_compute[279673]: 2025-11-28 10:00:55.227 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:55 localhost nova_compute[279673]: 2025-11-28 10:00:55.228 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:00:55 localhost nova_compute[279673]: 2025-11-28 10:00:55.228 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 28 05:00:55 localhost nova_compute[279673]: 2025-11-28 10:00:55.233 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:55 localhost nova_compute[279673]: 2025-11-28 10:00:55.233 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4b2e0ba-de, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:00:55 localhost nova_compute[279673]: 2025-11-28 10:00:55.234 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd4b2e0ba-de, col_values=(('external_ids', {'iface-id': 
'd4b2e0ba-de4a-4cfb-af66-1ed3abdde376', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a9:a4:65', 'vm-uuid': 'd716674a-ba14-466a-956f-5bca9404174f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:00:55 localhost nova_compute[279673]: 2025-11-28 10:00:55.262 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:55 localhost nova_compute[279673]: 2025-11-28 10:00:55.264 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:00:55 localhost nova_compute[279673]: 2025-11-28 10:00:55.269 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:55 localhost nova_compute[279673]: 2025-11-28 10:00:55.271 279685 INFO os_vif [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:a4:65,bridge_name='br-int',has_traffic_filtering=True,id=d4b2e0ba-de4a-4cfb-af66-1ed3abdde376,network=Network(b1cd9c9c-949c-46cf-bb45-dc659f651fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd4b2e0ba-de')#033[00m Nov 28 05:00:55 localhost nova_compute[279673]: 2025-11-28 10:00:55.272 279685 DEBUG nova.virt.libvirt.driver [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. 
pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m Nov 28 05:00:55 localhost nova_compute[279673]: 2025-11-28 10:00:55.273 279685 DEBUG nova.compute.manager [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=13312,disk_over_commit=,dst_numa_info=,dst_supports_numa_live_migration=,dst_wants_file_backed_memory=False,file_backed_memory_discard=,filename='tmpkz_uaqrw',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='d716674a-ba14-466a-956f-5bca9404174f',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=,src_supports_numa_live_migration=,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m Nov 28 05:00:55 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e94 do_prune osdmap full prune enabled Nov 28 05:00:55 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e95 e95: 6 total, 6 up, 6 in Nov 28 05:00:55 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e95: 6 total, 6 up, 6 in Nov 28 05:00:56 localhost nova_compute[279673]: 2025-11-28 10:00:56.813 279685 DEBUG nova.network.neutron [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Port d4b2e0ba-de4a-4cfb-af66-1ed3abdde376 updated with migration profile {'migrating_to': 'np0005538513.localdomain'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m 
Nov 28 05:00:56 localhost nova_compute[279673]: 2025-11-28 10:00:56.816 279685 DEBUG nova.compute.manager [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=13312,disk_over_commit=,dst_numa_info=,dst_supports_numa_live_migration=,dst_wants_file_backed_memory=False,file_backed_memory_discard=,filename='tmpkz_uaqrw',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='d716674a-ba14-466a-956f-5bca9404174f',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=,src_supports_numa_live_migration=,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m Nov 28 05:00:57 localhost sshd[308582]: main: sshd: ssh-rsa algorithm is disabled Nov 28 05:00:57 localhost systemd-logind[764]: New session 73 of user nova. Nov 28 05:00:57 localhost systemd[1]: Created slice User Slice of UID 42436. Nov 28 05:00:57 localhost systemd[1]: Starting User Runtime Directory /run/user/42436... Nov 28 05:00:57 localhost systemd[1]: Finished User Runtime Directory /run/user/42436. Nov 28 05:00:57 localhost systemd[1]: Starting User Manager for UID 42436... Nov 28 05:00:57 localhost systemd[308586]: Queued start job for default target Main User Target. Nov 28 05:00:57 localhost systemd[308586]: Created slice User Application Slice. Nov 28 05:00:57 localhost systemd[308586]: Started Mark boot as successful after the user session has run 2 minutes. Nov 28 05:00:57 localhost systemd[308586]: Started Daily Cleanup of User's Temporary Directories. 
Nov 28 05:00:57 localhost systemd[308586]: Reached target Paths. Nov 28 05:00:57 localhost systemd[308586]: Reached target Timers. Nov 28 05:00:57 localhost systemd[308586]: Starting D-Bus User Message Bus Socket... Nov 28 05:00:57 localhost systemd[308586]: Starting Create User's Volatile Files and Directories... Nov 28 05:00:57 localhost systemd[308586]: Listening on D-Bus User Message Bus Socket. Nov 28 05:00:57 localhost systemd[308586]: Reached target Sockets. Nov 28 05:00:57 localhost systemd[308586]: Finished Create User's Volatile Files and Directories. Nov 28 05:00:57 localhost systemd[308586]: Reached target Basic System. Nov 28 05:00:57 localhost systemd[308586]: Reached target Main User Target. Nov 28 05:00:57 localhost systemd[308586]: Startup finished in 169ms. Nov 28 05:00:57 localhost systemd[1]: Started User Manager for UID 42436. Nov 28 05:00:57 localhost systemd[1]: Started Session 73 of User nova. Nov 28 05:00:57 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e95 do_prune osdmap full prune enabled Nov 28 05:00:57 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e96 e96: 6 total, 6 up, 6 in Nov 28 05:00:57 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e96: 6 total, 6 up, 6 in Nov 28 05:00:58 localhost systemd[1]: Started libvirt secret daemon. Nov 28 05:00:58 localhost kernel: device tapd4b2e0ba-de entered promiscuous mode Nov 28 05:00:58 localhost NetworkManager[5967]: [1764324058.1471] manager: (tapd4b2e0ba-de): new Tun device (/org/freedesktop/NetworkManager/Devices/21) Nov 28 05:00:58 localhost ovn_controller[152322]: 2025-11-28T10:00:58Z|00093|binding|INFO|Claiming lport d4b2e0ba-de4a-4cfb-af66-1ed3abdde376 for this additional chassis. 
Nov 28 05:00:58 localhost ovn_controller[152322]: 2025-11-28T10:00:58Z|00094|binding|INFO|d4b2e0ba-de4a-4cfb-af66-1ed3abdde376: Claiming fa:16:3e:a9:a4:65 10.100.0.10 Nov 28 05:00:58 localhost ovn_controller[152322]: 2025-11-28T10:00:58Z|00095|binding|INFO|Claiming lport a48cbb27-d55f-41c4-a09f-9bbe3a14fe95 for this additional chassis. Nov 28 05:00:58 localhost ovn_controller[152322]: 2025-11-28T10:00:58Z|00096|binding|INFO|a48cbb27-d55f-41c4-a09f-9bbe3a14fe95: Claiming fa:16:3e:78:62:32 19.80.0.101 Nov 28 05:00:58 localhost systemd-udevd[308635]: Network interface NamePolicy= disabled on kernel command line. Nov 28 05:00:58 localhost nova_compute[279673]: 2025-11-28 10:00:58.153 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:58 localhost NetworkManager[5967]: [1764324058.1659] device (tapd4b2e0ba-de): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Nov 28 05:00:58 localhost NetworkManager[5967]: [1764324058.1669] device (tapd4b2e0ba-de): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external') Nov 28 05:00:58 localhost systemd-machined[83422]: New machine qemu-3-instance-00000006. Nov 28 05:00:58 localhost systemd[1]: Started Virtual Machine qemu-3-instance-00000006. 
Nov 28 05:00:58 localhost nova_compute[279673]: 2025-11-28 10:00:58.194 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:58 localhost ovn_controller[152322]: 2025-11-28T10:00:58Z|00097|binding|INFO|Setting lport d4b2e0ba-de4a-4cfb-af66-1ed3abdde376 ovn-installed in OVS Nov 28 05:00:58 localhost nova_compute[279673]: 2025-11-28 10:00:58.197 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:58 localhost nova_compute[279673]: 2025-11-28 10:00:58.479 279685 DEBUG nova.virt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Emitting event Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 28 05:00:58 localhost nova_compute[279673]: 2025-11-28 10:00:58.482 279685 INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: d716674a-ba14-466a-956f-5bca9404174f] VM Started (Lifecycle Event)#033[00m Nov 28 05:00:58 localhost nova_compute[279673]: 2025-11-28 10:00:58.655 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: d716674a-ba14-466a-956f-5bca9404174f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 28 05:00:58 localhost nova_compute[279673]: 2025-11-28 10:00:58.732 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:00:59 localhost nova_compute[279673]: 2025-11-28 10:00:59.060 279685 DEBUG nova.virt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Emitting event Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 28 05:00:59 localhost nova_compute[279673]: 2025-11-28 10:00:59.060 279685 INFO nova.compute.manager [None 
req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: d716674a-ba14-466a-956f-5bca9404174f] VM Resumed (Lifecycle Event)#033[00m Nov 28 05:00:59 localhost nova_compute[279673]: 2025-11-28 10:00:59.079 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: d716674a-ba14-466a-956f-5bca9404174f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 28 05:00:59 localhost nova_compute[279673]: 2025-11-28 10:00:59.083 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: d716674a-ba14-466a-956f-5bca9404174f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Nov 28 05:00:59 localhost nova_compute[279673]: 2025-11-28 10:00:59.101 279685 INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: d716674a-ba14-466a-956f-5bca9404174f] During the sync_power process the instance has moved from host np0005538514.localdomain to host np0005538513.localdomain#033[00m Nov 28 05:00:59 localhost systemd[1]: session-73.scope: Deactivated successfully. Nov 28 05:00:59 localhost systemd-logind[764]: Session 73 logged out. Waiting for processes to exit. Nov 28 05:00:59 localhost systemd-logind[764]: Removed session 73. 
Nov 28 05:00:59 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e96 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:00:59 localhost nova_compute[279673]: 2025-11-28 10:00:59.947 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:00 localhost ovn_controller[152322]: 2025-11-28T10:01:00Z|00098|binding|INFO|Claiming lport d4b2e0ba-de4a-4cfb-af66-1ed3abdde376 for this chassis. Nov 28 05:01:00 localhost ovn_controller[152322]: 2025-11-28T10:01:00Z|00099|binding|INFO|d4b2e0ba-de4a-4cfb-af66-1ed3abdde376: Claiming fa:16:3e:a9:a4:65 10.100.0.10 Nov 28 05:01:00 localhost ovn_controller[152322]: 2025-11-28T10:01:00Z|00100|binding|INFO|Claiming lport a48cbb27-d55f-41c4-a09f-9bbe3a14fe95 for this chassis. Nov 28 05:01:00 localhost ovn_controller[152322]: 2025-11-28T10:01:00Z|00101|binding|INFO|a48cbb27-d55f-41c4-a09f-9bbe3a14fe95: Claiming fa:16:3e:78:62:32 19.80.0.101 Nov 28 05:01:00 localhost ovn_controller[152322]: 2025-11-28T10:01:00Z|00102|binding|INFO|Setting lport d4b2e0ba-de4a-4cfb-af66-1ed3abdde376 up in Southbound Nov 28 05:01:00 localhost ovn_controller[152322]: 2025-11-28T10:01:00Z|00103|binding|INFO|Setting lport a48cbb27-d55f-41c4-a09f-9bbe3a14fe95 up in Southbound Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:00.147 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:62:32 19.80.0.101'], port_security=['fa:16:3e:78:62:32 19.80.0.101'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['d4b2e0ba-de4a-4cfb-af66-1ed3abdde376'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-27110142', 'neutron:cidrs': 
'19.80.0.101/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8c8d124f-f6e2-454a-9f65-e2e41a655306', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-27110142', 'neutron:project_id': '3e4b394501d24dc7954ec5d2f27b8081', 'neutron:revision_number': '3', 'neutron:security_group_ids': 'dc0a6e12-205a-4d7d-adb2-6545f08f7990', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=c5f09637-840e-43f3-af58-d197b914a787, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=a48cbb27-d55f-41c4-a09f-9bbe3a14fe95) old=Port_Binding(up=[False], additional_chassis=[], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:00.150 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:a4:65 10.100.0.10'], port_security=['fa:16:3e:a9:a4:65 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1851630912', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd716674a-ba14-466a-956f-5bca9404174f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b1cd9c9c-949c-46cf-bb45-dc659f651fc3', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1851630912', 'neutron:project_id': '3e4b394501d24dc7954ec5d2f27b8081', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'dc0a6e12-205a-4d7d-adb2-6545f08f7990', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d220f056-a923-484b-9df7-f648b3edde7c, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=d4b2e0ba-de4a-4cfb-af66-1ed3abdde376) old=Port_Binding(up=[False], additional_chassis=[], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:00.152 158130 INFO neutron.agent.ovn.metadata.agent [-] Port a48cbb27-d55f-41c4-a09f-9bbe3a14fe95 in datapath 8c8d124f-f6e2-454a-9f65-e2e41a655306 bound to our chassis#033[00m Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:00.155 158130 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8c8d124f-f6e2-454a-9f65-e2e41a655306#033[00m Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:00.164 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[b7fbf540-f971-429b-a007-8434119b5ce6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:00.165 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8c8d124f-f1 in ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:00.167 158233 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8c8d124f-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:00.167 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[95e6f82d-11d7-4b04-b417-e7d6545066b8]: (4, False) 
_call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:00.169 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[25d37f20-b9e3-49c8-8ca3-9e6f00818a0b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:00.186 158264 DEBUG oslo.privsep.daemon [-] privsep: reply[1a050001-1d4b-463e-ad5d-6421cfe2a851]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:00 localhost neutron_sriov_agent[254147]: 2025-11-28 10:01:00.187 2 WARNING neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 req-6fcf345d-e21d-4204-a5a8-b222eb18b3be 9e5033e84dec44f4956046cabe7e22af e2c76e4d27554fd5a4f85cce208b136f - - default default] This port is not SRIOV, skip binding for port d4b2e0ba-de4a-4cfb-af66-1ed3abdde376.#033[00m Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:00.210 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[50146e27-96be-4cf7-a5a9-4553e242f39c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:00.236 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[a6b0ab48-8666-4124-9517-36ca593cd851]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:00 localhost NetworkManager[5967]: [1764324060.2450] manager: (tap8c8d124f-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/22) Nov 28 05:01:00 localhost systemd-udevd[308639]: Network interface NamePolicy= disabled on kernel command line. 
Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:00.243 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[810a196f-3d57-4443-ab85-11aff5add4ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:00 localhost nova_compute[279673]: 2025-11-28 10:01:00.265 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:00.293 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[c443fc6a-746a-40db-b2ba-20c2afd8a454]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:00.297 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[75b32d95-e12f-4c44-93a7-df967fe390bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:00 localhost nova_compute[279673]: 2025-11-28 10:01:00.299 279685 INFO nova.compute.manager [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Post operation of migration started#033[00m Nov 28 05:01:00 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap8c8d124f-f1: link becomes ready Nov 28 05:01:00 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap8c8d124f-f0: link becomes ready Nov 28 05:01:00 localhost NetworkManager[5967]: [1764324060.3234] device (tap8c8d124f-f0): carrier: link connected Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:00.329 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[7118e377-1d7f-4893-97f9-9207068bec74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:00.346 158233 DEBUG 
oslo.privsep.daemon [-] privsep: reply[6357979f-e600-4d4a-a3e0-336f6249ff9c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8c8d124f-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:ad:c1:c7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], 
['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1189442, 'reachable_time': 41884, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 
'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 308714, 'error': None, 'target': 'ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:00.362 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[7927ae6f-8ebd-4623-aef9-3cf0dc0a529c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fead:c1c7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1189442, 'tstamp': 1189442}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 308715, 'error': None, 'target': 'ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:00.376 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[ac32bbb1-f6ad-4f8f-90d8-3824fc92a6c7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8c8d124f-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], 
['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:ad:c1:c7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 
'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1189442, 'reachable_time': 41884, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 
'flags': 0, 'sequence_number': 255, 'pid': 308716, 'error': None, 'target': 'ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:00.413 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[7ec6bc8e-a07e-4bc9-9288-3867fb00f59b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:00.459 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[a460732b-b19c-4686-ae1e-a7be73cfe362]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:00.460 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8c8d124f-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:00.461 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:00.462 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8c8d124f-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:01:00 localhost nova_compute[279673]: 2025-11-28 10:01:00.464 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:00 localhost kernel: device tap8c8d124f-f0 entered 
promiscuous mode Nov 28 05:01:00 localhost nova_compute[279673]: 2025-11-28 10:01:00.466 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:00.476 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8c8d124f-f0, col_values=(('external_ids', {'iface-id': '7166677b-51c2-44e8-9170-e59542d1e9db'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:01:00 localhost nova_compute[279673]: 2025-11-28 10:01:00.477 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:00 localhost ovn_controller[152322]: 2025-11-28T10:01:00Z|00104|binding|INFO|Releasing lport 7166677b-51c2-44e8-9170-e59542d1e9db from this chassis (sb_readonly=0) Nov 28 05:01:00 localhost nova_compute[279673]: 2025-11-28 10:01:00.479 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:00.484 158130 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8c8d124f-f6e2-454a-9f65-e2e41a655306.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8c8d124f-f6e2-454a-9f65-e2e41a655306.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m Nov 28 05:01:00 localhost nova_compute[279673]: 2025-11-28 10:01:00.485 279685 DEBUG oslo_concurrency.lockutils [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Acquiring lock "refresh_cache-d716674a-ba14-466a-956f-5bca9404174f" lock 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 05:01:00 localhost nova_compute[279673]: 2025-11-28 10:01:00.485 279685 DEBUG oslo_concurrency.lockutils [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Acquired lock "refresh_cache-d716674a-ba14-466a-956f-5bca9404174f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 05:01:00 localhost nova_compute[279673]: 2025-11-28 10:01:00.486 279685 DEBUG nova.network.neutron [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:00.485 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[ed53a37c-29d4-43a1-9bc6-7c407dc1a7d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:00.487 158130 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: global Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: log /dev/log local0 debug Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: log-tag haproxy-metadata-proxy-8c8d124f-f6e2-454a-9f65-e2e41a655306 Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: user root Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: group root Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: maxconn 1024 Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: pidfile /var/lib/neutron/external/pids/8c8d124f-f6e2-454a-9f65-e2e41a655306.pid.haproxy Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: daemon Nov 28 05:01:00 localhost 
ovn_metadata_agent[158125]: Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: defaults Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: log global Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: mode http Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: option httplog Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: option dontlognull Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: option http-server-close Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: option forwardfor Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: retries 3 Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: timeout http-request 30s Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: timeout connect 30s Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: timeout client 32s Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: timeout server 32s Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: timeout http-keep-alive 30s Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: listen listener Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: bind 169.254.169.254:80 Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: server metadata /var/lib/neutron/metadata_proxy Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: http-request add-header X-OVN-Network-ID 8c8d124f-f6e2-454a-9f65-e2e41a655306 Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m Nov 28 05:01:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:00.488 158130 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306', 'env', 'PROCESS_TAG=haproxy-8c8d124f-f6e2-454a-9f65-e2e41a655306', 'haproxy', '-f', 
'/var/lib/neutron/ovn-metadata-proxy/8c8d124f-f6e2-454a-9f65-e2e41a655306.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m Nov 28 05:01:00 localhost nova_compute[279673]: 2025-11-28 10:01:00.495 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:00 localhost podman[308748]: Nov 28 05:01:00 localhost podman[308748]: 2025-11-28 10:01:00.973048137 +0000 UTC m=+0.112264654 container create 0e55e3481824e1629b4bd8944698ed545d26c1a7bbfa728630d0b11802bd56f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Nov 28 05:01:01 localhost systemd[1]: Started libpod-conmon-0e55e3481824e1629b4bd8944698ed545d26c1a7bbfa728630d0b11802bd56f1.scope. Nov 28 05:01:01 localhost podman[308748]: 2025-11-28 10:01:00.925579861 +0000 UTC m=+0.064796418 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Nov 28 05:01:01 localhost systemd[1]: Started libcrun container. 
Nov 28 05:01:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/696acd32f1fb59bc8c1e81a6c4eb9b6106d56a44647c38d6c3a5de7e8de7e41a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:01:01 localhost podman[308748]: 2025-11-28 10:01:01.060152768 +0000 UTC m=+0.199369255 container init 0e55e3481824e1629b4bd8944698ed545d26c1a7bbfa728630d0b11802bd56f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Nov 28 05:01:01 localhost podman[308748]: 2025-11-28 10:01:01.070618578 +0000 UTC m=+0.209835045 container start 0e55e3481824e1629b4bd8944698ed545d26c1a7bbfa728630d0b11802bd56f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125) Nov 28 05:01:01 localhost neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306[308762]: [NOTICE] (308766) : New worker (308768) forked Nov 28 05:01:01 localhost neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306[308762]: [NOTICE] (308766) : Loading success. 
Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:01.113 158130 INFO neutron.agent.ovn.metadata.agent [-] Port d4b2e0ba-de4a-4cfb-af66-1ed3abdde376 in datapath b1cd9c9c-949c-46cf-bb45-dc659f651fc3 unbound from our chassis#033[00m Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:01.117 158130 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b1cd9c9c-949c-46cf-bb45-dc659f651fc3#033[00m Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:01.126 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[31c61d94-2447-49ac-9bcb-1b8fbe584432]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:01.127 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb1cd9c9c-91 in ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:01.129 158233 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb1cd9c9c-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:01.129 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[5e55d1ad-0f7f-4d83-a3d5-076393f0f379]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:01.131 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[090d44b8-ff67-4f6c-8a65-29f329b328a4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:01.141 158264 DEBUG oslo.privsep.daemon [-] privsep: 
reply[84547a6a-6e54-4fc9-9c08-98165faa1134]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:01.151 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[bd10e679-473a-43ea-a93d-31640388e14d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:01 localhost nova_compute[279673]: 2025-11-28 10:01:01.177 279685 DEBUG nova.network.neutron [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Updating instance_info_cache with network_info: [{"id": "d4b2e0ba-de4a-4cfb-af66-1ed3abdde376", "address": "fa:16:3e:a9:a4:65", "network": {"id": "b1cd9c9c-949c-46cf-bb45-dc659f651fc3", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-91322750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "3e4b394501d24dc7954ec5d2f27b8081", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4b2e0ba-de", "ovs_interfaceid": "d4b2e0ba-de4a-4cfb-af66-1ed3abdde376", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 
28 05:01:01 localhost systemd[1]: tmp-crun.epAAtR.mount: Deactivated successfully. Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:01.189 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[7c1ed1f1-6218-4ef0-b835-31817f138dae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:01 localhost NetworkManager[5967]: [1764324061.2004] manager: (tapb1cd9c9c-90): new Veth device (/org/freedesktop/NetworkManager/Devices/23) Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:01.198 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[3c8500c4-ea4b-4a44-a6d7-d24d741867a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:01 localhost systemd-udevd[308700]: Network interface NamePolicy= disabled on kernel command line. Nov 28 05:01:01 localhost nova_compute[279673]: 2025-11-28 10:01:01.206 279685 DEBUG oslo_concurrency.lockutils [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Releasing lock "refresh_cache-d716674a-ba14-466a-956f-5bca9404174f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 05:01:01 localhost nova_compute[279673]: 2025-11-28 10:01:01.233 279685 DEBUG oslo_concurrency.lockutils [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:01:01 localhost nova_compute[279673]: 2025-11-28 10:01:01.234 279685 DEBUG oslo_concurrency.lockutils [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:01:01 localhost nova_compute[279673]: 2025-11-28 10:01:01.235 279685 DEBUG oslo_concurrency.lockutils [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:01.237 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[3fce0ee6-0961-45cd-b0c0-3154f46d79f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:01 localhost nova_compute[279673]: 2025-11-28 10:01:01.245 279685 INFO nova.virt.libvirt.driver [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Sending announce-self command to QEMU monitor. 
Attempt 1 of 3#033[00m Nov 28 05:01:01 localhost journal[201490]: Domain id=3 name='instance-00000006' uuid=d716674a-ba14-466a-956f-5bca9404174f is tainted: custom-monitor Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:01.246 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[1f84b275-18da-43ef-a612-b315a08bb44f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:01 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapb1cd9c9c-91: link becomes ready Nov 28 05:01:01 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapb1cd9c9c-90: link becomes ready Nov 28 05:01:01 localhost NetworkManager[5967]: [1764324061.2763] device (tapb1cd9c9c-90): carrier: link connected Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:01.282 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[cd157c62-d8b3-4683-9cfe-047db9413887]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:01.299 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[f8fc4178-7f78-4f44-a638-92974003268f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb1cd9c9c-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:10:2e:50'], 
['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1189537, 'reachable_time': 27294, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 
'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 308787, 'error': None, 'target': 'ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:01.328 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[2d08e6d1-1749-49c7-b909-e23864444300]: (4, 
({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe10:2e50'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1189537, 'tstamp': 1189537}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 308788, 'error': None, 'target': 'ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:01.347 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[0ae6a962-e4ae-4c36-b12d-339fe09d79c5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb1cd9c9c-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:10:2e:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1189537, 'reachable_time': 27294, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 
'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 308789, 'error': None, 'target': 'ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:01.377 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[10013937-f8ef-4344-b842-5de96a45f25e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:01.443 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[a0e2fe69-92cb-4e5b-90da-240b931dce83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:01.445 158130 DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1cd9c9c-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:01.446 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:01.447 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb1cd9c9c-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:01:01 localhost kernel: device tapb1cd9c9c-90 entered promiscuous mode Nov 28 05:01:01 localhost nova_compute[279673]: 2025-11-28 10:01:01.451 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:01.459 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb1cd9c9c-90, col_values=(('external_ids', {'iface-id': 'bbc6954a-495e-4b48-9eb4-0e6cdcdb602f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:01:01 localhost ovn_controller[152322]: 2025-11-28T10:01:01Z|00105|binding|INFO|Releasing lport bbc6954a-495e-4b48-9eb4-0e6cdcdb602f from this chassis (sb_readonly=0) Nov 28 05:01:01 localhost nova_compute[279673]: 2025-11-28 10:01:01.461 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:01 localhost nova_compute[279673]: 2025-11-28 
10:01:01.463 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:01 localhost nova_compute[279673]: 2025-11-28 10:01:01.468 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:01.467 158130 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b1cd9c9c-949c-46cf-bb45-dc659f651fc3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b1cd9c9c-949c-46cf-bb45-dc659f651fc3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:01.470 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[c98d1de9-6e4e-4636-a48c-84442d25feae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:01.471 158130 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: global Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: log /dev/log local0 debug Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: log-tag haproxy-metadata-proxy-b1cd9c9c-949c-46cf-bb45-dc659f651fc3 Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: user root Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: group root Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: maxconn 1024 Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: pidfile /var/lib/neutron/external/pids/b1cd9c9c-949c-46cf-bb45-dc659f651fc3.pid.haproxy Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: daemon Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: defaults Nov 28 05:01:01 
localhost ovn_metadata_agent[158125]: log global Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: mode http Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: option httplog Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: option dontlognull Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: option http-server-close Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: option forwardfor Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: retries 3 Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: timeout http-request 30s Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: timeout connect 30s Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: timeout client 32s Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: timeout server 32s Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: timeout http-keep-alive 30s Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: listen listener Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: bind 169.254.169.254:80 Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: server metadata /var/lib/neutron/metadata_proxy Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: http-request add-header X-OVN-Network-ID b1cd9c9c-949c-46cf-bb45-dc659f651fc3 Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m Nov 28 05:01:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:01.474 158130 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3', 'env', 'PROCESS_TAG=haproxy-b1cd9c9c-949c-46cf-bb45-dc659f651fc3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b1cd9c9c-949c-46cf-bb45-dc659f651fc3.conf'] create_process 
/usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m Nov 28 05:01:01 localhost podman[308832]: Nov 28 05:01:01 localhost podman[308832]: 2025-11-28 10:01:01.911837945 +0000 UTC m=+0.082748219 container create 3cb421a733fd9877922dfa93a01b5c5bb1b4072d5a49acadbf260c98769c1b45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 28 05:01:01 localhost podman[308832]: 2025-11-28 10:01:01.867718412 +0000 UTC m=+0.038628716 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Nov 28 05:01:01 localhost systemd[1]: Started libpod-conmon-3cb421a733fd9877922dfa93a01b5c5bb1b4072d5a49acadbf260c98769c1b45.scope. Nov 28 05:01:02 localhost systemd[1]: Started libcrun container. 
Nov 28 05:01:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1af4d90b8638c3920ed542c915db741d94ff67735a1d7fb294f7e55b2f0fb4c9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:01:02 localhost podman[308832]: 2025-11-28 10:01:02.02679528 +0000 UTC m=+0.197705534 container init 3cb421a733fd9877922dfa93a01b5c5bb1b4072d5a49acadbf260c98769c1b45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Nov 28 05:01:02 localhost podman[308832]: 2025-11-28 10:01:02.037088585 +0000 UTC m=+0.207998849 container start 3cb421a733fd9877922dfa93a01b5c5bb1b4072d5a49acadbf260c98769c1b45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:01:02 localhost neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3[308846]: [NOTICE] (308850) : New worker (308852) forked Nov 28 05:01:02 localhost neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3[308846]: [NOTICE] (308850) : Loading success. Nov 28 05:01:02 localhost systemd[1]: tmp-crun.op1aZ0.mount: Deactivated successfully. 
Nov 28 05:01:02 localhost nova_compute[279673]: 2025-11-28 10:01:02.256 279685 INFO nova.virt.libvirt.driver [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m Nov 28 05:01:02 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0. Nov 28 05:01:03 localhost nova_compute[279673]: 2025-11-28 10:01:03.261 279685 INFO nova.virt.libvirt.driver [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m Nov 28 05:01:03 localhost nova_compute[279673]: 2025-11-28 10:01:03.268 279685 DEBUG nova.compute.manager [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 28 05:01:03 localhost nova_compute[279673]: 2025-11-28 10:01:03.294 279685 DEBUG nova.objects.instance [None req-dac25828-ac2e-4370-bb38-ed6feb9f56e4 b663adc3c1ab4012828d493e2b3ebb79 2159235bf1c5407eac7a3e3826561913 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m Nov 28 05:01:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. 
Nov 28 05:01:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 05:01:03 localhost podman[308861]: 2025-11-28 10:01:03.866160914 +0000 UTC m=+0.102296707 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 28 05:01:03 localhost podman[308862]: 2025-11-28 10:01:03.911571578 +0000 UTC m=+0.143653437 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=multipathd, container_name=multipathd) Nov 28 05:01:03 localhost podman[308862]: 2025-11-28 10:01:03.921715808 +0000 UTC m=+0.153797647 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd) Nov 28 05:01:03 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 05:01:03 localhost podman[308861]: 2025-11-28 10:01:03.975854358 +0000 UTC m=+0.211990151 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 05:01:03 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 05:01:04 localhost ovn_controller[152322]: 2025-11-28T10:01:04Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a9:a4:65 10.100.0.10 Nov 28 05:01:04 localhost ovn_controller[152322]: 2025-11-28T10:01:04Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a9:a4:65 10.100.0.10 Nov 28 05:01:04 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:01:04 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3322246960' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:01:04 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e96 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:01:04 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e96 do_prune osdmap full prune enabled Nov 28 05:01:04 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e97 e97: 6 total, 6 up, 6 in Nov 28 05:01:04 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e97: 6 total, 6 up, 6 in Nov 28 05:01:04 localhost nova_compute[279673]: 2025-11-28 10:01:04.884 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:04 localhost nova_compute[279673]: 2025-11-28 10:01:04.905 279685 DEBUG oslo_concurrency.lockutils [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Acquiring lock "d716674a-ba14-466a-956f-5bca9404174f" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:01:04 localhost nova_compute[279673]: 2025-11-28 10:01:04.905 279685 DEBUG oslo_concurrency.lockutils [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 
3e4b394501d24dc7954ec5d2f27b8081 - - default default] Lock "d716674a-ba14-466a-956f-5bca9404174f" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:01:04 localhost nova_compute[279673]: 2025-11-28 10:01:04.906 279685 DEBUG oslo_concurrency.lockutils [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Acquiring lock "d716674a-ba14-466a-956f-5bca9404174f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:01:04 localhost nova_compute[279673]: 2025-11-28 10:01:04.906 279685 DEBUG oslo_concurrency.lockutils [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Lock "d716674a-ba14-466a-956f-5bca9404174f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:01:04 localhost nova_compute[279673]: 2025-11-28 10:01:04.906 279685 DEBUG oslo_concurrency.lockutils [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Lock "d716674a-ba14-466a-956f-5bca9404174f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:01:04 localhost nova_compute[279673]: 2025-11-28 10:01:04.907 279685 INFO nova.compute.manager [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] [instance: 
d716674a-ba14-466a-956f-5bca9404174f] Terminating instance#033[00m Nov 28 05:01:04 localhost nova_compute[279673]: 2025-11-28 10:01:04.908 279685 DEBUG nova.compute.manager [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m Nov 28 05:01:04 localhost nova_compute[279673]: 2025-11-28 10:01:04.949 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:04 localhost kernel: device tapd4b2e0ba-de left promiscuous mode Nov 28 05:01:04 localhost NetworkManager[5967]: [1764324064.9731] device (tapd4b2e0ba-de): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed') Nov 28 05:01:04 localhost ovn_controller[152322]: 2025-11-28T10:01:04Z|00106|binding|INFO|Releasing lport d4b2e0ba-de4a-4cfb-af66-1ed3abdde376 from this chassis (sb_readonly=0) Nov 28 05:01:04 localhost ovn_controller[152322]: 2025-11-28T10:01:04Z|00107|binding|INFO|Setting lport d4b2e0ba-de4a-4cfb-af66-1ed3abdde376 down in Southbound Nov 28 05:01:04 localhost ovn_controller[152322]: 2025-11-28T10:01:04Z|00108|binding|INFO|Releasing lport a48cbb27-d55f-41c4-a09f-9bbe3a14fe95 from this chassis (sb_readonly=0) Nov 28 05:01:04 localhost ovn_controller[152322]: 2025-11-28T10:01:04Z|00109|binding|INFO|Setting lport a48cbb27-d55f-41c4-a09f-9bbe3a14fe95 down in Southbound Nov 28 05:01:04 localhost ovn_controller[152322]: 2025-11-28T10:01:04Z|00110|binding|INFO|Removing iface tapd4b2e0ba-de ovn-installed in OVS Nov 28 05:01:04 localhost nova_compute[279673]: 2025-11-28 10:01:04.979 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 
05:01:05 localhost ovn_controller[152322]: 2025-11-28T10:01:04Z|00111|binding|INFO|Releasing lport bbc6954a-495e-4b48-9eb4-0e6cdcdb602f from this chassis (sb_readonly=0) Nov 28 05:01:05 localhost ovn_controller[152322]: 2025-11-28T10:01:04Z|00112|binding|INFO|Releasing lport 7166677b-51c2-44e8-9170-e59542d1e9db from this chassis (sb_readonly=0) Nov 28 05:01:05 localhost ovn_controller[152322]: 2025-11-28T10:01:04Z|00113|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:01:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:04.997 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:62:32 19.80.0.101'], port_security=['fa:16:3e:78:62:32 19.80.0.101'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['d4b2e0ba-de4a-4cfb-af66-1ed3abdde376'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-27110142', 'neutron:cidrs': '19.80.0.101/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8c8d124f-f6e2-454a-9f65-e2e41a655306', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-27110142', 'neutron:project_id': '3e4b394501d24dc7954ec5d2f27b8081', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'dc0a6e12-205a-4d7d-adb2-6545f08f7990', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=c5f09637-840e-43f3-af58-d197b914a787, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=a48cbb27-d55f-41c4-a09f-9bbe3a14fe95) old=Port_Binding(up=[True], chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:01:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:04.998 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:a4:65 10.100.0.10'], port_security=['fa:16:3e:a9:a4:65 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1851630912', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd716674a-ba14-466a-956f-5bca9404174f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b1cd9c9c-949c-46cf-bb45-dc659f651fc3', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1851630912', 'neutron:project_id': '3e4b394501d24dc7954ec5d2f27b8081', 'neutron:revision_number': '13', 'neutron:security_group_ids': 'dc0a6e12-205a-4d7d-adb2-6545f08f7990', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d220f056-a923-484b-9df7-f648b3edde7c, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=d4b2e0ba-de4a-4cfb-af66-1ed3abdde376) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:01:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:04.999 158130 INFO neutron.agent.ovn.metadata.agent [-] Port a48cbb27-d55f-41c4-a09f-9bbe3a14fe95 in datapath 8c8d124f-f6e2-454a-9f65-e2e41a655306 unbound from our chassis#033[00m Nov 28 05:01:05 localhost ovn_metadata_agent[158125]: 2025-11-28 
10:01:05.001 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8c8d124f-f6e2-454a-9f65-e2e41a655306, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:01:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:05.001 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[55741819-ec56-4af7-9cf0-ab9db6d0390a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:05.002 158130 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306 namespace which is not needed anymore#033[00m Nov 28 05:01:05 localhost nova_compute[279673]: 2025-11-28 10:01:05.004 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:05 localhost nova_compute[279673]: 2025-11-28 10:01:05.013 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:05 localhost systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000006.scope: Deactivated successfully. Nov 28 05:01:05 localhost systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000006.scope: Consumed 4.409s CPU time. Nov 28 05:01:05 localhost systemd-machined[83422]: Machine qemu-3-instance-00000006 terminated. 
Nov 28 05:01:05 localhost nova_compute[279673]: 2025-11-28 10:01:05.147 279685 INFO nova.virt.libvirt.driver [-] [instance: d716674a-ba14-466a-956f-5bca9404174f] Instance destroyed successfully.#033[00m Nov 28 05:01:05 localhost nova_compute[279673]: 2025-11-28 10:01:05.149 279685 DEBUG nova.objects.instance [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Lazy-loading 'resources' on Instance uuid d716674a-ba14-466a-956f-5bca9404174f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 05:01:05 localhost nova_compute[279673]: 2025-11-28 10:01:05.165 279685 DEBUG nova.virt.libvirt.vif [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-28T10:00:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1337177779',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(5),hidden=False,host='np0005538513.localdomain',hostname='tempest-liveautoblockmigrationv225test-server-1337177779',id=6,image_ref='85968a96-5a0e-43a4-9c04-3954f640a7ed',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-11-28T10:00:49Z,launched_on='np0005538514.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='np0005538513.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='3e4b394501d24dc7954ec5d2f27b8081',ramdisk_id='',reservati
on_id='r-mfjalp0i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='85968a96-5a0e-43a4-9c04-3954f640a7ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1153414438',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1153414438-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2025-11-28T10:01:03Z,user_data=None,user_id='c64867c2bac34a819c0995d0b72ee9a7',uuid=d716674a-ba14-466a-956f-5bca9404174f,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d4b2e0ba-de4a-4cfb-af66-1ed3abdde376", "address": "fa:16:3e:a9:a4:65", "network": {"id": "b1cd9c9c-949c-46cf-bb45-dc659f651fc3", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-91322750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "3e4b394501d24dc7954ec5d2f27b8081", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4b2e0ba-de", "ovs_interfaceid": "d4b2e0ba-de4a-4cfb-af66-1ed3abdde376", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m Nov 28 05:01:05 localhost nova_compute[279673]: 2025-11-28 10:01:05.165 279685 DEBUG nova.network.os_vif_util [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Converting VIF {"id": "d4b2e0ba-de4a-4cfb-af66-1ed3abdde376", "address": "fa:16:3e:a9:a4:65", "network": {"id": "b1cd9c9c-949c-46cf-bb45-dc659f651fc3", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-91322750-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "3e4b394501d24dc7954ec5d2f27b8081", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4b2e0ba-de", "ovs_interfaceid": "d4b2e0ba-de4a-4cfb-af66-1ed3abdde376", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Nov 28 05:01:05 localhost nova_compute[279673]: 2025-11-28 10:01:05.166 279685 DEBUG nova.network.os_vif_util [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Converted object 
VIFOpenVSwitch(active=True,address=fa:16:3e:a9:a4:65,bridge_name='br-int',has_traffic_filtering=True,id=d4b2e0ba-de4a-4cfb-af66-1ed3abdde376,network=Network(b1cd9c9c-949c-46cf-bb45-dc659f651fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd4b2e0ba-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Nov 28 05:01:05 localhost nova_compute[279673]: 2025-11-28 10:01:05.167 279685 DEBUG os_vif [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a9:a4:65,bridge_name='br-int',has_traffic_filtering=True,id=d4b2e0ba-de4a-4cfb-af66-1ed3abdde376,network=Network(b1cd9c9c-949c-46cf-bb45-dc659f651fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd4b2e0ba-de') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m Nov 28 05:01:05 localhost nova_compute[279673]: 2025-11-28 10:01:05.170 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:05 localhost nova_compute[279673]: 2025-11-28 10:01:05.170 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4b2e0ba-de, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:01:05 localhost nova_compute[279673]: 2025-11-28 10:01:05.172 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:05 localhost nova_compute[279673]: 2025-11-28 10:01:05.175 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:01:05 localhost 
neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306[308762]: [NOTICE] (308766) : haproxy version is 2.8.14-c23fe91 Nov 28 05:01:05 localhost neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306[308762]: [NOTICE] (308766) : path to executable is /usr/sbin/haproxy Nov 28 05:01:05 localhost neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306[308762]: [WARNING] (308766) : Exiting Master process... Nov 28 05:01:05 localhost neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306[308762]: [WARNING] (308766) : Exiting Master process... Nov 28 05:01:05 localhost nova_compute[279673]: 2025-11-28 10:01:05.182 279685 INFO os_vif [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a9:a4:65,bridge_name='br-int',has_traffic_filtering=True,id=d4b2e0ba-de4a-4cfb-af66-1ed3abdde376,network=Network(b1cd9c9c-949c-46cf-bb45-dc659f651fc3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd4b2e0ba-de')#033[00m Nov 28 05:01:05 localhost neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306[308762]: [ALERT] (308766) : Current worker (308768) exited with code 143 (Terminated) Nov 28 05:01:05 localhost neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306[308762]: [WARNING] (308766) : All workers exited. Exiting... (0) Nov 28 05:01:05 localhost systemd[1]: libpod-0e55e3481824e1629b4bd8944698ed545d26c1a7bbfa728630d0b11802bd56f1.scope: Deactivated successfully. 
Nov 28 05:01:05 localhost podman[308930]: 2025-11-28 10:01:05.194914862 +0000 UTC m=+0.093758727 container died 0e55e3481824e1629b4bd8944698ed545d26c1a7bbfa728630d0b11802bd56f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 28 05:01:05 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0e55e3481824e1629b4bd8944698ed545d26c1a7bbfa728630d0b11802bd56f1-userdata-shm.mount: Deactivated successfully. Nov 28 05:01:05 localhost podman[308930]: 2025-11-28 10:01:05.230323597 +0000 UTC m=+0.129167432 container cleanup 0e55e3481824e1629b4bd8944698ed545d26c1a7bbfa728630d0b11802bd56f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 28 05:01:05 localhost podman[308962]: 2025-11-28 10:01:05.261480432 +0000 UTC m=+0.060851246 container cleanup 0e55e3481824e1629b4bd8944698ed545d26c1a7bbfa728630d0b11802bd56f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Nov 28 05:01:05 localhost systemd[1]: libpod-conmon-0e55e3481824e1629b4bd8944698ed545d26c1a7bbfa728630d0b11802bd56f1.scope: Deactivated successfully. Nov 28 05:01:05 localhost podman[308983]: 2025-11-28 10:01:05.304968526 +0000 UTC m=+0.056488193 container remove 0e55e3481824e1629b4bd8944698ed545d26c1a7bbfa728630d0b11802bd56f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Nov 28 05:01:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:05.310 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[8c5b2f0a-e3f0-4a97-8223-9320447b312f]: (4, ('Fri Nov 28 10:01:05 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306 (0e55e3481824e1629b4bd8944698ed545d26c1a7bbfa728630d0b11802bd56f1)\n0e55e3481824e1629b4bd8944698ed545d26c1a7bbfa728630d0b11802bd56f1\nFri Nov 28 10:01:05 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306 (0e55e3481824e1629b4bd8944698ed545d26c1a7bbfa728630d0b11802bd56f1)\n0e55e3481824e1629b4bd8944698ed545d26c1a7bbfa728630d0b11802bd56f1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:05.312 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[9fd55339-c470-485e-a964-13718b0b468a]: (4, None) 
_call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:05.313 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8c8d124f-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:01:05 localhost nova_compute[279673]: 2025-11-28 10:01:05.316 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:05 localhost kernel: device tap8c8d124f-f0 left promiscuous mode Nov 28 05:01:05 localhost nova_compute[279673]: 2025-11-28 10:01:05.324 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:05.327 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[cc8db5d5-27d1-47f9-982e-43704ec69e7a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:05.346 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[4117bfaa-2041-4563-a70a-a743a9af7e93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:05.347 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[da793837-3d6e-4547-a2ff-6897c4e37db8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:05.356 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[431035ce-eb41-4561-971a-c58779c85699]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 
1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 
'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1189433, 'reachable_time': 29335, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 
'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 309006, 'error': None, 'target': 'ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:05.358 158264 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8c8d124f-f6e2-454a-9f65-e2e41a655306 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m Nov 28 05:01:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:05.359 158264 DEBUG oslo.privsep.daemon [-] privsep: reply[c5102584-41b5-4160-9fc5-4364d2d35b57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:05.359 158130 INFO neutron.agent.ovn.metadata.agent [-] Port d4b2e0ba-de4a-4cfb-af66-1ed3abdde376 in datapath b1cd9c9c-949c-46cf-bb45-dc659f651fc3 unbound from our chassis#033[00m Nov 28 05:01:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:05.361 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b1cd9c9c-949c-46cf-bb45-dc659f651fc3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:01:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:05.362 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[bfbf1ddd-6397-4e13-8954-934cea4eb50a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:05.363 158130 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up 
ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3 namespace which is not needed anymore#033[00m Nov 28 05:01:05 localhost neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3[308846]: [NOTICE] (308850) : haproxy version is 2.8.14-c23fe91 Nov 28 05:01:05 localhost neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3[308846]: [NOTICE] (308850) : path to executable is /usr/sbin/haproxy Nov 28 05:01:05 localhost neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3[308846]: [WARNING] (308850) : Exiting Master process... Nov 28 05:01:05 localhost neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3[308846]: [WARNING] (308850) : Exiting Master process... Nov 28 05:01:05 localhost neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3[308846]: [ALERT] (308850) : Current worker (308852) exited with code 143 (Terminated) Nov 28 05:01:05 localhost neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3[308846]: [WARNING] (308850) : All workers exited. Exiting... (0) Nov 28 05:01:05 localhost systemd[1]: libpod-3cb421a733fd9877922dfa93a01b5c5bb1b4072d5a49acadbf260c98769c1b45.scope: Deactivated successfully. 
Nov 28 05:01:05 localhost podman[309024]: 2025-11-28 10:01:05.532855284 +0000 UTC m=+0.079622732 container died 3cb421a733fd9877922dfa93a01b5c5bb1b4072d5a49acadbf260c98769c1b45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) Nov 28 05:01:05 localhost podman[309024]: 2025-11-28 10:01:05.571558211 +0000 UTC m=+0.118325579 container cleanup 3cb421a733fd9877922dfa93a01b5c5bb1b4072d5a49acadbf260c98769c1b45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Nov 28 05:01:05 localhost podman[309036]: 2025-11-28 10:01:05.606789922 +0000 UTC m=+0.070644648 container cleanup 3cb421a733fd9877922dfa93a01b5c5bb1b4072d5a49acadbf260c98769c1b45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0) Nov 28 05:01:05 localhost systemd[1]: libpod-conmon-3cb421a733fd9877922dfa93a01b5c5bb1b4072d5a49acadbf260c98769c1b45.scope: Deactivated successfully. Nov 28 05:01:05 localhost podman[309051]: 2025-11-28 10:01:05.650227774 +0000 UTC m=+0.063803198 container remove 3cb421a733fd9877922dfa93a01b5c5bb1b4072d5a49acadbf260c98769c1b45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 28 05:01:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:05.654 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[6900e654-282a-434a-8331-e31e9ce465cd]: (4, ('Fri Nov 28 10:01:05 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3 (3cb421a733fd9877922dfa93a01b5c5bb1b4072d5a49acadbf260c98769c1b45)\n3cb421a733fd9877922dfa93a01b5c5bb1b4072d5a49acadbf260c98769c1b45\nFri Nov 28 10:01:05 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3 (3cb421a733fd9877922dfa93a01b5c5bb1b4072d5a49acadbf260c98769c1b45)\n3cb421a733fd9877922dfa93a01b5c5bb1b4072d5a49acadbf260c98769c1b45\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:05.656 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[45df0646-e119-4187-81a9-af756ff16ba6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:05.657 158130 DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1cd9c9c-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:01:05 localhost nova_compute[279673]: 2025-11-28 10:01:05.659 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:05 localhost kernel: device tapb1cd9c9c-90 left promiscuous mode Nov 28 05:01:05 localhost nova_compute[279673]: 2025-11-28 10:01:05.665 279685 DEBUG nova.compute.manager [req-ce984fd2-c349-4d0f-8dde-a2cd19204e91 req-1ac28375-7efd-44c4-8e8d-feb411b52a18 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Received event network-vif-unplugged-d4b2e0ba-de4a-4cfb-af66-1ed3abdde376 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Nov 28 05:01:05 localhost nova_compute[279673]: 2025-11-28 10:01:05.665 279685 DEBUG oslo_concurrency.lockutils [req-ce984fd2-c349-4d0f-8dde-a2cd19204e91 req-1ac28375-7efd-44c4-8e8d-feb411b52a18 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "d716674a-ba14-466a-956f-5bca9404174f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:01:05 localhost nova_compute[279673]: 2025-11-28 10:01:05.666 279685 DEBUG oslo_concurrency.lockutils [req-ce984fd2-c349-4d0f-8dde-a2cd19204e91 req-1ac28375-7efd-44c4-8e8d-feb411b52a18 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "d716674a-ba14-466a-956f-5bca9404174f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:01:05 localhost nova_compute[279673]: 2025-11-28 10:01:05.666 279685 DEBUG oslo_concurrency.lockutils [req-ce984fd2-c349-4d0f-8dde-a2cd19204e91 req-1ac28375-7efd-44c4-8e8d-feb411b52a18 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "d716674a-ba14-466a-956f-5bca9404174f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:01:05 localhost nova_compute[279673]: 2025-11-28 10:01:05.667 279685 DEBUG nova.compute.manager [req-ce984fd2-c349-4d0f-8dde-a2cd19204e91 req-1ac28375-7efd-44c4-8e8d-feb411b52a18 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] No waiting events found dispatching network-vif-unplugged-d4b2e0ba-de4a-4cfb-af66-1ed3abdde376 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Nov 28 05:01:05 localhost nova_compute[279673]: 2025-11-28 10:01:05.668 279685 DEBUG nova.compute.manager [req-ce984fd2-c349-4d0f-8dde-a2cd19204e91 req-1ac28375-7efd-44c4-8e8d-feb411b52a18 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Received event network-vif-unplugged-d4b2e0ba-de4a-4cfb-af66-1ed3abdde376 for instance with task_state deleting. 
_process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m Nov 28 05:01:05 localhost nova_compute[279673]: 2025-11-28 10:01:05.669 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:05.670 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[2fd45e0d-2ba3-4747-a921-5bac20cb4ebc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:05.684 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[6040aab6-17d1-438e-96d2-a3290b821780]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:05.685 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[b7c99ea8-6d17-4d33-b181-1e5c5d66db6a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:05.699 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[b24c2e00-46e3-4405-93f3-1cddacaff20f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 
'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1189527, 'reachable_time': 31604, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 
'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 309072, 'error': None, 'target': 'ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:05.701 158264 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b1cd9c9c-949c-46cf-bb45-dc659f651fc3 deleted. 
remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m Nov 28 05:01:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:05.701 158264 DEBUG oslo.privsep.daemon [-] privsep: reply[b8745d41-3d10-49f7-8034-1bd9728047a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:06 localhost systemd[1]: var-lib-containers-storage-overlay-1af4d90b8638c3920ed542c915db741d94ff67735a1d7fb294f7e55b2f0fb4c9-merged.mount: Deactivated successfully. Nov 28 05:01:06 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3cb421a733fd9877922dfa93a01b5c5bb1b4072d5a49acadbf260c98769c1b45-userdata-shm.mount: Deactivated successfully. Nov 28 05:01:06 localhost systemd[1]: run-netns-ovnmeta\x2db1cd9c9c\x2d949c\x2d46cf\x2dbb45\x2ddc659f651fc3.mount: Deactivated successfully. Nov 28 05:01:06 localhost systemd[1]: var-lib-containers-storage-overlay-696acd32f1fb59bc8c1e81a6c4eb9b6106d56a44647c38d6c3a5de7e8de7e41a-merged.mount: Deactivated successfully. Nov 28 05:01:06 localhost systemd[1]: run-netns-ovnmeta\x2d8c8d124f\x2df6e2\x2d454a\x2d9f65\x2de2e41a655306.mount: Deactivated successfully. 
Nov 28 05:01:06 localhost nova_compute[279673]: 2025-11-28 10:01:06.586 279685 INFO nova.virt.libvirt.driver [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Deleting instance files /var/lib/nova/instances/d716674a-ba14-466a-956f-5bca9404174f_del#033[00m Nov 28 05:01:06 localhost nova_compute[279673]: 2025-11-28 10:01:06.587 279685 INFO nova.virt.libvirt.driver [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Deletion of /var/lib/nova/instances/d716674a-ba14-466a-956f-5bca9404174f_del complete#033[00m Nov 28 05:01:06 localhost nova_compute[279673]: 2025-11-28 10:01:06.676 279685 INFO nova.compute.manager [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Took 1.77 seconds to destroy the instance on the hypervisor.#033[00m Nov 28 05:01:06 localhost nova_compute[279673]: 2025-11-28 10:01:06.677 279685 DEBUG oslo.service.loopingcall [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m Nov 28 05:01:06 localhost nova_compute[279673]: 2025-11-28 10:01:06.678 279685 DEBUG nova.compute.manager [-] [instance: d716674a-ba14-466a-956f-5bca9404174f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m Nov 28 05:01:06 localhost nova_compute[279673]: 2025-11-28 10:01:06.678 279685 DEBUG nova.network.neutron [-] [instance: d716674a-ba14-466a-956f-5bca9404174f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m Nov 28 05:01:07 localhost nova_compute[279673]: 2025-11-28 10:01:07.779 279685 DEBUG nova.compute.manager [req-f9925f73-8793-4608-8bc9-371e5e8b8132 req-4469b561-e280-4039-945d-4a164381d41b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Received event network-vif-plugged-d4b2e0ba-de4a-4cfb-af66-1ed3abdde376 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Nov 28 05:01:07 localhost nova_compute[279673]: 2025-11-28 10:01:07.779 279685 DEBUG oslo_concurrency.lockutils [req-f9925f73-8793-4608-8bc9-371e5e8b8132 req-4469b561-e280-4039-945d-4a164381d41b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "d716674a-ba14-466a-956f-5bca9404174f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:01:07 localhost nova_compute[279673]: 2025-11-28 10:01:07.780 279685 DEBUG oslo_concurrency.lockutils [req-f9925f73-8793-4608-8bc9-371e5e8b8132 req-4469b561-e280-4039-945d-4a164381d41b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "d716674a-ba14-466a-956f-5bca9404174f-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:01:07 localhost nova_compute[279673]: 2025-11-28 10:01:07.780 279685 DEBUG oslo_concurrency.lockutils [req-f9925f73-8793-4608-8bc9-371e5e8b8132 req-4469b561-e280-4039-945d-4a164381d41b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "d716674a-ba14-466a-956f-5bca9404174f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:01:07 localhost nova_compute[279673]: 2025-11-28 10:01:07.781 279685 DEBUG nova.compute.manager [req-f9925f73-8793-4608-8bc9-371e5e8b8132 req-4469b561-e280-4039-945d-4a164381d41b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] No waiting events found dispatching network-vif-plugged-d4b2e0ba-de4a-4cfb-af66-1ed3abdde376 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Nov 28 05:01:07 localhost nova_compute[279673]: 2025-11-28 10:01:07.781 279685 WARNING nova.compute.manager [req-f9925f73-8793-4608-8bc9-371e5e8b8132 req-4469b561-e280-4039-945d-4a164381d41b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: d716674a-ba14-466a-956f-5bca9404174f] Received unexpected event network-vif-plugged-d4b2e0ba-de4a-4cfb-af66-1ed3abdde376 for instance with vm_state active and task_state deleting.#033[00m Nov 28 05:01:09 localhost systemd[1]: Stopping User Manager for UID 42436... Nov 28 05:01:09 localhost systemd[308586]: Activating special unit Exit the Session... Nov 28 05:01:09 localhost systemd[308586]: Stopped target Main User Target. Nov 28 05:01:09 localhost systemd[308586]: Stopped target Basic System. 
Nov 28 05:01:09 localhost systemd[308586]: Stopped target Paths. Nov 28 05:01:09 localhost systemd[308586]: Stopped target Sockets. Nov 28 05:01:09 localhost systemd[308586]: Stopped target Timers. Nov 28 05:01:09 localhost systemd[308586]: Stopped Mark boot as successful after the user session has run 2 minutes. Nov 28 05:01:09 localhost systemd[308586]: Stopped Daily Cleanup of User's Temporary Directories. Nov 28 05:01:09 localhost systemd[308586]: Closed D-Bus User Message Bus Socket. Nov 28 05:01:09 localhost systemd[308586]: Stopped Create User's Volatile Files and Directories. Nov 28 05:01:09 localhost systemd[308586]: Removed slice User Application Slice. Nov 28 05:01:09 localhost systemd[308586]: Reached target Shutdown. Nov 28 05:01:09 localhost systemd[308586]: Finished Exit the Session. Nov 28 05:01:09 localhost systemd[308586]: Reached target Exit the Session. Nov 28 05:01:09 localhost systemd[1]: user@42436.service: Deactivated successfully. Nov 28 05:01:09 localhost systemd[1]: Stopped User Manager for UID 42436. Nov 28 05:01:09 localhost systemd[1]: Stopping User Runtime Directory /run/user/42436... Nov 28 05:01:09 localhost systemd[1]: run-user-42436.mount: Deactivated successfully. Nov 28 05:01:09 localhost systemd[1]: user-runtime-dir@42436.service: Deactivated successfully. Nov 28 05:01:09 localhost systemd[1]: Stopped User Runtime Directory /run/user/42436. Nov 28 05:01:09 localhost systemd[1]: Removed slice User Slice of UID 42436. 
Nov 28 05:01:09 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:01:09 localhost nova_compute[279673]: 2025-11-28 10:01:09.981 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:10 localhost podman[238687]: time="2025-11-28T10:01:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:01:10 localhost podman[238687]: @ - - [28/Nov/2025:10:01:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157512 "" "Go-http-client/1.1" Nov 28 05:01:10 localhost podman[238687]: @ - - [28/Nov/2025:10:01:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19730 "" "Go-http-client/1.1" Nov 28 05:01:10 localhost nova_compute[279673]: 2025-11-28 10:01:10.173 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:10 localhost nova_compute[279673]: 2025-11-28 10:01:10.194 279685 DEBUG nova.network.neutron [-] [instance: d716674a-ba14-466a-956f-5bca9404174f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 05:01:10 localhost nova_compute[279673]: 2025-11-28 10:01:10.236 279685 INFO nova.compute.manager [-] [instance: d716674a-ba14-466a-956f-5bca9404174f] Took 3.56 seconds to deallocate network for instance.#033[00m Nov 28 05:01:10 localhost nova_compute[279673]: 2025-11-28 10:01:10.297 279685 DEBUG oslo_concurrency.lockutils [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Acquiring lock "compute_resources" by 
"nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:01:10 localhost nova_compute[279673]: 2025-11-28 10:01:10.297 279685 DEBUG oslo_concurrency.lockutils [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:01:10 localhost nova_compute[279673]: 2025-11-28 10:01:10.300 279685 DEBUG oslo_concurrency.lockutils [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:01:10 localhost nova_compute[279673]: 2025-11-28 10:01:10.528 279685 INFO nova.scheduler.client.report [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Deleted allocations for instance d716674a-ba14-466a-956f-5bca9404174f#033[00m Nov 28 05:01:10 localhost nova_compute[279673]: 2025-11-28 10:01:10.590 279685 DEBUG oslo_concurrency.lockutils [None req-8a1162f6-cbfa-4125-a587-cc15c86440e6 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Lock "d716674a-ba14-466a-956f-5bca9404174f" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 5.684s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:01:11 localhost nova_compute[279673]: 2025-11-28 10:01:11.298 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:11 localhost neutron_sriov_agent[254147]: 2025-11-28 10:01:11.677 2 INFO neutron.agent.securitygroups_rpc [None req-64ca5811-bbe2-4768-b546-3f3c65a295fa c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Security group member updated ['dc0a6e12-205a-4d7d-adb2-6545f08f7990']#033[00m Nov 28 05:01:13 localhost neutron_sriov_agent[254147]: 2025-11-28 10:01:13.328 2 INFO neutron.agent.securitygroups_rpc [None req-0bbc429e-2462-4e1e-9b65-ff3c254999c8 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Security group member updated ['8cd6a72f-0cb3-42f5-95bb-7d1b962c8a1e']#033[00m Nov 28 05:01:13 localhost neutron_sriov_agent[254147]: 2025-11-28 10:01:13.667 2 INFO neutron.agent.securitygroups_rpc [None req-8218c41f-1821-4284-8a71-4b98eaf9d107 c64867c2bac34a819c0995d0b72ee9a7 3e4b394501d24dc7954ec5d2f27b8081 - - default default] Security group member updated ['dc0a6e12-205a-4d7d-adb2-6545f08f7990']#033[00m Nov 28 05:01:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. 
Nov 28 05:01:13 localhost podman[309075]: 2025-11-28 10:01:13.841064259 +0000 UTC m=+0.078561000 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.expose-services=, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., release=1755695350, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal) Nov 28 05:01:13 localhost podman[309075]: 2025-11-28 10:01:13.853698067 +0000 UTC m=+0.091194858 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, release=1755695350, io.openshift.expose-services=, version=9.6, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, build-date=2025-08-20T13:12:41, config_id=edpm, managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc.) Nov 28 05:01:13 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 05:01:14 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:01:14.405 261084 INFO neutron.agent.linux.ip_lib [None req-2cee969e-79d6-443f-b01f-72d840f98265 - - - - - -] Device tap26794c2c-f6 cannot be used as it has no MAC address#033[00m Nov 28 05:01:14 localhost nova_compute[279673]: 2025-11-28 10:01:14.468 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:14 localhost kernel: device tap26794c2c-f6 entered promiscuous mode Nov 28 05:01:14 localhost NetworkManager[5967]: [1764324074.4775] manager: (tap26794c2c-f6): new Generic device (/org/freedesktop/NetworkManager/Devices/24) Nov 28 05:01:14 localhost systemd-udevd[309108]: Network interface NamePolicy= disabled on kernel command line. Nov 28 05:01:14 localhost ovn_controller[152322]: 2025-11-28T10:01:14Z|00114|binding|INFO|Claiming lport 26794c2c-f636-431d-8d25-cfbabb72ae33 for this chassis. Nov 28 05:01:14 localhost ovn_controller[152322]: 2025-11-28T10:01:14Z|00115|binding|INFO|26794c2c-f636-431d-8d25-cfbabb72ae33: Claiming unknown Nov 28 05:01:14 localhost nova_compute[279673]: 2025-11-28 10:01:14.481 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:14 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:14.495 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-b42f9a09-f299-4469-8a2a-b6b8c70a7aed', 'neutron:device_owner': 
'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b42f9a09-f299-4469-8a2a-b6b8c70a7aed', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6dce7e95fa0443beb41563da37907095', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e8b0f643-a0b0-4f6d-b3fc-848bb5b1dd8d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=26794c2c-f636-431d-8d25-cfbabb72ae33) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:01:14 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:14.500 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 26794c2c-f636-431d-8d25-cfbabb72ae33 in datapath b42f9a09-f299-4469-8a2a-b6b8c70a7aed bound to our chassis#033[00m Nov 28 05:01:14 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:14.502 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b42f9a09-f299-4469-8a2a-b6b8c70a7aed or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:01:14 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:14.503 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[1a689ffc-cbce-4612-ae8a-a1a395f6b8b5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:14 localhost journal[227875]: ethtool ioctl error on tap26794c2c-f6: No such device Nov 28 05:01:14 localhost journal[227875]: ethtool ioctl error on tap26794c2c-f6: No such device Nov 28 05:01:14 localhost ovn_controller[152322]: 2025-11-28T10:01:14Z|00116|binding|INFO|Setting lport 26794c2c-f636-431d-8d25-cfbabb72ae33 
ovn-installed in OVS Nov 28 05:01:14 localhost ovn_controller[152322]: 2025-11-28T10:01:14Z|00117|binding|INFO|Setting lport 26794c2c-f636-431d-8d25-cfbabb72ae33 up in Southbound Nov 28 05:01:14 localhost nova_compute[279673]: 2025-11-28 10:01:14.514 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:14 localhost journal[227875]: ethtool ioctl error on tap26794c2c-f6: No such device Nov 28 05:01:14 localhost nova_compute[279673]: 2025-11-28 10:01:14.517 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:14 localhost journal[227875]: ethtool ioctl error on tap26794c2c-f6: No such device Nov 28 05:01:14 localhost journal[227875]: ethtool ioctl error on tap26794c2c-f6: No such device Nov 28 05:01:14 localhost journal[227875]: ethtool ioctl error on tap26794c2c-f6: No such device Nov 28 05:01:14 localhost journal[227875]: ethtool ioctl error on tap26794c2c-f6: No such device Nov 28 05:01:14 localhost journal[227875]: ethtool ioctl error on tap26794c2c-f6: No such device Nov 28 05:01:14 localhost nova_compute[279673]: 2025-11-28 10:01:14.557 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:14 localhost nova_compute[279673]: 2025-11-28 10:01:14.582 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:14 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:01:14 localhost nova_compute[279673]: 2025-11-28 10:01:14.983 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:15 localhost nova_compute[279673]: 2025-11-28 10:01:15.175 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:15 localhost podman[309205]: Nov 28 05:01:15 localhost podman[309205]: 2025-11-28 10:01:15.426002191 +0000 UTC m=+0.085283276 container create 2469adc38390e7842b1677a959dd45694075fcff55aac0a843082bc60cc47242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b42f9a09-f299-4469-8a2a-b6b8c70a7aed, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0) Nov 28 05:01:15 localhost systemd[1]: Started libpod-conmon-2469adc38390e7842b1677a959dd45694075fcff55aac0a843082bc60cc47242.scope. Nov 28 05:01:15 localhost podman[309205]: 2025-11-28 10:01:15.38749644 +0000 UTC m=+0.046777575 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:01:15 localhost systemd[1]: Started libcrun container. 
Nov 28 05:01:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1d5f4174ee8c8f2aa1c8d3fbcaed9ef7f054f6f66b168db2dbfe4a5c6fb91850/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:01:15 localhost podman[309205]: 2025-11-28 10:01:15.505263712 +0000 UTC m=+0.164544797 container init 2469adc38390e7842b1677a959dd45694075fcff55aac0a843082bc60cc47242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b42f9a09-f299-4469-8a2a-b6b8c70a7aed, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 28 05:01:15 localhost podman[309205]: 2025-11-28 10:01:15.515549767 +0000 UTC m=+0.174830852 container start 2469adc38390e7842b1677a959dd45694075fcff55aac0a843082bc60cc47242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b42f9a09-f299-4469-8a2a-b6b8c70a7aed, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true) Nov 28 05:01:15 localhost dnsmasq[309233]: started, version 2.85 cachesize 150 Nov 28 05:01:15 localhost dnsmasq[309233]: DNS service limited to local subnets Nov 28 05:01:15 localhost dnsmasq[309233]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:01:15 localhost dnsmasq[309233]: warning: no upstream servers 
configured Nov 28 05:01:15 localhost dnsmasq-dhcp[309233]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 28 05:01:15 localhost dnsmasq[309233]: read /var/lib/neutron/dhcp/b42f9a09-f299-4469-8a2a-b6b8c70a7aed/addn_hosts - 0 addresses Nov 28 05:01:15 localhost dnsmasq-dhcp[309233]: read /var/lib/neutron/dhcp/b42f9a09-f299-4469-8a2a-b6b8c70a7aed/host Nov 28 05:01:15 localhost dnsmasq-dhcp[309233]: read /var/lib/neutron/dhcp/b42f9a09-f299-4469-8a2a-b6b8c70a7aed/opts Nov 28 05:01:15 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:01:15.702 261084 INFO neutron.agent.dhcp.agent [None req-9108ab0a-0bfa-4ba6-bdf0-e16d29ad2d9f - - - - - -] DHCP configuration for ports {'94f53e87-a0f5-4c4e-ba83-ece1220ca48d'} is completed#033[00m Nov 28 05:01:16 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Nov 28 05:01:16 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:01:16 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 05:01:16 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:01:17 localhost neutron_sriov_agent[254147]: 2025-11-28 10:01:17.036 2 INFO neutron.agent.securitygroups_rpc [None req-2679fbf8-01c0-46c1-b86d-7a154868a163 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Security group member updated ['8cd6a72f-0cb3-42f5-95bb-7d1b962c8a1e']#033[00m Nov 28 05:01:17 localhost nova_compute[279673]: 2025-11-28 10:01:17.176 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:17 localhost ovn_controller[152322]: 2025-11-28T10:01:17Z|00118|binding|INFO|Releasing lport 
3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:01:17 localhost nova_compute[279673]: 2025-11-28 10:01:17.358 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:18 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:01:18.087 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:01:17Z, description=, device_id=350c5687-2c97-42e6-96bf-0b6c681cec37, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=9b55c433-acbb-4821-80ef-e5fdd0471217, ip_allocation=immediate, mac_address=fa:16:3e:a0:00:c1, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:01:12Z, description=, dns_domain=, id=b42f9a09-f299-4469-8a2a-b6b8c70a7aed, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPDetailsNegativeTestJSON-1727831009-network, port_security_enabled=True, project_id=6dce7e95fa0443beb41563da37907095, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=14204, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=566, status=ACTIVE, subnets=['0396b46f-a28c-409c-94f1-33c424a08c25'], tags=[], tenant_id=6dce7e95fa0443beb41563da37907095, updated_at=2025-11-28T10:01:13Z, vlan_transparent=None, network_id=b42f9a09-f299-4469-8a2a-b6b8c70a7aed, port_security_enabled=False, project_id=6dce7e95fa0443beb41563da37907095, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=590, status=DOWN, tags=[], tenant_id=6dce7e95fa0443beb41563da37907095, 
updated_at=2025-11-28T10:01:17Z on network b42f9a09-f299-4469-8a2a-b6b8c70a7aed#033[00m Nov 28 05:01:18 localhost openstack_network_exporter[240658]: ERROR 10:01:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 05:01:18 localhost openstack_network_exporter[240658]: ERROR 10:01:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:01:18 localhost openstack_network_exporter[240658]: ERROR 10:01:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:01:18 localhost openstack_network_exporter[240658]: ERROR 10:01:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 05:01:18 localhost openstack_network_exporter[240658]: Nov 28 05:01:18 localhost openstack_network_exporter[240658]: ERROR 10:01:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 05:01:18 localhost openstack_network_exporter[240658]: Nov 28 05:01:18 localhost podman[309298]: 2025-11-28 10:01:18.337823492 +0000 UTC m=+0.055615417 container kill 2469adc38390e7842b1677a959dd45694075fcff55aac0a843082bc60cc47242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b42f9a09-f299-4469-8a2a-b6b8c70a7aed, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0) Nov 28 05:01:18 localhost dnsmasq[309233]: read /var/lib/neutron/dhcp/b42f9a09-f299-4469-8a2a-b6b8c70a7aed/addn_hosts - 1 addresses Nov 28 05:01:18 localhost dnsmasq-dhcp[309233]: read /var/lib/neutron/dhcp/b42f9a09-f299-4469-8a2a-b6b8c70a7aed/host Nov 28 05:01:18 localhost 
dnsmasq-dhcp[309233]: read /var/lib/neutron/dhcp/b42f9a09-f299-4469-8a2a-b6b8c70a7aed/opts Nov 28 05:01:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 05:01:18 localhost podman[309312]: 2025-11-28 10:01:18.452859339 +0000 UTC m=+0.082546731 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 05:01:18 localhost podman[309312]: 2025-11-28 10:01:18.460977508 +0000 UTC m=+0.090664890 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 
'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 28 05:01:18 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 05:01:18 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:01:18.559 261084 INFO neutron.agent.dhcp.agent [None req-b5036dfa-d387-4221-b087-919466830ae4 - - - - - -] DHCP configuration for ports {'9b55c433-acbb-4821-80ef-e5fdd0471217'} is completed#033[00m Nov 28 05:01:18 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:01:18.984 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:01:17Z, description=, device_id=350c5687-2c97-42e6-96bf-0b6c681cec37, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=9b55c433-acbb-4821-80ef-e5fdd0471217, ip_allocation=immediate, mac_address=fa:16:3e:a0:00:c1, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:01:12Z, description=, dns_domain=, id=b42f9a09-f299-4469-8a2a-b6b8c70a7aed, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPDetailsNegativeTestJSON-1727831009-network, port_security_enabled=True, project_id=6dce7e95fa0443beb41563da37907095, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=14204, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=566, 
status=ACTIVE, subnets=['0396b46f-a28c-409c-94f1-33c424a08c25'], tags=[], tenant_id=6dce7e95fa0443beb41563da37907095, updated_at=2025-11-28T10:01:13Z, vlan_transparent=None, network_id=b42f9a09-f299-4469-8a2a-b6b8c70a7aed, port_security_enabled=False, project_id=6dce7e95fa0443beb41563da37907095, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=590, status=DOWN, tags=[], tenant_id=6dce7e95fa0443beb41563da37907095, updated_at=2025-11-28T10:01:17Z on network b42f9a09-f299-4469-8a2a-b6b8c70a7aed#033[00m Nov 28 05:01:19 localhost dnsmasq[309233]: read /var/lib/neutron/dhcp/b42f9a09-f299-4469-8a2a-b6b8c70a7aed/addn_hosts - 1 addresses Nov 28 05:01:19 localhost dnsmasq-dhcp[309233]: read /var/lib/neutron/dhcp/b42f9a09-f299-4469-8a2a-b6b8c70a7aed/host Nov 28 05:01:19 localhost dnsmasq-dhcp[309233]: read /var/lib/neutron/dhcp/b42f9a09-f299-4469-8a2a-b6b8c70a7aed/opts Nov 28 05:01:19 localhost podman[309359]: 2025-11-28 10:01:19.217188618 +0000 UTC m=+0.062561739 container kill 2469adc38390e7842b1677a959dd45694075fcff55aac0a843082bc60cc47242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b42f9a09-f299-4469-8a2a-b6b8c70a7aed, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:01:19 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:01:19.437 261084 INFO neutron.agent.dhcp.agent [None req-4fdaed7f-fb3f-496e-89dc-71411dca00f2 - - - - - -] DHCP configuration for ports {'9b55c433-acbb-4821-80ef-e5fdd0471217'} is completed#033[00m Nov 28 05:01:19 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e97 _set_new_cache_sizes cache_size:1020054731 
inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:01:19 localhost nova_compute[279673]: 2025-11-28 10:01:19.987 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:20 localhost nova_compute[279673]: 2025-11-28 10:01:20.037 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Acquiring lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:01:20 localhost nova_compute[279673]: 2025-11-28 10:01:20.038 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:01:20 localhost nova_compute[279673]: 2025-11-28 10:01:20.059 279685 DEBUG nova.compute.manager [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Starting instance... 
_do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m Nov 28 05:01:20 localhost nova_compute[279673]: 2025-11-28 10:01:20.130 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:01:20 localhost nova_compute[279673]: 2025-11-28 10:01:20.131 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:01:20 localhost nova_compute[279673]: 2025-11-28 10:01:20.137 279685 DEBUG nova.virt.hardware [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Require both a host and instance NUMA topology to fit instance on host. 
numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m Nov 28 05:01:20 localhost nova_compute[279673]: 2025-11-28 10:01:20.137 279685 INFO nova.compute.claims [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Claim successful on node np0005538513.localdomain#033[00m Nov 28 05:01:20 localhost nova_compute[279673]: 2025-11-28 10:01:20.143 279685 DEBUG nova.virt.driver [-] Emitting event Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 28 05:01:20 localhost nova_compute[279673]: 2025-11-28 10:01:20.143 279685 INFO nova.compute.manager [-] [instance: d716674a-ba14-466a-956f-5bca9404174f] VM Stopped (Lifecycle Event)#033[00m Nov 28 05:01:20 localhost nova_compute[279673]: 2025-11-28 10:01:20.177 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:20 localhost nova_compute[279673]: 2025-11-28 10:01:20.183 279685 DEBUG nova.compute.manager [None req-605a5be0-9c41-4684-8485-0a4e64da372b - - - - - -] [instance: d716674a-ba14-466a-956f-5bca9404174f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 28 05:01:20 localhost nova_compute[279673]: 2025-11-28 10:01:20.293 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:01:20 localhost dnsmasq[307779]: read /var/lib/neutron/dhcp/0303a35a-aae2-4e58-b0e5-9091112c9857/addn_hosts - 0 addresses Nov 28 05:01:20 localhost dnsmasq-dhcp[307779]: read 
/var/lib/neutron/dhcp/0303a35a-aae2-4e58-b0e5-9091112c9857/host Nov 28 05:01:20 localhost dnsmasq-dhcp[307779]: read /var/lib/neutron/dhcp/0303a35a-aae2-4e58-b0e5-9091112c9857/opts Nov 28 05:01:20 localhost podman[309416]: 2025-11-28 10:01:20.546101289 +0000 UTC m=+0.058135613 container kill c46654d3d04fcd252763426cc06e1231bb8336d92b2f305e57fbef13140f2d5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0303a35a-aae2-4e58-b0e5-9091112c9857, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true) Nov 28 05:01:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 05:01:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. 
Nov 28 05:01:20 localhost podman[309429]: 2025-11-28 10:01:20.658516597 +0000 UTC m=+0.091553049 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125) Nov 28 05:01:20 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Nov 28 05:01:20 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:01:20 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/88113730' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:01:20 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:01:20 localhost nova_compute[279673]: 2025-11-28 10:01:20.759 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:01:20 localhost systemd[1]: tmp-crun.fqo8mg.mount: Deactivated successfully. Nov 28 05:01:20 localhost nova_compute[279673]: 2025-11-28 10:01:20.778 279685 DEBUG nova.compute.provider_tree [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 05:01:20 localhost podman[309429]: 2025-11-28 10:01:20.782308183 +0000 UTC m=+0.215344665 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:01:20 localhost kernel: device tap2b1e8904-1c left promiscuous mode Nov 28 05:01:20 localhost ovn_controller[152322]: 2025-11-28T10:01:20Z|00119|binding|INFO|Releasing lport 2b1e8904-1c88-4828-a7bc-9f34a2930819 from this chassis (sb_readonly=0) Nov 28 05:01:20 localhost ovn_controller[152322]: 2025-11-28T10:01:20Z|00120|binding|INFO|Setting lport 2b1e8904-1c88-4828-a7bc-9f34a2930819 down in Southbound Nov 28 05:01:20 localhost nova_compute[279673]: 2025-11-28 10:01:20.783 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:20 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:20.790 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-0303a35a-aae2-4e58-b0e5-9091112c9857', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 
'neutron-0303a35a-aae2-4e58-b0e5-9091112c9857', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2159235bf1c5407eac7a3e3826561913', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=20ff3eab-119a-4740-918d-4005c52a4e27, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=2b1e8904-1c88-4828-a7bc-9f34a2930819) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:01:20 localhost nova_compute[279673]: 2025-11-28 10:01:20.791 279685 DEBUG nova.scheduler.client.report [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 05:01:20 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:20.791 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 2b1e8904-1c88-4828-a7bc-9f34a2930819 in datapath 0303a35a-aae2-4e58-b0e5-9091112c9857 unbound from our chassis#033[00m Nov 28 05:01:20 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:20.793 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 
0303a35a-aae2-4e58-b0e5-9091112c9857, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:01:20 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:20.794 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[da90546a-3f4f-4c65-93b4-3602a757f5cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:20 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. Nov 28 05:01:20 localhost nova_compute[279673]: 2025-11-28 10:01:20.803 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:20 localhost nova_compute[279673]: 2025-11-28 10:01:20.814 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.684s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:01:20 localhost nova_compute[279673]: 2025-11-28 10:01:20.815 279685 DEBUG nova.compute.manager [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Start building networks asynchronously for instance. 
_build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m Nov 28 05:01:20 localhost podman[309430]: 2025-11-28 10:01:20.783688905 +0000 UTC m=+0.209709872 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent) Nov 28 05:01:20 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:01:20 localhost podman[309430]: 2025-11-28 10:01:20.86766256 +0000 UTC m=+0.293683547 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent) Nov 28 05:01:20 localhost nova_compute[279673]: 2025-11-28 10:01:20.871 279685 DEBUG nova.compute.manager [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m Nov 28 05:01:20 localhost nova_compute[279673]: 2025-11-28 10:01:20.872 279685 DEBUG nova.network.neutron [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m Nov 28 05:01:20 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 05:01:20 localhost nova_compute[279673]: 2025-11-28 10:01:20.893 279685 INFO nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m Nov 28 05:01:20 localhost nova_compute[279673]: 2025-11-28 10:01:20.910 279685 DEBUG nova.compute.manager [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Start building block device mappings for instance. 
_build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m Nov 28 05:01:20 localhost nova_compute[279673]: 2025-11-28 10:01:20.990 279685 DEBUG nova.compute.manager [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m Nov 28 05:01:20 localhost nova_compute[279673]: 2025-11-28 10:01:20.991 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m Nov 28 05:01:20 localhost nova_compute[279673]: 2025-11-28 10:01:20.992 279685 INFO nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Creating image(s)#033[00m Nov 28 05:01:21 localhost nova_compute[279673]: 2025-11-28 10:01:21.030 279685 DEBUG nova.storage.rbd_utils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] rbd image c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 28 05:01:21 localhost nova_compute[279673]: 2025-11-28 10:01:21.069 279685 DEBUG nova.storage.rbd_utils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] rbd image c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_disk does not exist __init__ 
/usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 28 05:01:21 localhost nova_compute[279673]: 2025-11-28 10:01:21.110 279685 DEBUG nova.storage.rbd_utils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] rbd image c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 28 05:01:21 localhost nova_compute[279673]: 2025-11-28 10:01:21.116 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Acquiring lock "1d475a5fe6866c2fa864abfa6db335a58fd8123d" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:01:21 localhost nova_compute[279673]: 2025-11-28 10:01:21.117 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Lock "1d475a5fe6866c2fa864abfa6db335a58fd8123d" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:01:21 localhost nova_compute[279673]: 2025-11-28 10:01:21.187 279685 DEBUG nova.virt.libvirt.imagebackend [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Image locations are: [{'url': 'rbd://2c5417c9-00eb-57d5-a565-ddecbc7995c1/images/85968a96-5a0e-43a4-9c04-3954f640a7ed/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://2c5417c9-00eb-57d5-a565-ddecbc7995c1/images/85968a96-5a0e-43a4-9c04-3954f640a7ed/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m 
Nov 28 05:01:21 localhost nova_compute[279673]: 2025-11-28 10:01:21.258 279685 WARNING oslo_policy.policy [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m Nov 28 05:01:21 localhost nova_compute[279673]: 2025-11-28 10:01:21.258 279685 WARNING oslo_policy.policy [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m Nov 28 05:01:21 localhost nova_compute[279673]: 2025-11-28 10:01:21.262 279685 DEBUG nova.policy [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '318114281cb649bc9eeed12ecdc7273f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '310745a04bd441169ff77f55ccf6bd7b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize 
/usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m Nov 28 05:01:22 localhost nova_compute[279673]: 2025-11-28 10:01:22.074 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1d475a5fe6866c2fa864abfa6db335a58fd8123d.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:01:22 localhost nova_compute[279673]: 2025-11-28 10:01:22.162 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1d475a5fe6866c2fa864abfa6db335a58fd8123d.part --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:01:22 localhost nova_compute[279673]: 2025-11-28 10:01:22.163 279685 DEBUG nova.virt.images [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] 85968a96-5a0e-43a4-9c04-3954f640a7ed was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m Nov 28 05:01:22 localhost nova_compute[279673]: 2025-11-28 10:01:22.165 279685 DEBUG nova.privsep.utils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m Nov 28 05:01:22 localhost 
nova_compute[279673]: 2025-11-28 10:01:22.165 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/1d475a5fe6866c2fa864abfa6db335a58fd8123d.part /var/lib/nova/instances/_base/1d475a5fe6866c2fa864abfa6db335a58fd8123d.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:01:22 localhost nova_compute[279673]: 2025-11-28 10:01:22.470 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/1d475a5fe6866c2fa864abfa6db335a58fd8123d.part /var/lib/nova/instances/_base/1d475a5fe6866c2fa864abfa6db335a58fd8123d.converted" returned: 0 in 0.305s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:01:22 localhost nova_compute[279673]: 2025-11-28 10:01:22.473 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1d475a5fe6866c2fa864abfa6db335a58fd8123d.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:01:22 localhost nova_compute[279673]: 2025-11-28 10:01:22.516 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- 
env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1d475a5fe6866c2fa864abfa6db335a58fd8123d.converted --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:01:22 localhost nova_compute[279673]: 2025-11-28 10:01:22.517 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Lock "1d475a5fe6866c2fa864abfa6db335a58fd8123d" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 1.400s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:01:22 localhost dnsmasq[309233]: read /var/lib/neutron/dhcp/b42f9a09-f299-4469-8a2a-b6b8c70a7aed/addn_hosts - 0 addresses Nov 28 05:01:22 localhost podman[309561]: 2025-11-28 10:01:22.520627829 +0000 UTC m=+0.044272178 container kill 2469adc38390e7842b1677a959dd45694075fcff55aac0a843082bc60cc47242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b42f9a09-f299-4469-8a2a-b6b8c70a7aed, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 28 05:01:22 localhost dnsmasq-dhcp[309233]: read /var/lib/neutron/dhcp/b42f9a09-f299-4469-8a2a-b6b8c70a7aed/host Nov 28 05:01:22 localhost dnsmasq-dhcp[309233]: read /var/lib/neutron/dhcp/b42f9a09-f299-4469-8a2a-b6b8c70a7aed/opts Nov 28 05:01:22 localhost nova_compute[279673]: 2025-11-28 10:01:22.544 279685 DEBUG nova.storage.rbd_utils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] rbd image 
c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 28 05:01:22 localhost nova_compute[279673]: 2025-11-28 10:01:22.549 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/1d475a5fe6866c2fa864abfa6db335a58fd8123d c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:01:22 localhost ovn_controller[152322]: 2025-11-28T10:01:22Z|00121|binding|INFO|Releasing lport 26794c2c-f636-431d-8d25-cfbabb72ae33 from this chassis (sb_readonly=0) Nov 28 05:01:22 localhost ovn_controller[152322]: 2025-11-28T10:01:22Z|00122|binding|INFO|Setting lport 26794c2c-f636-431d-8d25-cfbabb72ae33 down in Southbound Nov 28 05:01:22 localhost kernel: device tap26794c2c-f6 left promiscuous mode Nov 28 05:01:22 localhost nova_compute[279673]: 2025-11-28 10:01:22.662 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:22 localhost nova_compute[279673]: 2025-11-28 10:01:22.688 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:22 localhost nova_compute[279673]: 2025-11-28 10:01:22.690 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:22 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:22.744 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 
to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-b42f9a09-f299-4469-8a2a-b6b8c70a7aed', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b42f9a09-f299-4469-8a2a-b6b8c70a7aed', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6dce7e95fa0443beb41563da37907095', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e8b0f643-a0b0-4f6d-b3fc-848bb5b1dd8d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=26794c2c-f636-431d-8d25-cfbabb72ae33) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:01:22 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:22.747 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 26794c2c-f636-431d-8d25-cfbabb72ae33 in datapath b42f9a09-f299-4469-8a2a-b6b8c70a7aed unbound from our chassis#033[00m Nov 28 05:01:22 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:22.751 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b42f9a09-f299-4469-8a2a-b6b8c70a7aed, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:01:22 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:22.752 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[b2ac9ff6-2a4f-4c7c-bd52-cdc8a4a99b30]: 
(4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.017 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/1d475a5fe6866c2fa864abfa6db335a58fd8123d c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.110 279685 DEBUG nova.storage.rbd_utils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] resizing rbd image c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.182 279685 DEBUG nova.network.neutron [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Successfully updated port: 62b8533f-b250-4475-80c2-28c4543536b5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.214 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Acquiring lock "refresh_cache-c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.214 279685 DEBUG oslo_concurrency.lockutils [None 
req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Acquired lock "refresh_cache-c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.215 279685 DEBUG nova.network.neutron [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.225 279685 DEBUG nova.compute.manager [req-b4ffaa82-b05e-4eea-bdf0-df5714d78b7b req-7dd329a8-2e44-4d2e-8004-6784b1815336 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received event network-changed-62b8533f-b250-4475-80c2-28c4543536b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.226 279685 DEBUG nova.compute.manager [req-b4ffaa82-b05e-4eea-bdf0-df5714d78b7b req-7dd329a8-2e44-4d2e-8004-6784b1815336 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Refreshing instance network info cache due to event network-changed-62b8533f-b250-4475-80c2-28c4543536b5. 
external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.226 279685 DEBUG oslo_concurrency.lockutils [req-b4ffaa82-b05e-4eea-bdf0-df5714d78b7b req-7dd329a8-2e44-4d2e-8004-6784b1815336 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "refresh_cache-c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.320 279685 DEBUG nova.network.neutron [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.332 279685 DEBUG nova.objects.instance [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Lazy-loading 'migration_context' on Instance uuid c06e2ffc-a8af-41b6-ab88-680ef1f6fe50 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.361 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.361 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default 
default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Ensure instance console log exists: /var/lib/nova/instances/c06e2ffc-a8af-41b6-ab88-680ef1f6fe50/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.362 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.363 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.363 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.680 279685 DEBUG nova.network.neutron [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Updating instance_info_cache with network_info: [{"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": 
"ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.712 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Releasing lock "refresh_cache-c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.712 279685 DEBUG nova.compute.manager [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Instance network_info: |[{"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": 
"tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.713 279685 DEBUG oslo_concurrency.lockutils [req-b4ffaa82-b05e-4eea-bdf0-df5714d78b7b req-7dd329a8-2e44-4d2e-8004-6784b1815336 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquired lock "refresh_cache-c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.714 279685 DEBUG nova.network.neutron [req-b4ffaa82-b05e-4eea-bdf0-df5714d78b7b req-7dd329a8-2e44-4d2e-8004-6784b1815336 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Refreshing network info cache for port 62b8533f-b250-4475-80c2-28c4543536b5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.719 
279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Start _get_guest_xml network_info=[{"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T09:59:44Z,direct_url=,disk_format='qcow2',id=85968a96-5a0e-43a4-9c04-3954f640a7ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dda653c53224db086060962b0702694',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2025-11-28T09:59:46Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'device_type': 'disk', 'encrypted': False, 'image_id': '85968a96-5a0e-43a4-9c04-3954f640a7ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.725 279685 WARNING nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.728 279685 DEBUG nova.virt.libvirt.host [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Searching host: 'np0005538513.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.729 279685 DEBUG nova.virt.libvirt.host [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] CPU controller missing on host. 
_has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.737 279685 DEBUG nova.virt.libvirt.host [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Searching host: 'np0005538513.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.738 279685 DEBUG nova.virt.libvirt.host [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.739 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.739 279685 DEBUG nova.virt.hardware [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T09:59:43Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98f289d4-5c06-4ab5-9089-7b580870d676',id=5,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T09:59:44Z,direct_url=,disk_format='qcow2',id=85968a96-5a0e-43a4-9c04-3954f640a7ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dda653c53224db086060962b0702694',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2025-11-28T09:59:46Z,virtual_size=,visibility=), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.740 279685 DEBUG nova.virt.hardware [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.740 279685 DEBUG nova.virt.hardware [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.741 279685 DEBUG nova.virt.hardware [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.741 279685 DEBUG nova.virt.hardware [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 
10:01:23.742 279685 DEBUG nova.virt.hardware [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.742 279685 DEBUG nova.virt.hardware [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.743 279685 DEBUG nova.virt.hardware [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.743 279685 DEBUG nova.virt.hardware [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.744 279685 DEBUG nova.virt.hardware [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m Nov 28 05:01:23 localhost 
nova_compute[279673]: 2025-11-28 10:01:23.744 279685 DEBUG nova.virt.hardware [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m Nov 28 05:01:23 localhost nova_compute[279673]: 2025-11-28 10:01:23.749 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:01:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 05:01:23 localhost podman[309697]: 2025-11-28 10:01:23.868399099 +0000 UTC m=+0.099451690 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Nov 28 05:01:23 localhost podman[309697]: 2025-11-28 10:01:23.913798262 +0000 UTC m=+0.144850823 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 28 05:01:23 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. Nov 28 05:01:24 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 28 05:01:24 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/85736152' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.223 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.255 279685 DEBUG nova.storage.rbd_utils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] rbd image c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.260 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:01:24 localhost ovn_controller[152322]: 2025-11-28T10:01:24Z|00123|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.463 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.476 279685 DEBUG nova.network.neutron [req-b4ffaa82-b05e-4eea-bdf0-df5714d78b7b req-7dd329a8-2e44-4d2e-8004-6784b1815336 0d543a6dcb564de5b39062ca08440499 
e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Updated VIF entry in instance network info cache for port 62b8533f-b250-4475-80c2-28c4543536b5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.477 279685 DEBUG nova.network.neutron [req-b4ffaa82-b05e-4eea-bdf0-df5714d78b7b req-7dd329a8-2e44-4d2e-8004-6784b1815336 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Updating instance_info_cache with network_info: [{"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.507 279685 DEBUG oslo_concurrency.lockutils 
[req-b4ffaa82-b05e-4eea-bdf0-df5714d78b7b req-7dd329a8-2e44-4d2e-8004-6784b1815336 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Releasing lock "refresh_cache-c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 05:01:24 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 28 05:01:24 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2373481162' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.704 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.706 279685 DEBUG nova.virt.libvirt.vif [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T10:01:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-915340611',display_name='tempest-LiveMigrationTest-server-915340611',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005538513.localdomain',hostname='tempest-livemigrationtest-server-915340611',id=8,image_ref='85968a96-5a0e-43a4-9c04-3954f640a7ed',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0005538513.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005538513.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='310745a04bd441169ff77f55ccf6bd7b',ramdisk_id='',reservation_id='r-1g332w05',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='85968a96-5a0e-43a4-9c04-3954f640a7ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-480152442',owner_user_name='tempest-LiveMigrationTest-480152442-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T10:01:20Z,user_data=None,user_id='318114281cb649bc9eeed12ecdc7273f',uuid=c06e2ffc-a8af-41b6-ab88-680ef1f6fe50,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') 
vif={"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.707 279685 DEBUG nova.network.os_vif_util [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Converting VIF {"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, 
"tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.708 279685 DEBUG nova.network.os_vif_util [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:68:3c,bridge_name='br-int',has_traffic_filtering=True,id=62b8533f-b250-4475-80c2-28c4543536b5,network=Network(ad2d8cf7-987d-4804-acbd-9b3e248dc8cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap62b8533f-b2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.710 279685 DEBUG nova.objects.instance [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Lazy-loading 'pci_devices' on Instance uuid c06e2ffc-a8af-41b6-ab88-680ef1f6fe50 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.728 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] End _get_guest_xml xml= Nov 28 05:01:24 
[libvirt guest domain XML elided: the XML emitted between "End _get_guest_xml xml=" and the trailing "_get_guest_xml" marker lost its markup during log extraction, leaving only bare "Nov 28 05:01:24 localhost nova_compute[279673]:" continuation prefixes. Recoverable text nodes: uuid c06e2ffc-a8af-41b6-ab88-680ef1f6fe50, name instance-00000008, memory 131072, vcpus 1, nova metadata name tempest-LiveMigrationTest-server-915340611, creationTime 2025-11-28 10:01:23, flavor values 128 / 1 / 0 / 0 / 1, owner user tempest-LiveMigrationTest-480152442-project-member, project tempest-LiveMigrationTest-480152442, sysinfo RDO / OpenStack Compute / 27.5.2-0.20250829104910.6f8decf.el9, "Virtual Machine", os type hvm, rng backend /dev/urandom.] Nov 28 05:01:24 localhost nova_compute[279673]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.729 279685 DEBUG nova.compute.manager [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Preparing to wait for external event network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.729 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Acquiring lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.729 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" acquired by 
"nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.730 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.731 279685 DEBUG nova.virt.libvirt.vif [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T10:01:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-915340611',display_name='tempest-LiveMigrationTest-server-915340611',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005538513.localdomain',hostname='tempest-livemigrationtest-server-915340611',id=8,image_ref='85968a96-5a0e-43a4-9c04-3954f640a7ed',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0005538513.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005538513.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_re
quests=InstancePCIRequests,power_state=0,progress=0,project_id='310745a04bd441169ff77f55ccf6bd7b',ramdisk_id='',reservation_id='r-1g332w05',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='85968a96-5a0e-43a4-9c04-3954f640a7ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-480152442',owner_user_name='tempest-LiveMigrationTest-480152442-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T10:01:20Z,user_data=None,user_id='318114281cb649bc9eeed12ecdc7273f',uuid=c06e2ffc-a8af-41b6-ab88-680ef1f6fe50,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": 
true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.731 279685 DEBUG nova.network.os_vif_util [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Converting VIF {"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.732 279685 DEBUG nova.network.os_vif_util [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Converted object 
VIFOpenVSwitch(active=False,address=fa:16:3e:58:68:3c,bridge_name='br-int',has_traffic_filtering=True,id=62b8533f-b250-4475-80c2-28c4543536b5,network=Network(ad2d8cf7-987d-4804-acbd-9b3e248dc8cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap62b8533f-b2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.732 279685 DEBUG os_vif [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:68:3c,bridge_name='br-int',has_traffic_filtering=True,id=62b8533f-b250-4475-80c2-28c4543536b5,network=Network(ad2d8cf7-987d-4804-acbd-9b3e248dc8cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap62b8533f-b2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.733 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.733 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.734 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.738 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.738 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62b8533f-b2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.739 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap62b8533f-b2, col_values=(('external_ids', {'iface-id': '62b8533f-b250-4475-80c2-28c4543536b5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:58:68:3c', 'vm-uuid': 'c06e2ffc-a8af-41b6-ab88-680ef1f6fe50'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.741 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.742 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.750 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.751 279685 INFO os_vif [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Successfully plugged vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:58:68:3c,bridge_name='br-int',has_traffic_filtering=True,id=62b8533f-b250-4475-80c2-28c4543536b5,network=Network(ad2d8cf7-987d-4804-acbd-9b3e248dc8cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap62b8533f-b2')#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.809 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.809 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] No BDM found with device name sda, not building metadata. 
_build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.810 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] No VIF found with MAC fa:16:3e:58:68:3c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.811 279685 INFO nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Using config drive#033[00m Nov 28 05:01:24 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.842 279685 DEBUG nova.storage.rbd_utils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] rbd image c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 28 05:01:24 localhost nova_compute[279673]: 2025-11-28 10:01:24.988 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:25 localhost nova_compute[279673]: 2025-11-28 10:01:25.143 279685 INFO nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Creating config drive at 
/var/lib/nova/instances/c06e2ffc-a8af-41b6-ab88-680ef1f6fe50/disk.config#033[00m Nov 28 05:01:25 localhost nova_compute[279673]: 2025-11-28 10:01:25.150 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c06e2ffc-a8af-41b6-ab88-680ef1f6fe50/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmps02jy7nw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:01:25 localhost nova_compute[279673]: 2025-11-28 10:01:25.167 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:25 localhost nova_compute[279673]: 2025-11-28 10:01:25.278 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c06e2ffc-a8af-41b6-ab88-680ef1f6fe50/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmps02jy7nw" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:01:25 localhost nova_compute[279673]: 2025-11-28 10:01:25.318 279685 DEBUG nova.storage.rbd_utils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] rbd image c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 28 05:01:25 localhost nova_compute[279673]: 2025-11-28 10:01:25.323 279685 DEBUG 
oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c06e2ffc-a8af-41b6-ab88-680ef1f6fe50/disk.config c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:01:25 localhost dnsmasq[307779]: exiting on receipt of SIGTERM Nov 28 05:01:25 localhost podman[309833]: 2025-11-28 10:01:25.387078201 +0000 UTC m=+0.067862513 container kill c46654d3d04fcd252763426cc06e1231bb8336d92b2f305e57fbef13140f2d5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0303a35a-aae2-4e58-b0e5-9091112c9857, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Nov 28 05:01:25 localhost systemd[1]: libpod-c46654d3d04fcd252763426cc06e1231bb8336d92b2f305e57fbef13140f2d5b.scope: Deactivated successfully. 
Nov 28 05:01:25 localhost podman[309867]: 2025-11-28 10:01:25.46762315 +0000 UTC m=+0.054472291 container died c46654d3d04fcd252763426cc06e1231bb8336d92b2f305e57fbef13140f2d5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0303a35a-aae2-4e58-b0e5-9091112c9857, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 28 05:01:25 localhost systemd[1]: tmp-crun.EAJ36a.mount: Deactivated successfully. Nov 28 05:01:25 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c46654d3d04fcd252763426cc06e1231bb8336d92b2f305e57fbef13140f2d5b-userdata-shm.mount: Deactivated successfully. Nov 28 05:01:25 localhost podman[309867]: 2025-11-28 10:01:25.539149864 +0000 UTC m=+0.125998965 container remove c46654d3d04fcd252763426cc06e1231bb8336d92b2f305e57fbef13140f2d5b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0303a35a-aae2-4e58-b0e5-9091112c9857, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:01:25 localhost nova_compute[279673]: 2025-11-28 10:01:25.569 279685 DEBUG oslo_concurrency.processutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c06e2ffc-a8af-41b6-ab88-680ef1f6fe50/disk.config 
c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.246s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:01:25 localhost nova_compute[279673]: 2025-11-28 10:01:25.570 279685 INFO nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Deleting local config drive /var/lib/nova/instances/c06e2ffc-a8af-41b6-ab88-680ef1f6fe50/disk.config because it was imported into RBD.#033[00m Nov 28 05:01:25 localhost systemd[1]: libpod-conmon-c46654d3d04fcd252763426cc06e1231bb8336d92b2f305e57fbef13140f2d5b.scope: Deactivated successfully. Nov 28 05:01:25 localhost kernel: device tap62b8533f-b2 entered promiscuous mode Nov 28 05:01:25 localhost NetworkManager[5967]: [1764324085.6169] manager: (tap62b8533f-b2): new Tun device (/org/freedesktop/NetworkManager/Devices/25) Nov 28 05:01:25 localhost systemd-udevd[309902]: Network interface NamePolicy= disabled on kernel command line. Nov 28 05:01:25 localhost nova_compute[279673]: 2025-11-28 10:01:25.651 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:25 localhost nova_compute[279673]: 2025-11-28 10:01:25.656 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:25 localhost ovn_controller[152322]: 2025-11-28T10:01:25Z|00124|binding|INFO|Claiming lport 62b8533f-b250-4475-80c2-28c4543536b5 for this chassis. 
Nov 28 05:01:25 localhost ovn_controller[152322]: 2025-11-28T10:01:25Z|00125|binding|INFO|62b8533f-b250-4475-80c2-28c4543536b5: Claiming fa:16:3e:58:68:3c 10.100.0.12 Nov 28 05:01:25 localhost ovn_controller[152322]: 2025-11-28T10:01:25Z|00126|binding|INFO|Claiming lport fc82099a-3702-4952-add7-ba3d39b895a0 for this chassis. Nov 28 05:01:25 localhost ovn_controller[152322]: 2025-11-28T10:01:25Z|00127|binding|INFO|fc82099a-3702-4952-add7-ba3d39b895a0: Claiming fa:16:3e:41:3c:a8 19.80.0.139 Nov 28 05:01:25 localhost NetworkManager[5967]: [1764324085.6661] device (tap62b8533f-b2): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Nov 28 05:01:25 localhost NetworkManager[5967]: [1764324085.6667] device (tap62b8533f-b2): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external') Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:25.667 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:3c:a8 19.80.0.139'], port_security=['fa:16:3e:41:3c:a8 19.80.0.139'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['62b8533f-b250-4475-80c2-28c4543536b5'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-957922340', 'neutron:cidrs': '19.80.0.139/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-492ef1de-4a68-49e4-b736-13cdb2eb7b59', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-957922340', 'neutron:project_id': '310745a04bd441169ff77f55ccf6bd7b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8cd6a72f-0cb3-42f5-95bb-7d1b962c8a1e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 
'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=96ffb618-d617-4e8c-a498-acb365ae5313, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=fc82099a-3702-4952-add7-ba3d39b895a0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:01:25 localhost systemd-machined[83422]: New machine qemu-4-instance-00000008. Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:25.671 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:68:3c 10.100.0.12'], port_security=['fa:16:3e:58:68:3c 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-191355626', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c06e2ffc-a8af-41b6-ab88-680ef1f6fe50', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-191355626', 'neutron:project_id': '310745a04bd441169ff77f55ccf6bd7b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8cd6a72f-0cb3-42f5-95bb-7d1b962c8a1e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b393f93f-1891-43a2-aa26-a4cab2642f74, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=62b8533f-b250-4475-80c2-28c4543536b5) old=Port_Binding(chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:25.673 158130 INFO neutron.agent.ovn.metadata.agent [-] Port fc82099a-3702-4952-add7-ba3d39b895a0 in datapath 492ef1de-4a68-49e4-b736-13cdb2eb7b59 bound to our chassis#033[00m Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:25.677 158130 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 492ef1de-4a68-49e4-b736-13cdb2eb7b59#033[00m Nov 28 05:01:25 localhost ovn_controller[152322]: 2025-11-28T10:01:25Z|00128|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:25.689 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[a55569bd-8ec3-44d1-9c08-5ddf44691ecc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:25.690 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap492ef1de-41 in ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m Nov 28 05:01:25 localhost systemd[1]: Started Virtual Machine qemu-4-instance-00000008. 
Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:25.693 158233 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap492ef1de-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:25.693 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[6e6fcf51-879f-4a4f-bfbd-e5b2505def49]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:25.695 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[34acae76-551f-4402-ab63-28680f008912]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:25.706 158264 DEBUG oslo.privsep.daemon [-] privsep: reply[58d6bd2e-05c8-49da-bd79-db1ada65b8e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:25 localhost nova_compute[279673]: 2025-11-28 10:01:25.717 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:25.730 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[5abf38d3-fe9e-4350-bf47-a8dd78cf660e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:25 localhost ovn_controller[152322]: 2025-11-28T10:01:25Z|00129|binding|INFO|Setting lport 62b8533f-b250-4475-80c2-28c4543536b5 ovn-installed in OVS Nov 28 05:01:25 localhost ovn_controller[152322]: 2025-11-28T10:01:25Z|00130|binding|INFO|Setting lport 62b8533f-b250-4475-80c2-28c4543536b5 up in Southbound Nov 28 05:01:25 localhost ovn_controller[152322]: 2025-11-28T10:01:25Z|00131|binding|INFO|Setting 
lport fc82099a-3702-4952-add7-ba3d39b895a0 up in Southbound Nov 28 05:01:25 localhost nova_compute[279673]: 2025-11-28 10:01:25.733 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:25.759 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[d53b3cc8-2927-4eff-89f4-9075c669e3e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:25.770 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[c63d471d-6034-4550-8f41-a05f0137f5b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:25 localhost NetworkManager[5967]: [1764324085.7717] manager: (tap492ef1de-40): new Veth device (/org/freedesktop/NetworkManager/Devices/26) Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:25.796 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[7212105e-b42d-4553-ba8d-27fd77df2bf6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:25.804 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[3006f1a9-2778-48dc-8635-dc042172dfe0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:25 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap492ef1de-41: link becomes ready Nov 28 05:01:25 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap492ef1de-40: link becomes ready Nov 28 05:01:25 localhost NetworkManager[5967]: [1764324085.8307] device (tap492ef1de-40): carrier: link connected Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:25.842 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[dd4388d8-6cea-43d9-bc16-e80f1d2623dc]: (4, None) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:25.860 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[7b7de348-9fca-4c9d-ac16-d718d97cd64c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap492ef1de-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:e3:7c:76'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 
'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1191993, 'reachable_time': 15378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 
'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 309959, 'error': None, 'target': 'ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:25.874 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[ba1360fb-98a0-465d-84c7-782e4f66e91c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee3:7c76'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1191993, 'tstamp': 1191993}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 309967, 'error': None, 'target': 'ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:25 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:01:25.890 261084 INFO neutron.agent.dhcp.agent [None req-63ded401-cc9b-44ea-988e-ce6bf04017c3 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:01:25 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:01:25.891 261084 INFO neutron.agent.dhcp.agent [None 
req-63ded401-cc9b-44ea-988e-ce6bf04017c3 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:25.889 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[62bb0251-596a-41d5-ac74-7a84ad98a81d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap492ef1de-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:e3:7c:76'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 
'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1191993, 'reachable_time': 15378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 
0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 309976, 'error': None, 'target': 'ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:25.918 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[fc766d5c-f077-4c92-810f-5a7f0f9b2105]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:25.974 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[a4529775-7859-48d2-b846-7a4f73caa5a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:25.976 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap492ef1de-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:25.976 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:25.977 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap492ef1de-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:01:25 localhost nova_compute[279673]: 2025-11-28 10:01:25.979 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:25 localhost kernel: device tap492ef1de-40 entered promiscuous mode Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:25.982 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap492ef1de-40, col_values=(('external_ids', {'iface-id': '6838a8cb-20d7-44c7-aad3-e7f442484bd5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:01:25 localhost nova_compute[279673]: 2025-11-28 10:01:25.983 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:25 localhost ovn_controller[152322]: 2025-11-28T10:01:25Z|00132|binding|INFO|Releasing lport 6838a8cb-20d7-44c7-aad3-e7f442484bd5 from this chassis (sb_readonly=0) Nov 28 05:01:25 localhost nova_compute[279673]: 2025-11-28 10:01:25.993 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:25.995 158130 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/492ef1de-4a68-49e4-b736-13cdb2eb7b59.pid.haproxy; Error: [Errno 2] No such file or directory: 
'/var/lib/neutron/external/pids/492ef1de-4a68-49e4-b736-13cdb2eb7b59.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:25.997 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[f68c6aa4-ecbf-4434-b2cd-7f533711339b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:25.998 158130 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: global Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: log /dev/log local0 debug Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: log-tag haproxy-metadata-proxy-492ef1de-4a68-49e4-b736-13cdb2eb7b59 Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: user root Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: group root Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: maxconn 1024 Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: pidfile /var/lib/neutron/external/pids/492ef1de-4a68-49e4-b736-13cdb2eb7b59.pid.haproxy Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: daemon Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: defaults Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: log global Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: mode http Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: option httplog Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: option dontlognull Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: option http-server-close Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: option forwardfor Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: retries 3 Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: timeout http-request 30s Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: timeout connect 30s Nov 
28 05:01:25 localhost ovn_metadata_agent[158125]: timeout client 32s Nov 28 05:01:25 localhost ovn_metadata_agent[158125]: timeout server 32s Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: timeout http-keep-alive 30s Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: listen listener Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: bind 169.254.169.254:80 Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: server metadata /var/lib/neutron/metadata_proxy Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: http-request add-header X-OVN-Network-ID 492ef1de-4a68-49e4-b736-13cdb2eb7b59 Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:25.999 158130 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59', 'env', 'PROCESS_TAG=haproxy-492ef1de-4a68-49e4-b736-13cdb2eb7b59', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/492ef1de-4a68-49e4-b736-13cdb2eb7b59.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m Nov 28 05:01:26 localhost nova_compute[279673]: 2025-11-28 10:01:26.052 279685 DEBUG nova.virt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Emitting event Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 28 05:01:26 localhost nova_compute[279673]: 2025-11-28 10:01:26.052 279685 INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] VM Started (Lifecycle Event)#033[00m Nov 28 05:01:26 localhost nova_compute[279673]: 2025-11-28 10:01:26.079 279685 DEBUG 
nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 28 05:01:26 localhost nova_compute[279673]: 2025-11-28 10:01:26.087 279685 DEBUG nova.virt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Emitting event Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 28 05:01:26 localhost nova_compute[279673]: 2025-11-28 10:01:26.087 279685 INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] VM Paused (Lifecycle Event)#033[00m Nov 28 05:01:26 localhost nova_compute[279673]: 2025-11-28 10:01:26.109 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 28 05:01:26 localhost nova_compute[279673]: 2025-11-28 10:01:26.113 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Nov 28 05:01:26 localhost nova_compute[279673]: 2025-11-28 10:01:26.143 279685 INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] During sync_power_state the instance has a pending task (spawning). 
Skip.#033[00m Nov 28 05:01:26 localhost systemd[1]: var-lib-containers-storage-overlay-776ee30c8f4b1d3b8a5504203661447b2ee50a3f6f2aafa660ced48066eed543-merged.mount: Deactivated successfully. Nov 28 05:01:26 localhost systemd[1]: run-netns-qdhcp\x2d0303a35a\x2daae2\x2d4e58\x2db0e5\x2d9091112c9857.mount: Deactivated successfully. Nov 28 05:01:26 localhost podman[310018]: Nov 28 05:01:26 localhost podman[310018]: 2025-11-28 10:01:26.434968655 +0000 UTC m=+0.095482339 container create ea1165d4ce9dd4e144a392a12c54b085b9a405b05eab9aaf73c2d083b4c59d84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:01:26 localhost systemd[1]: Started libpod-conmon-ea1165d4ce9dd4e144a392a12c54b085b9a405b05eab9aaf73c2d083b4c59d84.scope. Nov 28 05:01:26 localhost podman[310018]: 2025-11-28 10:01:26.386059385 +0000 UTC m=+0.046573119 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Nov 28 05:01:26 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:01:26.489 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:01:26 localhost systemd[1]: Started libcrun container. 
Nov 28 05:01:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c783556db9b2f3943c08dc4e97d79185a998a122f3fd4b4982860e000e73c01f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:01:26 localhost podman[310018]: 2025-11-28 10:01:26.51245729 +0000 UTC m=+0.172970984 container init ea1165d4ce9dd4e144a392a12c54b085b9a405b05eab9aaf73c2d083b4c59d84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:01:26 localhost podman[310018]: 2025-11-28 10:01:26.521639963 +0000 UTC m=+0.182153647 container start ea1165d4ce9dd4e144a392a12c54b085b9a405b05eab9aaf73c2d083b4c59d84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:01:26 localhost neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59[310032]: [NOTICE] (310036) : New worker (310038) forked Nov 28 05:01:26 localhost neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59[310032]: [NOTICE] (310036) : Loading success. 
Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:26.583 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 62b8533f-b250-4475-80c2-28c4543536b5 in datapath ad2d8cf7-987d-4804-acbd-9b3e248dc8cd unbound from our chassis#033[00m Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:26.587 158130 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ad2d8cf7-987d-4804-acbd-9b3e248dc8cd#033[00m Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:26.598 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[5314b3d9-4408-4641-84ea-c6114d5a98ff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:26.599 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapad2d8cf7-91 in ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:26.601 158233 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapad2d8cf7-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:26.601 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[fbb9f30f-0ba7-47da-9d3a-ebf9dc870d6c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:26.603 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[f0dd62a3-5ace-4738-bc52-01b683d436e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:26.613 158264 DEBUG oslo.privsep.daemon [-] privsep: 
reply[b7301a2a-b922-48b8-9bde-7d3fa86df545]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:26.626 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[d0126d58-74ef-4a76-8b5e-db6d73769f70]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:26.654 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[8c0e96ed-55c7-43af-98ba-65ddf49055d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:26 localhost systemd-udevd[309929]: Network interface NamePolicy= disabled on kernel command line. Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:26.661 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[1a5274ff-f0c8-4d02-bf88-327c42682adb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:26 localhost NetworkManager[5967]: [1764324086.6629] manager: (tapad2d8cf7-90): new Veth device (/org/freedesktop/NetworkManager/Devices/27) Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:26.700 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[c4eb4c5f-24c8-4e60-9961-053f92378d47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:26.704 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[cd6b197a-a339-410a-9db8-40ddde0cdf59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:26 localhost NetworkManager[5967]: [1764324086.7296] device (tapad2d8cf7-90): carrier: link connected Nov 28 05:01:26 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapad2d8cf7-90: link becomes ready Nov 28 05:01:26 localhost 
ovn_metadata_agent[158125]: 2025-11-28 10:01:26.736 158244 DEBUG oslo.privsep.daemon [-] privsep: reply[bbc712f1-e71e-4cc1-9b23-6fffc67a1ff7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:26.753 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[5122f276-b575-4e2d-9aca-47d455547d90]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapad2d8cf7-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:14:78:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 
'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1192083, 'reachable_time': 38285, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 
1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 310082, 'error': None, 'target': 'ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:26.772 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[17a3e11f-1b62-40d0-8554-b959b5f2a6be]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe14:785b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1192083, 'tstamp': 1192083}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 310083, 'error': None, 'target': 'ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:26 localhost dnsmasq[309233]: exiting on receipt of SIGTERM Nov 28 05:01:26 localhost podman[310069]: 2025-11-28 10:01:26.772780383 +0000 UTC m=+0.064143258 container kill 
2469adc38390e7842b1677a959dd45694075fcff55aac0a843082bc60cc47242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b42f9a09-f299-4469-8a2a-b6b8c70a7aed, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:01:26 localhost systemd[1]: libpod-2469adc38390e7842b1677a959dd45694075fcff55aac0a843082bc60cc47242.scope: Deactivated successfully. Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:26.788 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[3179db9c-f424-442f-b747-2576f9bb2e3a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapad2d8cf7-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:14:78:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 
'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1192083, 'reachable_time': 38285, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 
'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 310085, 'error': None, 'target': 'ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:26.817 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[f41d555c-d38e-4683-8b71-d1beb6b8c1f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:26 localhost podman[310086]: 2025-11-28 10:01:26.847862216 +0000 UTC m=+0.059493625 container died 2469adc38390e7842b1677a959dd45694075fcff55aac0a843082bc60cc47242 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b42f9a09-f299-4469-8a2a-b6b8c70a7aed, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true) Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:26.874 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[a458e751-30ce-4f1d-b41b-05ef7b2f8a8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:26.876 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapad2d8cf7-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:26.876 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:26.877 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapad2d8cf7-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:01:26 localhost kernel: device tapad2d8cf7-90 entered promiscuous mode Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:26.884 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapad2d8cf7-90, 
col_values=(('external_ids', {'iface-id': 'acd4bbc3-c7c4-47d8-b58b-29abee48b714'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:01:26 localhost nova_compute[279673]: 2025-11-28 10:01:26.884 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:26 localhost ovn_controller[152322]: 2025-11-28T10:01:26Z|00133|binding|INFO|Releasing lport acd4bbc3-c7c4-47d8-b58b-29abee48b714 from this chassis (sb_readonly=0) Nov 28 05:01:26 localhost podman[310086]: 2025-11-28 10:01:26.891620438 +0000 UTC m=+0.103251807 container cleanup 2469adc38390e7842b1677a959dd45694075fcff55aac0a843082bc60cc47242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b42f9a09-f299-4469-8a2a-b6b8c70a7aed, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 28 05:01:26 localhost systemd[1]: libpod-conmon-2469adc38390e7842b1677a959dd45694075fcff55aac0a843082bc60cc47242.scope: Deactivated successfully. 
Nov 28 05:01:26 localhost nova_compute[279673]: 2025-11-28 10:01:26.897 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:26.900 158130 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ad2d8cf7-987d-4804-acbd-9b3e248dc8cd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ad2d8cf7-987d-4804-acbd-9b3e248dc8cd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:26.901 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[e7295a9a-cc9c-405e-bf3b-44dc69139025]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:26.904 158130 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: global Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: log /dev/log local0 debug Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: log-tag haproxy-metadata-proxy-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: user root Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: group root Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: maxconn 1024 Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: pidfile /var/lib/neutron/external/pids/ad2d8cf7-987d-4804-acbd-9b3e248dc8cd.pid.haproxy Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: daemon Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: defaults Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: log global Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: mode http Nov 28 05:01:26 localhost 
ovn_metadata_agent[158125]: option httplog Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: option dontlognull Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: option http-server-close Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: option forwardfor Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: retries 3 Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: timeout http-request 30s Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: timeout connect 30s Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: timeout client 32s Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: timeout server 32s Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: timeout http-keep-alive 30s Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: listen listener Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: bind 169.254.169.254:80 Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: server metadata /var/lib/neutron/metadata_proxy Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: http-request add-header X-OVN-Network-ID ad2d8cf7-987d-4804-acbd-9b3e248dc8cd Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m Nov 28 05:01:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:26.905 158130 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd', 'env', 'PROCESS_TAG=haproxy-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ad2d8cf7-987d-4804-acbd-9b3e248dc8cd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m Nov 28 05:01:26 localhost podman[310088]: 2025-11-28 10:01:26.93666853 +0000 UTC m=+0.139666414 container remove 
2469adc38390e7842b1677a959dd45694075fcff55aac0a843082bc60cc47242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b42f9a09-f299-4469-8a2a-b6b8c70a7aed, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 28 05:01:27 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:01:27.157 261084 INFO neutron.agent.dhcp.agent [None req-01e144de-e5a2-4ccb-86d0-225b19311ea5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:01:27 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:01:27.159 261084 INFO neutron.agent.dhcp.agent [None req-01e144de-e5a2-4ccb-86d0-225b19311ea5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:01:27 localhost podman[310146]: Nov 28 05:01:27 localhost podman[310146]: 2025-11-28 10:01:27.352445939 +0000 UTC m=+0.094525190 container create 96769bdd9f5e24c0c8db2cb6e54f5b73b59308d39c1b4f742e5ac964d183f15a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2) Nov 28 05:01:27 localhost systemd[1]: Started libpod-conmon-96769bdd9f5e24c0c8db2cb6e54f5b73b59308d39c1b4f742e5ac964d183f15a.scope. 
Nov 28 05:01:27 localhost systemd[1]: var-lib-containers-storage-overlay-1d5f4174ee8c8f2aa1c8d3fbcaed9ef7f054f6f66b168db2dbfe4a5c6fb91850-merged.mount: Deactivated successfully. Nov 28 05:01:27 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2469adc38390e7842b1677a959dd45694075fcff55aac0a843082bc60cc47242-userdata-shm.mount: Deactivated successfully. Nov 28 05:01:27 localhost systemd[1]: run-netns-qdhcp\x2db42f9a09\x2df299\x2d4469\x2d8a2a\x2db6b8c70a7aed.mount: Deactivated successfully. Nov 28 05:01:27 localhost podman[310146]: 2025-11-28 10:01:27.308352457 +0000 UTC m=+0.050431748 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Nov 28 05:01:27 localhost systemd[1]: Started libcrun container. Nov 28 05:01:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fa67beacec21ebd5b294a9ebc5d587ffcfb9867f9ba8eaa2c201ba3d1c08b89b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:01:27 localhost podman[310146]: 2025-11-28 10:01:27.425171399 +0000 UTC m=+0.167250610 container init 96769bdd9f5e24c0c8db2cb6e54f5b73b59308d39c1b4f742e5ac964d183f15a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 28 05:01:27 localhost podman[310146]: 2025-11-28 10:01:27.43434656 +0000 UTC m=+0.176425811 container start 96769bdd9f5e24c0c8db2cb6e54f5b73b59308d39c1b4f742e5ac964d183f15a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, 
name=neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Nov 28 05:01:27 localhost neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd[310161]: [NOTICE] (310165) : New worker (310167) forked Nov 28 05:01:27 localhost neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd[310161]: [NOTICE] (310165) : Loading success. Nov 28 05:01:29 localhost nova_compute[279673]: 2025-11-28 10:01:29.054 279685 DEBUG nova.compute.manager [req-f0f34203-8778-40b9-ab00-13ff7807b3f1 req-0c037a56-9e5f-4e39-8cf1-9de1bb81aef0 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received event network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Nov 28 05:01:29 localhost nova_compute[279673]: 2025-11-28 10:01:29.055 279685 DEBUG oslo_concurrency.lockutils [req-f0f34203-8778-40b9-ab00-13ff7807b3f1 req-0c037a56-9e5f-4e39-8cf1-9de1bb81aef0 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:01:29 localhost nova_compute[279673]: 2025-11-28 10:01:29.056 279685 DEBUG oslo_concurrency.lockutils [req-f0f34203-8778-40b9-ab00-13ff7807b3f1 req-0c037a56-9e5f-4e39-8cf1-9de1bb81aef0 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" 
acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:01:29 localhost nova_compute[279673]: 2025-11-28 10:01:29.056 279685 DEBUG oslo_concurrency.lockutils [req-f0f34203-8778-40b9-ab00-13ff7807b3f1 req-0c037a56-9e5f-4e39-8cf1-9de1bb81aef0 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:01:29 localhost nova_compute[279673]: 2025-11-28 10:01:29.057 279685 DEBUG nova.compute.manager [req-f0f34203-8778-40b9-ab00-13ff7807b3f1 req-0c037a56-9e5f-4e39-8cf1-9de1bb81aef0 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Processing event network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m Nov 28 05:01:29 localhost nova_compute[279673]: 2025-11-28 10:01:29.059 279685 DEBUG nova.compute.manager [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m Nov 28 05:01:29 localhost nova_compute[279673]: 2025-11-28 10:01:29.063 279685 DEBUG nova.virt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Emitting event Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 28 05:01:29 localhost nova_compute[279673]: 2025-11-28 10:01:29.064 279685 INFO 
nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] VM Resumed (Lifecycle Event)#033[00m Nov 28 05:01:29 localhost nova_compute[279673]: 2025-11-28 10:01:29.068 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m Nov 28 05:01:29 localhost nova_compute[279673]: 2025-11-28 10:01:29.073 279685 INFO nova.virt.libvirt.driver [-] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Instance spawned successfully.#033[00m Nov 28 05:01:29 localhost nova_compute[279673]: 2025-11-28 10:01:29.073 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m Nov 28 05:01:29 localhost nova_compute[279673]: 2025-11-28 10:01:29.390 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 28 05:01:29 localhost nova_compute[279673]: 2025-11-28 10:01:29.395 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, 
current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Nov 28 05:01:29 localhost nova_compute[279673]: 2025-11-28 10:01:29.466 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m Nov 28 05:01:29 localhost nova_compute[279673]: 2025-11-28 10:01:29.467 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m Nov 28 05:01:29 localhost nova_compute[279673]: 2025-11-28 10:01:29.468 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m Nov 28 05:01:29 localhost nova_compute[279673]: 2025-11-28 10:01:29.468 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m Nov 28 05:01:29 localhost nova_compute[279673]: 2025-11-28 10:01:29.469 279685 DEBUG 
nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m Nov 28 05:01:29 localhost nova_compute[279673]: 2025-11-28 10:01:29.470 279685 DEBUG nova.virt.libvirt.driver [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m Nov 28 05:01:29 localhost nova_compute[279673]: 2025-11-28 10:01:29.529 279685 INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] During sync_power_state the instance has a pending task (spawning). 
Skip.#033[00m Nov 28 05:01:29 localhost nova_compute[279673]: 2025-11-28 10:01:29.584 279685 INFO nova.compute.manager [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Took 8.59 seconds to spawn the instance on the hypervisor.#033[00m Nov 28 05:01:29 localhost nova_compute[279673]: 2025-11-28 10:01:29.585 279685 DEBUG nova.compute.manager [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 28 05:01:29 localhost nova_compute[279673]: 2025-11-28 10:01:29.718 279685 INFO nova.compute.manager [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Took 9.61 seconds to build instance.#033[00m Nov 28 05:01:29 localhost nova_compute[279673]: 2025-11-28 10:01:29.788 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:29 localhost nova_compute[279673]: 2025-11-28 10:01:29.809 279685 DEBUG oslo_concurrency.lockutils [None req-a3a18895-c076-456e-b839-5cbe1f76a7ca 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 9.771s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:01:29 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 
05:01:29 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0. Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.859230) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40 Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324089859320, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 1068, "num_deletes": 251, "total_data_size": 891118, "memory_usage": 909088, "flush_reason": "Manual Compaction"} Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324089867528, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 599688, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23666, "largest_seqno": 24733, "table_properties": {"data_size": 595874, "index_size": 1477, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 10741, "raw_average_key_size": 21, "raw_value_size": 587279, "raw_average_value_size": 1149, "num_data_blocks": 66, "num_entries": 511, "num_filter_entries": 511, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", 
"prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324014, "oldest_key_time": 1764324014, "file_creation_time": 1764324089, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}} Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 8379 microseconds, and 3319 cpu microseconds. Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.867609) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 599688 bytes OK Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.867647) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.869743) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.869766) EVENT_LOG_v1 {"time_micros": 1764324089869759, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.869790) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 
0.25 Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 886127, prev total WAL file size 886451, number of live WAL files 2. Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.870709) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033373630' seq:72057594037927935, type:22 .. '6D6772737461740034303132' seq:0, type:0; will stop at (end) Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(585KB)], [39(18MB)] Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324089870762, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 20054849, "oldest_snapshot_seqno": -1} Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 12083 keys, 18080755 bytes, temperature: kUnknown Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324089971265, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 18080755, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18012837, "index_size": 36649, "index_partitions": 0, "top_level_index_size": 0, 
"index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30213, "raw_key_size": 324104, "raw_average_key_size": 26, "raw_value_size": 17808190, "raw_average_value_size": 1473, "num_data_blocks": 1394, "num_entries": 12083, "num_filter_entries": 12083, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764324089, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}} Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.973429) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 18080755 bytes Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.975165) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 199.3 rd, 179.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 18.6 +0.0 blob) out(17.2 +0.0 blob), read-write-amplify(63.6) write-amplify(30.2) OK, records in: 12575, records dropped: 492 output_compression: NoCompression Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.975186) EVENT_LOG_v1 {"time_micros": 1764324089975178, "job": 22, "event": "compaction_finished", "compaction_time_micros": 100604, "compaction_time_cpu_micros": 32451, "output_level": 6, "num_output_files": 1, "total_output_size": 18080755, "num_input_records": 12575, "num_output_records": 12083, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324089975338, "job": 22, "event": "table_file_deletion", "file_number": 41} Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324089976866, 
"job": 22, "event": "table_file_deletion", "file_number": 39} Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.870608) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.976935) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.976941) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.976948) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.976957) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:01:29 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:01:29.976959) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:01:29 localhost nova_compute[279673]: 2025-11-28 10:01:29.992 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:31 localhost nova_compute[279673]: 2025-11-28 10:01:31.316 279685 DEBUG nova.compute.manager [req-409bddd0-cabe-4f03-b1f7-6643140a842b req-1dc67035-69e9-4136-a58a-9f3d55a280e5 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received event network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Nov 28 05:01:31 localhost nova_compute[279673]: 2025-11-28 10:01:31.317 279685 DEBUG oslo_concurrency.lockutils 
[req-409bddd0-cabe-4f03-b1f7-6643140a842b req-1dc67035-69e9-4136-a58a-9f3d55a280e5 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:01:31 localhost nova_compute[279673]: 2025-11-28 10:01:31.317 279685 DEBUG oslo_concurrency.lockutils [req-409bddd0-cabe-4f03-b1f7-6643140a842b req-1dc67035-69e9-4136-a58a-9f3d55a280e5 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:01:31 localhost nova_compute[279673]: 2025-11-28 10:01:31.318 279685 DEBUG oslo_concurrency.lockutils [req-409bddd0-cabe-4f03-b1f7-6643140a842b req-1dc67035-69e9-4136-a58a-9f3d55a280e5 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:01:31 localhost nova_compute[279673]: 2025-11-28 10:01:31.318 279685 DEBUG nova.compute.manager [req-409bddd0-cabe-4f03-b1f7-6643140a842b req-1dc67035-69e9-4136-a58a-9f3d55a280e5 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] No waiting events found dispatching network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Nov 28 05:01:31 localhost nova_compute[279673]: 2025-11-28 10:01:31.319 279685 
WARNING nova.compute.manager [req-409bddd0-cabe-4f03-b1f7-6643140a842b req-1dc67035-69e9-4136-a58a-9f3d55a280e5 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received unexpected event network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 for instance with vm_state active and task_state None.#033[00m Nov 28 05:01:31 localhost nova_compute[279673]: 2025-11-28 10:01:31.997 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:33 localhost nova_compute[279673]: 2025-11-28 10:01:33.825 279685 DEBUG nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Check if temp file /var/lib/nova/instances/tmpx5ac6ig2 exists to indicate shared storage is being used for migration. Exists? 
False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m Nov 28 05:01:33 localhost nova_compute[279673]: 2025-11-28 10:01:33.826 279685 DEBUG nova.compute.manager [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] source check data is LibvirtLiveMigrateData(bdms=,block_migration=False,disk_available_mb=13312,disk_over_commit=False,dst_numa_info=,dst_supports_numa_live_migration=,dst_wants_file_backed_memory=False,file_backed_memory_discard=,filename='tmpx5ac6ig2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='c06e2ffc-a8af-41b6-ab88-680ef1f6fe50',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=,old_vol_attachment_ids=,serial_listen_addr=None,serial_listen_ports=,src_supports_native_luks=,src_supports_numa_live_migration=,supported_perf_events=,target_connect_addr=,vifs=[VIFMigrateData],wait_for_vif_plugged=) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m Nov 28 05:01:34 localhost nova_compute[279673]: 2025-11-28 10:01:34.822 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:34 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:01:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 05:01:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. 
Nov 28 05:01:34 localhost podman[310177]: 2025-11-28 10:01:34.935299881 +0000 UTC m=+0.067633825 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0) Nov 28 05:01:34 localhost podman[310177]: 2025-11-28 10:01:34.944977097 +0000 UTC m=+0.077311041 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3) Nov 28 05:01:34 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 05:01:34 localhost nova_compute[279673]: 2025-11-28 10:01:34.994 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:35 localhost podman[310176]: 2025-11-28 10:01:35.027184009 +0000 UTC m=+0.154721235 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 05:01:35 localhost podman[310176]: 2025-11-28 10:01:35.063319687 +0000 UTC m=+0.190856933 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 28 05:01:35 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 05:01:35 localhost nova_compute[279673]: 2025-11-28 10:01:35.784 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:01:35 localhost nova_compute[279673]: 2025-11-28 10:01:35.785 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:01:35 localhost nova_compute[279673]: 2025-11-28 10:01:35.785 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 28 05:01:36 localhost nova_compute[279673]: 2025-11-28 10:01:36.959 279685 DEBUG nova.compute.manager [req-ef84a24d-3da5-4aee-ac61-428d0f1384a9 req-9b75021e-a1ef-4d2c-90cf-416bc75dd7f4 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received event network-vif-unplugged-62b8533f-b250-4475-80c2-28c4543536b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Nov 28 05:01:36 localhost nova_compute[279673]: 2025-11-28 10:01:36.959 279685 DEBUG oslo_concurrency.lockutils [req-ef84a24d-3da5-4aee-ac61-428d0f1384a9 req-9b75021e-a1ef-4d2c-90cf-416bc75dd7f4 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:01:36 localhost nova_compute[279673]: 
2025-11-28 10:01:36.959 279685 DEBUG oslo_concurrency.lockutils [req-ef84a24d-3da5-4aee-ac61-428d0f1384a9 req-9b75021e-a1ef-4d2c-90cf-416bc75dd7f4 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:01:36 localhost nova_compute[279673]: 2025-11-28 10:01:36.960 279685 DEBUG oslo_concurrency.lockutils [req-ef84a24d-3da5-4aee-ac61-428d0f1384a9 req-9b75021e-a1ef-4d2c-90cf-416bc75dd7f4 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:01:36 localhost nova_compute[279673]: 2025-11-28 10:01:36.960 279685 DEBUG nova.compute.manager [req-ef84a24d-3da5-4aee-ac61-428d0f1384a9 req-9b75021e-a1ef-4d2c-90cf-416bc75dd7f4 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] No waiting events found dispatching network-vif-unplugged-62b8533f-b250-4475-80c2-28c4543536b5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Nov 28 05:01:36 localhost nova_compute[279673]: 2025-11-28 10:01:36.960 279685 DEBUG nova.compute.manager [req-ef84a24d-3da5-4aee-ac61-428d0f1384a9 req-9b75021e-a1ef-4d2c-90cf-416bc75dd7f4 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received event network-vif-unplugged-62b8533f-b250-4475-80c2-28c4543536b5 for instance with task_state migrating. 
_process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m Nov 28 05:01:37 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e97 do_prune osdmap full prune enabled Nov 28 05:01:37 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e98 e98: 6 total, 6 up, 6 in Nov 28 05:01:37 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e98: 6 total, 6 up, 6 in Nov 28 05:01:37 localhost nova_compute[279673]: 2025-11-28 10:01:37.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:01:38 localhost nova_compute[279673]: 2025-11-28 10:01:38.602 279685 INFO nova.compute.manager [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Took 4.21 seconds for pre_live_migration on destination host np0005538515.localdomain.#033[00m Nov 28 05:01:38 localhost nova_compute[279673]: 2025-11-28 10:01:38.603 279685 DEBUG nova.compute.manager [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Instance event wait completed in 0 seconds for wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m Nov 28 05:01:38 localhost nova_compute[279673]: 2025-11-28 10:01:38.634 279685 DEBUG nova.compute.manager [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] live_migration data is 
LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=13312,disk_over_commit=False,dst_numa_info=,dst_supports_numa_live_migration=,dst_wants_file_backed_memory=False,file_backed_memory_discard=,filename='tmpx5ac6ig2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='c06e2ffc-a8af-41b6-ab88-680ef1f6fe50',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(62fb7f70-bf44-4fcf-8c08-e096ee66cd99),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=,src_supports_numa_live_migration=,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m Nov 28 05:01:38 localhost nova_compute[279673]: 2025-11-28 10:01:38.638 279685 DEBUG nova.objects.instance [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Lazy-loading 'migration_context' on Instance uuid c06e2ffc-a8af-41b6-ab88-680ef1f6fe50 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 05:01:38 localhost nova_compute[279673]: 2025-11-28 10:01:38.640 279685 DEBUG nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m Nov 28 05:01:38 localhost nova_compute[279673]: 2025-11-28 10:01:38.642 279685 DEBUG nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Operation thread is still 
running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m Nov 28 05:01:38 localhost nova_compute[279673]: 2025-11-28 10:01:38.642 279685 DEBUG nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m Nov 28 05:01:38 localhost nova_compute[279673]: 2025-11-28 10:01:38.655 279685 DEBUG nova.virt.libvirt.vif [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T10:01:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-915340611',display_name='tempest-LiveMigrationTest-server-915340611',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(5),hidden=False,host='np0005538513.localdomain',hostname='tempest-livemigrationtest-server-915340611',id=8,image_ref='85968a96-5a0e-43a4-9c04-3954f640a7ed',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-11-28T10:01:29Z,launched_on='np0005538513.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005538513.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='310745a04bd441169ff77f55ccf6bd7b',ramdisk_id='',reservation_id='r-1g332w05',resources=None,root_device_n
ame='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='85968a96-5a0e-43a4-9c04-3954f640a7ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-480152442',owner_user_name='tempest-LiveMigrationTest-480152442-project-member'},tags=,task_state='migrating',terminated_at=None,trusted_certs=,updated_at=2025-11-28T10:01:29Z,user_data=None,user_id='318114281cb649bc9eeed12ecdc7273f',uuid=c06e2ffc-a8af-41b6-ab88-680ef1f6fe50,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m Nov 28 05:01:38 localhost nova_compute[279673]: 2025-11-28 10:01:38.656 279685 DEBUG nova.network.os_vif_util [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Converting VIF {"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Nov 28 05:01:38 localhost nova_compute[279673]: 2025-11-28 10:01:38.657 279685 DEBUG nova.network.os_vif_util [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Converted object 
VIFOpenVSwitch(active=False,address=fa:16:3e:58:68:3c,bridge_name='br-int',has_traffic_filtering=True,id=62b8533f-b250-4475-80c2-28c4543536b5,network=Network(ad2d8cf7-987d-4804-acbd-9b3e248dc8cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap62b8533f-b2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Nov 28 05:01:38 localhost nova_compute[279673]: 2025-11-28 10:01:38.658 279685 DEBUG nova.virt.libvirt.migration [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Updating guest XML with vif config: Nov 28 05:01:38 localhost nova_compute[279673]: Nov 28 05:01:38 localhost nova_compute[279673]: Nov 28 05:01:38 localhost nova_compute[279673]: Nov 28 05:01:38 localhost nova_compute[279673]: Nov 28 05:01:38 localhost nova_compute[279673]: Nov 28 05:01:38 localhost nova_compute[279673]: Nov 28 05:01:38 localhost nova_compute[279673]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m Nov 28 05:01:38 localhost nova_compute[279673]: 2025-11-28 10:01:38.660 279685 DEBUG nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m Nov 28 05:01:39 localhost nova_compute[279673]: 2025-11-28 10:01:39.146 279685 DEBUG nova.virt.libvirt.migration [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Current None elapsed 0 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), 
(1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m Nov 28 05:01:39 localhost nova_compute[279673]: 2025-11-28 10:01:39.148 279685 INFO nova.virt.libvirt.migration [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m Nov 28 05:01:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e98 do_prune osdmap full prune enabled Nov 28 05:01:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e99 e99: 6 total, 6 up, 6 in Nov 28 05:01:39 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e99: 6 total, 6 up, 6 in Nov 28 05:01:39 localhost nova_compute[279673]: 2025-11-28 10:01:39.766 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:01:39 localhost nova_compute[279673]: 2025-11-28 10:01:39.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:01:39 localhost nova_compute[279673]: 2025-11-28 10:01:39.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:01:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e99 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:01:39 
localhost nova_compute[279673]: 2025-11-28 10:01:39.859 279685 DEBUG nova.compute.manager [req-c9215294-8049-4b3d-9aab-200ccce35078 req-e4383052-725d-4c0e-a6f0-f22f58c07d5b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received event network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Nov 28 05:01:39 localhost nova_compute[279673]: 2025-11-28 10:01:39.859 279685 DEBUG oslo_concurrency.lockutils [req-c9215294-8049-4b3d-9aab-200ccce35078 req-e4383052-725d-4c0e-a6f0-f22f58c07d5b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:01:39 localhost nova_compute[279673]: 2025-11-28 10:01:39.859 279685 DEBUG oslo_concurrency.lockutils [req-c9215294-8049-4b3d-9aab-200ccce35078 req-e4383052-725d-4c0e-a6f0-f22f58c07d5b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:01:39 localhost nova_compute[279673]: 2025-11-28 10:01:39.860 279685 DEBUG oslo_concurrency.lockutils [req-c9215294-8049-4b3d-9aab-200ccce35078 req-e4383052-725d-4c0e-a6f0-f22f58c07d5b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 
28 05:01:39 localhost nova_compute[279673]: 2025-11-28 10:01:39.860 279685 DEBUG nova.compute.manager [req-c9215294-8049-4b3d-9aab-200ccce35078 req-e4383052-725d-4c0e-a6f0-f22f58c07d5b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] No waiting events found dispatching network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Nov 28 05:01:39 localhost nova_compute[279673]: 2025-11-28 10:01:39.860 279685 WARNING nova.compute.manager [req-c9215294-8049-4b3d-9aab-200ccce35078 req-e4383052-725d-4c0e-a6f0-f22f58c07d5b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received unexpected event network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 for instance with vm_state active and task_state migrating.#033[00m Nov 28 05:01:39 localhost nova_compute[279673]: 2025-11-28 10:01:39.861 279685 DEBUG nova.compute.manager [req-c9215294-8049-4b3d-9aab-200ccce35078 req-e4383052-725d-4c0e-a6f0-f22f58c07d5b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received event network-changed-62b8533f-b250-4475-80c2-28c4543536b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Nov 28 05:01:39 localhost nova_compute[279673]: 2025-11-28 10:01:39.861 279685 DEBUG nova.compute.manager [req-c9215294-8049-4b3d-9aab-200ccce35078 req-e4383052-725d-4c0e-a6f0-f22f58c07d5b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Refreshing instance network info cache due to event network-changed-62b8533f-b250-4475-80c2-28c4543536b5. 
external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m Nov 28 05:01:39 localhost nova_compute[279673]: 2025-11-28 10:01:39.861 279685 DEBUG oslo_concurrency.lockutils [req-c9215294-8049-4b3d-9aab-200ccce35078 req-e4383052-725d-4c0e-a6f0-f22f58c07d5b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "refresh_cache-c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 05:01:39 localhost nova_compute[279673]: 2025-11-28 10:01:39.861 279685 DEBUG oslo_concurrency.lockutils [req-c9215294-8049-4b3d-9aab-200ccce35078 req-e4383052-725d-4c0e-a6f0-f22f58c07d5b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquired lock "refresh_cache-c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 05:01:39 localhost nova_compute[279673]: 2025-11-28 10:01:39.862 279685 DEBUG nova.network.neutron [req-c9215294-8049-4b3d-9aab-200ccce35078 req-e4383052-725d-4c0e-a6f0-f22f58c07d5b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Refreshing network info cache for port 62b8533f-b250-4475-80c2-28c4543536b5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m Nov 28 05:01:39 localhost nova_compute[279673]: 2025-11-28 10:01:39.863 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:40 localhost nova_compute[279673]: 2025-11-28 10:01:39.998 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:40 localhost podman[238687]: time="2025-11-28T10:01:40Z" level=info msg="List containers: received `last` parameter - 
overwriting `limit`" Nov 28 05:01:40 localhost podman[238687]: @ - - [28/Nov/2025:10:01:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158062 "" "Go-http-client/1.1" Nov 28 05:01:40 localhost podman[238687]: @ - - [28/Nov/2025:10:01:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20212 "" "Go-http-client/1.1" Nov 28 05:01:40 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e99 do_prune osdmap full prune enabled Nov 28 05:01:40 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e100 e100: 6 total, 6 up, 6 in Nov 28 05:01:40 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e100: 6 total, 6 up, 6 in Nov 28 05:01:40 localhost nova_compute[279673]: 2025-11-28 10:01:40.280 279685 DEBUG nova.virt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Emitting event Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 28 05:01:40 localhost nova_compute[279673]: 2025-11-28 10:01:40.281 279685 INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] VM Paused (Lifecycle Event)#033[00m Nov 28 05:01:40 localhost kernel: device tap62b8533f-b2 left promiscuous mode Nov 28 05:01:40 localhost NetworkManager[5967]: [1764324100.4345] device (tap62b8533f-b2): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed') Nov 28 05:01:40 localhost nova_compute[279673]: 2025-11-28 10:01:40.451 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:40 localhost ovn_controller[152322]: 2025-11-28T10:01:40Z|00134|binding|INFO|Releasing lport 62b8533f-b250-4475-80c2-28c4543536b5 from this chassis (sb_readonly=0) Nov 28 05:01:40 localhost ovn_controller[152322]: 
2025-11-28T10:01:40Z|00135|binding|INFO|Setting lport 62b8533f-b250-4475-80c2-28c4543536b5 down in Southbound Nov 28 05:01:40 localhost ovn_controller[152322]: 2025-11-28T10:01:40Z|00136|binding|INFO|Releasing lport fc82099a-3702-4952-add7-ba3d39b895a0 from this chassis (sb_readonly=0) Nov 28 05:01:40 localhost ovn_controller[152322]: 2025-11-28T10:01:40Z|00137|binding|INFO|Setting lport fc82099a-3702-4952-add7-ba3d39b895a0 down in Southbound Nov 28 05:01:40 localhost ovn_controller[152322]: 2025-11-28T10:01:40Z|00138|binding|INFO|Removing iface tap62b8533f-b2 ovn-installed in OVS Nov 28 05:01:40 localhost nova_compute[279673]: 2025-11-28 10:01:40.456 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:40 localhost nova_compute[279673]: 2025-11-28 10:01:40.469 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:40 localhost systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000008.scope: Deactivated successfully. Nov 28 05:01:40 localhost systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000008.scope: Consumed 11.866s CPU time. Nov 28 05:01:40 localhost systemd-machined[83422]: Machine qemu-4-instance-00000008 terminated. 
Nov 28 05:01:40 localhost nova_compute[279673]: 2025-11-28 10:01:40.540 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 28 05:01:40 localhost journal[201490]: cannot parse process status data Nov 28 05:01:40 localhost journal[201490]: Unable to get XATTR trusted.libvirt.security.ref_selinux on vms/c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_disk: No such file or directory Nov 28 05:01:40 localhost journal[201490]: Unable to get XATTR trusted.libvirt.security.ref_dac on vms/c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_disk: No such file or directory Nov 28 05:01:40 localhost nova_compute[279673]: 2025-11-28 10:01:40.630 279685 DEBUG nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m Nov 28 05:01:40 localhost nova_compute[279673]: 2025-11-28 10:01:40.631 279685 DEBUG nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m Nov 28 05:01:40 localhost nova_compute[279673]: 2025-11-28 10:01:40.631 279685 DEBUG nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Migration operation thread notification thread_finished 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m Nov 28 05:01:41 localhost ovn_controller[152322]: 2025-11-28T10:01:41Z|00139|binding|INFO|Releasing lport 6838a8cb-20d7-44c7-aad3-e7f442484bd5 from this chassis (sb_readonly=0) Nov 28 05:01:41 localhost ovn_controller[152322]: 2025-11-28T10:01:41Z|00140|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:01:41 localhost ovn_controller[152322]: 2025-11-28T10:01:41Z|00141|binding|INFO|Releasing lport acd4bbc3-c7c4-47d8-b58b-29abee48b714 from this chassis (sb_readonly=0) Nov 28 05:01:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:41.104 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:3c:a8 19.80.0.139'], port_security=['fa:16:3e:41:3c:a8 19.80.0.139'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['62b8533f-b250-4475-80c2-28c4543536b5'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-957922340', 'neutron:cidrs': '19.80.0.139/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-492ef1de-4a68-49e4-b736-13cdb2eb7b59', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-957922340', 'neutron:project_id': '310745a04bd441169ff77f55ccf6bd7b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '8cd6a72f-0cb3-42f5-95bb-7d1b962c8a1e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=96ffb618-d617-4e8c-a498-acb365ae5313, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], 
logical_port=fc82099a-3702-4952-add7-ba3d39b895a0) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:01:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:41.107 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:68:3c 10.100.0.12'], port_security=['fa:16:3e:58:68:3c 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain,np0005538515.localdomain', 'activation-strategy': 'rarp', 'additional-chassis-activated': '62c03cad-89c1-4fd7-973b-8f2a608c71f1'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-191355626', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c06e2ffc-a8af-41b6-ab88-680ef1f6fe50', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-191355626', 'neutron:project_id': '310745a04bd441169ff77f55ccf6bd7b', 'neutron:revision_number': '8', 'neutron:security_group_ids': '8cd6a72f-0cb3-42f5-95bb-7d1b962c8a1e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b393f93f-1891-43a2-aa26-a4cab2642f74, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=62b8533f-b250-4475-80c2-28c4543536b5) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:01:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:41.109 158130 INFO 
neutron.agent.ovn.metadata.agent [-] Port fc82099a-3702-4952-add7-ba3d39b895a0 in datapath 492ef1de-4a68-49e4-b736-13cdb2eb7b59 unbound from our chassis#033[00m Nov 28 05:01:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:41.112 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 492ef1de-4a68-49e4-b736-13cdb2eb7b59, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:01:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:41.113 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[cd0a85ab-b0bc-4f89-a58d-bf314de847c9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:41.114 158130 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59 namespace which is not needed anymore#033[00m Nov 28 05:01:41 localhost nova_compute[279673]: 2025-11-28 10:01:41.151 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:41 localhost nova_compute[279673]: 2025-11-28 10:01:41.170 279685 INFO nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m Nov 28 05:01:41 localhost neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59[310032]: [NOTICE] (310036) : haproxy version is 2.8.14-c23fe91 Nov 28 05:01:41 localhost neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59[310032]: [NOTICE] (310036) : path to executable is /usr/sbin/haproxy Nov 28 05:01:41 localhost 
neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59[310032]: [WARNING] (310036) : Exiting Master process... Nov 28 05:01:41 localhost neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59[310032]: [ALERT] (310036) : Current worker (310038) exited with code 143 (Terminated) Nov 28 05:01:41 localhost neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59[310032]: [WARNING] (310036) : All workers exited. Exiting... (0) Nov 28 05:01:41 localhost systemd[1]: libpod-ea1165d4ce9dd4e144a392a12c54b085b9a405b05eab9aaf73c2d083b4c59d84.scope: Deactivated successfully. Nov 28 05:01:41 localhost podman[310256]: 2025-11-28 10:01:41.328864742 +0000 UTC m=+0.080643514 container died ea1165d4ce9dd4e144a392a12c54b085b9a405b05eab9aaf73c2d083b4c59d84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 28 05:01:41 localhost podman[310256]: 2025-11-28 10:01:41.37447547 +0000 UTC m=+0.126254242 container cleanup ea1165d4ce9dd4e144a392a12c54b085b9a405b05eab9aaf73c2d083b4c59d84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 28 05:01:41 localhost podman[310270]: 2025-11-28 
10:01:41.406516833 +0000 UTC m=+0.066904423 container cleanup ea1165d4ce9dd4e144a392a12c54b085b9a405b05eab9aaf73c2d083b4c59d84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Nov 28 05:01:41 localhost systemd[1]: libpod-conmon-ea1165d4ce9dd4e144a392a12c54b085b9a405b05eab9aaf73c2d083b4c59d84.scope: Deactivated successfully. Nov 28 05:01:41 localhost podman[310284]: 2025-11-28 10:01:41.476717086 +0000 UTC m=+0.079337784 container remove ea1165d4ce9dd4e144a392a12c54b085b9a405b05eab9aaf73c2d083b4c59d84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125) Nov 28 05:01:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:41.481 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[a905a408-3a39-42c9-a5e1-144d9c989d4c]: (4, ('Fri Nov 28 10:01:41 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59 (ea1165d4ce9dd4e144a392a12c54b085b9a405b05eab9aaf73c2d083b4c59d84)\nea1165d4ce9dd4e144a392a12c54b085b9a405b05eab9aaf73c2d083b4c59d84\nFri Nov 28 10:01:41 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59 
(ea1165d4ce9dd4e144a392a12c54b085b9a405b05eab9aaf73c2d083b4c59d84)\nea1165d4ce9dd4e144a392a12c54b085b9a405b05eab9aaf73c2d083b4c59d84\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:41.484 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[6a54f5bd-7940-4219-93e2-18f9d8429bcb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:41.486 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap492ef1de-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:01:41 localhost nova_compute[279673]: 2025-11-28 10:01:41.489 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:41 localhost kernel: device tap492ef1de-40 left promiscuous mode Nov 28 05:01:41 localhost nova_compute[279673]: 2025-11-28 10:01:41.505 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:41.509 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[6e8a54b3-b31c-4d50-8454-03e6644c6bd9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:41.521 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[07f4f61b-6ebf-4cae-ba75-1ddbc8ba7733]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:41.523 158233 DEBUG oslo.privsep.daemon [-] privsep: 
reply[e48753c8-cf17-4a1c-8849-ccac5c016f00]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:41.536 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[3b20980e-dfd4-4a66-9f6c-d5d90903c704]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 
'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1191985, 'reachable_time': 34462, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 
'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 310307, 'error': None, 'target': 'ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:41.538 158264 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-492ef1de-4a68-49e4-b736-13cdb2eb7b59 deleted. 
remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m Nov 28 05:01:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:41.538 158264 DEBUG oslo.privsep.daemon [-] privsep: reply[e143b6e3-229f-4c90-b936-91f519f1f2ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:41.540 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 62b8533f-b250-4475-80c2-28c4543536b5 in datapath ad2d8cf7-987d-4804-acbd-9b3e248dc8cd unbound from our chassis#033[00m Nov 28 05:01:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:41.542 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ad2d8cf7-987d-4804-acbd-9b3e248dc8cd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:01:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:41.543 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[86ae7403-81ed-438d-bfea-edd54df14eac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:41.544 158130 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd namespace which is not needed anymore#033[00m Nov 28 05:01:41 localhost nova_compute[279673]: 2025-11-28 10:01:41.674 279685 DEBUG nova.virt.libvirt.guest [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'c06e2ffc-a8af-41b6-ab88-680ef1f6fe50' (instance-00000008) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m Nov 28 05:01:41 localhost nova_compute[279673]: 2025-11-28 10:01:41.675 279685 INFO 
nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Migration operation has completed#033[00m Nov 28 05:01:41 localhost nova_compute[279673]: 2025-11-28 10:01:41.677 279685 INFO nova.compute.manager [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] _post_live_migration() is started..#033[00m Nov 28 05:01:41 localhost neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd[310161]: [NOTICE] (310165) : haproxy version is 2.8.14-c23fe91 Nov 28 05:01:41 localhost neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd[310161]: [NOTICE] (310165) : path to executable is /usr/sbin/haproxy Nov 28 05:01:41 localhost neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd[310161]: [WARNING] (310165) : Exiting Master process... Nov 28 05:01:41 localhost neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd[310161]: [ALERT] (310165) : Current worker (310167) exited with code 143 (Terminated) Nov 28 05:01:41 localhost neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd[310161]: [WARNING] (310165) : All workers exited. Exiting... (0) Nov 28 05:01:41 localhost systemd[1]: libpod-96769bdd9f5e24c0c8db2cb6e54f5b73b59308d39c1b4f742e5ac964d183f15a.scope: Deactivated successfully. 
Nov 28 05:01:41 localhost podman[310325]: 2025-11-28 10:01:41.735760359 +0000 UTC m=+0.078354084 container died 96769bdd9f5e24c0c8db2cb6e54f5b73b59308d39c1b4f742e5ac964d183f15a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:01:41 localhost podman[310325]: 2025-11-28 10:01:41.777737326 +0000 UTC m=+0.120331021 container cleanup 96769bdd9f5e24c0c8db2cb6e54f5b73b59308d39c1b4f742e5ac964d183f15a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Nov 28 05:01:41 localhost podman[310339]: 2025-11-28 10:01:41.80817986 +0000 UTC m=+0.069880564 container cleanup 96769bdd9f5e24c0c8db2cb6e54f5b73b59308d39c1b4f742e5ac964d183f15a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS 
Stream 9 Base Image) Nov 28 05:01:41 localhost systemd[1]: libpod-conmon-96769bdd9f5e24c0c8db2cb6e54f5b73b59308d39c1b4f742e5ac964d183f15a.scope: Deactivated successfully. Nov 28 05:01:41 localhost podman[310354]: 2025-11-28 10:01:41.871552594 +0000 UTC m=+0.073673550 container remove 96769bdd9f5e24c0c8db2cb6e54f5b73b59308d39c1b4f742e5ac964d183f15a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 28 05:01:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:41.876 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[23c33e84-7605-4370-a13d-acbce6066084]: (4, ('Fri Nov 28 10:01:41 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd (96769bdd9f5e24c0c8db2cb6e54f5b73b59308d39c1b4f742e5ac964d183f15a)\n96769bdd9f5e24c0c8db2cb6e54f5b73b59308d39c1b4f742e5ac964d183f15a\nFri Nov 28 10:01:41 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd (96769bdd9f5e24c0c8db2cb6e54f5b73b59308d39c1b4f742e5ac964d183f15a)\n96769bdd9f5e24c0c8db2cb6e54f5b73b59308d39c1b4f742e5ac964d183f15a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:41.878 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[3d6fd427-ac2d-4505-93c3-b0b66b4fa54b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:41.879 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running 
txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapad2d8cf7-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:01:41 localhost nova_compute[279673]: 2025-11-28 10:01:41.882 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:41 localhost kernel: device tapad2d8cf7-90 left promiscuous mode Nov 28 05:01:41 localhost nova_compute[279673]: 2025-11-28 10:01:41.896 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:41.902 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[98e8c24c-16e6-494c-a02e-27763f191546]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:41.915 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[61c3b1fc-8513-4496-994e-f53512de474a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:41.916 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[b33ac3f5-e491-4973-aa11-eaf6cbf8dc85]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:41.932 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[262ba370-bf46-4d53-b903-ec6f96cbb1a1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 
65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 
'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1192075, 'reachable_time': 43246, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 310377, 'error': None, 'target': 
'ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:41.935 158264 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ad2d8cf7-987d-4804-acbd-9b3e248dc8cd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m Nov 28 05:01:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:41.935 158264 DEBUG oslo.privsep.daemon [-] privsep: reply[55f54df8-7e95-432b-babd-ce20cd6749ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:42 localhost systemd[1]: var-lib-containers-storage-overlay-fa67beacec21ebd5b294a9ebc5d587ffcfb9867f9ba8eaa2c201ba3d1c08b89b-merged.mount: Deactivated successfully. Nov 28 05:01:42 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-96769bdd9f5e24c0c8db2cb6e54f5b73b59308d39c1b4f742e5ac964d183f15a-userdata-shm.mount: Deactivated successfully. Nov 28 05:01:42 localhost systemd[1]: run-netns-ovnmeta\x2dad2d8cf7\x2d987d\x2d4804\x2dacbd\x2d9b3e248dc8cd.mount: Deactivated successfully. Nov 28 05:01:42 localhost systemd[1]: var-lib-containers-storage-overlay-c783556db9b2f3943c08dc4e97d79185a998a122f3fd4b4982860e000e73c01f-merged.mount: Deactivated successfully. Nov 28 05:01:42 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ea1165d4ce9dd4e144a392a12c54b085b9a405b05eab9aaf73c2d083b4c59d84-userdata-shm.mount: Deactivated successfully. Nov 28 05:01:42 localhost systemd[1]: run-netns-ovnmeta\x2d492ef1de\x2d4a68\x2d49e4\x2db736\x2d13cdb2eb7b59.mount: Deactivated successfully. 
Nov 28 05:01:42 localhost nova_compute[279673]: 2025-11-28 10:01:42.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:01:42 localhost nova_compute[279673]: 2025-11-28 10:01:42.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:01:42 localhost nova_compute[279673]: 2025-11-28 10:01:42.881 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:01:42 localhost nova_compute[279673]: 2025-11-28 10:01:42.882 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:01:42 localhost nova_compute[279673]: 2025-11-28 10:01:42.882 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:01:42 localhost nova_compute[279673]: 2025-11-28 10:01:42.883 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources 
for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 05:01:42 localhost nova_compute[279673]: 2025-11-28 10:01:42.883 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:01:43 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:43.072 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:01:43 localhost nova_compute[279673]: 2025-11-28 10:01:43.072 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:43 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:43.074 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 28 05:01:43 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:01:43 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/4067164618' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:01:43 localhost nova_compute[279673]: 2025-11-28 10:01:43.348 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:01:43 localhost nova_compute[279673]: 2025-11-28 10:01:43.431 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 05:01:43 localhost nova_compute[279673]: 2025-11-28 10:01:43.431 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 05:01:43 localhost nova_compute[279673]: 2025-11-28 10:01:43.667 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 05:01:43 localhost nova_compute[279673]: 2025-11-28 10:01:43.670 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11236MB free_disk=41.63758850097656GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": 
"7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 05:01:43 localhost nova_compute[279673]: 2025-11-28 10:01:43.670 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:01:43 localhost nova_compute[279673]: 2025-11-28 10:01:43.671 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:01:43 localhost nova_compute[279673]: 2025-11-28 10:01:43.760 279685 INFO nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Updating resource usage from migration 62fb7f70-bf44-4fcf-8c08-e096ee66cd99#033[00m Nov 28 05:01:43 localhost nova_compute[279673]: 2025-11-28 10:01:43.799 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 05:01:43 localhost nova_compute[279673]: 2025-11-28 10:01:43.800 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Migration 62fb7f70-bf44-4fcf-8c08-e096ee66cd99 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m Nov 28 05:01:43 localhost nova_compute[279673]: 2025-11-28 10:01:43.800 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 05:01:43 localhost nova_compute[279673]: 2025-11-28 10:01:43.800 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1152MB phys_disk=41GB used_disk=3GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 05:01:43 localhost nova_compute[279673]: 2025-11-28 10:01:43.862 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:01:44 localhost nova_compute[279673]: 2025-11-28 10:01:44.129 279685 DEBUG nova.compute.manager [req-384758b2-e7d4-4609-94ca-471d68e41979 req-2ce278cb-d756-499c-a7b6-939243af4700 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: 
c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received event network-vif-unplugged-62b8533f-b250-4475-80c2-28c4543536b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Nov 28 05:01:44 localhost nova_compute[279673]: 2025-11-28 10:01:44.130 279685 DEBUG oslo_concurrency.lockutils [req-384758b2-e7d4-4609-94ca-471d68e41979 req-2ce278cb-d756-499c-a7b6-939243af4700 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:01:44 localhost nova_compute[279673]: 2025-11-28 10:01:44.131 279685 DEBUG oslo_concurrency.lockutils [req-384758b2-e7d4-4609-94ca-471d68e41979 req-2ce278cb-d756-499c-a7b6-939243af4700 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:01:44 localhost nova_compute[279673]: 2025-11-28 10:01:44.132 279685 DEBUG oslo_concurrency.lockutils [req-384758b2-e7d4-4609-94ca-471d68e41979 req-2ce278cb-d756-499c-a7b6-939243af4700 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:01:44 localhost nova_compute[279673]: 2025-11-28 10:01:44.132 279685 DEBUG nova.compute.manager [req-384758b2-e7d4-4609-94ca-471d68e41979 req-2ce278cb-d756-499c-a7b6-939243af4700 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default 
default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] No waiting events found dispatching network-vif-unplugged-62b8533f-b250-4475-80c2-28c4543536b5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Nov 28 05:01:44 localhost nova_compute[279673]: 2025-11-28 10:01:44.133 279685 DEBUG nova.compute.manager [req-384758b2-e7d4-4609-94ca-471d68e41979 req-2ce278cb-d756-499c-a7b6-939243af4700 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received event network-vif-unplugged-62b8533f-b250-4475-80c2-28c4543536b5 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m Nov 28 05:01:44 localhost nova_compute[279673]: 2025-11-28 10:01:44.248 279685 DEBUG nova.network.neutron [req-c9215294-8049-4b3d-9aab-200ccce35078 req-e4383052-725d-4c0e-a6f0-f22f58c07d5b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Updated VIF entry in instance network info cache for port 62b8533f-b250-4475-80c2-28c4543536b5. 
_build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m Nov 28 05:01:44 localhost nova_compute[279673]: 2025-11-28 10:01:44.249 279685 DEBUG nova.network.neutron [req-c9215294-8049-4b3d-9aab-200ccce35078 req-e4383052-725d-4c0e-a6f0-f22f58c07d5b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Updating instance_info_cache with network_info: [{"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "np0005538515.localdomain"}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 05:01:44 localhost nova_compute[279673]: 2025-11-28 10:01:44.274 279685 DEBUG oslo_concurrency.lockutils [req-c9215294-8049-4b3d-9aab-200ccce35078 req-e4383052-725d-4c0e-a6f0-f22f58c07d5b 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Releasing lock 
"refresh_cache-c06e2ffc-a8af-41b6-ab88-680ef1f6fe50" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 05:01:44 localhost ovn_controller[152322]: 2025-11-28T10:01:44Z|00142|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:01:44 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:01:44 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1891951526' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:01:44 localhost nova_compute[279673]: 2025-11-28 10:01:44.342 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:01:44 localhost nova_compute[279673]: 2025-11-28 10:01:44.380 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:44 localhost nova_compute[279673]: 2025-11-28 10:01:44.382 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 05:01:44 localhost nova_compute[279673]: 2025-11-28 10:01:44.419 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 
'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 05:01:44 localhost nova_compute[279673]: 2025-11-28 10:01:44.507 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 05:01:44 localhost nova_compute[279673]: 2025-11-28 10:01:44.508 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.837s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:01:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. 
Nov 28 05:01:44 localhost nova_compute[279673]: 2025-11-28 10:01:44.822 279685 DEBUG nova.network.neutron [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Activated binding for port 62b8533f-b250-4475-80c2-28c4543536b5 and host np0005538515.localdomain migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m Nov 28 05:01:44 localhost nova_compute[279673]: 2025-11-28 10:01:44.823 279685 DEBUG nova.compute.manager [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m Nov 28 05:01:44 
localhost nova_compute[279673]: 2025-11-28 10:01:44.824 279685 DEBUG nova.virt.libvirt.vif [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T10:01:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-915340611',display_name='tempest-LiveMigrationTest-server-915340611',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(5),hidden=False,host='np0005538513.localdomain',hostname='tempest-livemigrationtest-server-915340611',id=8,image_ref='85968a96-5a0e-43a4-9c04-3954f640a7ed',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-11-28T10:01:29Z,launched_on='np0005538513.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005538513.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='310745a04bd441169ff77f55ccf6bd7b',ramdisk_id='',reservation_id='r-1g332w05',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='85968a96-5a0e-43a4-9c04-3954f640a7ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tem
pest-LiveMigrationTest-480152442',owner_user_name='tempest-LiveMigrationTest-480152442-project-member'},tags=,task_state='migrating',terminated_at=None,trusted_certs=,updated_at=2025-11-28T10:01:33Z,user_data=None,user_id='318114281cb649bc9eeed12ecdc7273f',uuid=c06e2ffc-a8af-41b6-ab88-680ef1f6fe50,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m Nov 28 05:01:44 localhost nova_compute[279673]: 2025-11-28 10:01:44.824 279685 DEBUG nova.network.os_vif_util [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Converting VIF {"id": "62b8533f-b250-4475-80c2-28c4543536b5", "address": "fa:16:3e:58:68:3c", "network": {"id": "ad2d8cf7-987d-4804-acbd-9b3e248dc8cd", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1085829019-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "310745a04bd441169ff77f55ccf6bd7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62b8533f-b2", "ovs_interfaceid": "62b8533f-b250-4475-80c2-28c4543536b5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Nov 28 05:01:44 localhost nova_compute[279673]: 2025-11-28 10:01:44.830 279685 DEBUG nova.network.os_vif_util [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:68:3c,bridge_name='br-int',has_traffic_filtering=True,id=62b8533f-b250-4475-80c2-28c4543536b5,network=Network(ad2d8cf7-987d-4804-acbd-9b3e248dc8cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap62b8533f-b2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Nov 28 05:01:44 localhost nova_compute[279673]: 2025-11-28 10:01:44.830 279685 DEBUG os_vif [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Unplugging vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:58:68:3c,bridge_name='br-int',has_traffic_filtering=True,id=62b8533f-b250-4475-80c2-28c4543536b5,network=Network(ad2d8cf7-987d-4804-acbd-9b3e248dc8cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap62b8533f-b2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m Nov 28 05:01:44 localhost podman[310423]: 2025-11-28 10:01:44.831707718 +0000 UTC m=+0.071083621 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git, io.openshift.expose-services=, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, distribution-scope=public, name=ubi9-minimal, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container) Nov 28 05:01:44 localhost nova_compute[279673]: 2025-11-28 10:01:44.833 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:44 localhost nova_compute[279673]: 2025-11-28 10:01:44.834 279685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62b8533f-b2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:01:44 localhost nova_compute[279673]: 2025-11-28 10:01:44.836 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:44 localhost nova_compute[279673]: 2025-11-28 10:01:44.840 279685 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:01:44 localhost nova_compute[279673]: 2025-11-28 10:01:44.843 279685 INFO os_vif [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:68:3c,bridge_name='br-int',has_traffic_filtering=True,id=62b8533f-b250-4475-80c2-28c4543536b5,network=Network(ad2d8cf7-987d-4804-acbd-9b3e248dc8cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap62b8533f-b2')#033[00m Nov 28 05:01:44 localhost nova_compute[279673]: 2025-11-28 10:01:44.843 279685 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:01:44 localhost nova_compute[279673]: 2025-11-28 10:01:44.843 279685 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:01:44 localhost nova_compute[279673]: 2025-11-28 10:01:44.843 279685 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:01:44 localhost nova_compute[279673]: 2025-11-28 10:01:44.844 279685 DEBUG nova.compute.manager [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m Nov 28 05:01:44 localhost nova_compute[279673]: 2025-11-28 10:01:44.844 279685 INFO nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Deleting instance files /var/lib/nova/instances/c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_del#033[00m Nov 28 05:01:44 localhost nova_compute[279673]: 2025-11-28 10:01:44.844 279685 INFO nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Deletion of /var/lib/nova/instances/c06e2ffc-a8af-41b6-ab88-680ef1f6fe50_del complete#033[00m Nov 28 05:01:44 localhost podman[310423]: 2025-11-28 10:01:44.849566496 +0000 UTC m=+0.088942399 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': 
'/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, architecture=x86_64, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9) Nov 28 05:01:44 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e100 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:01:44 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e100 do_prune osdmap full prune enabled Nov 28 05:01:44 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e101 e101: 6 total, 6 up, 6 in Nov 28 05:01:44 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. Nov 28 05:01:44 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e101: 6 total, 6 up, 6 in Nov 28 05:01:45 localhost nova_compute[279673]: 2025-11-28 10:01:45.004 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:45 localhost nova_compute[279673]: 2025-11-28 10:01:45.024 279685 DEBUG nova.compute.manager [req-ba2daeac-bca5-4b68-a4b7-b11188075754 req-0c29f3af-8b17-4500-9f55-b6c7a52059b7 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received event network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Nov 28 05:01:45 localhost nova_compute[279673]: 2025-11-28 10:01:45.025 279685 DEBUG oslo_concurrency.lockutils [req-ba2daeac-bca5-4b68-a4b7-b11188075754 req-0c29f3af-8b17-4500-9f55-b6c7a52059b7 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:01:45 localhost nova_compute[279673]: 2025-11-28 10:01:45.026 279685 DEBUG oslo_concurrency.lockutils [req-ba2daeac-bca5-4b68-a4b7-b11188075754 req-0c29f3af-8b17-4500-9f55-b6c7a52059b7 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:01:45 localhost nova_compute[279673]: 2025-11-28 10:01:45.026 279685 DEBUG oslo_concurrency.lockutils [req-ba2daeac-bca5-4b68-a4b7-b11188075754 req-0c29f3af-8b17-4500-9f55-b6c7a52059b7 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:01:45 localhost nova_compute[279673]: 2025-11-28 10:01:45.027 279685 DEBUG nova.compute.manager [req-ba2daeac-bca5-4b68-a4b7-b11188075754 req-0c29f3af-8b17-4500-9f55-b6c7a52059b7 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] No waiting events found dispatching network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Nov 28 05:01:45 localhost nova_compute[279673]: 2025-11-28 10:01:45.027 279685 WARNING nova.compute.manager [req-ba2daeac-bca5-4b68-a4b7-b11188075754 req-0c29f3af-8b17-4500-9f55-b6c7a52059b7 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: 
c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received unexpected event network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 for instance with vm_state active and task_state migrating.#033[00m Nov 28 05:01:46 localhost nova_compute[279673]: 2025-11-28 10:01:46.509 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:01:46 localhost nova_compute[279673]: 2025-11-28 10:01:46.510 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 28 05:01:46 localhost nova_compute[279673]: 2025-11-28 10:01:46.510 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 28 05:01:46 localhost nova_compute[279673]: 2025-11-28 10:01:46.593 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 05:01:46 localhost nova_compute[279673]: 2025-11-28 10:01:46.594 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 05:01:46 localhost nova_compute[279673]: 2025-11-28 10:01:46.595 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing 
network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 28 05:01:46 localhost nova_compute[279673]: 2025-11-28 10:01:46.596 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 05:01:47 localhost nova_compute[279673]: 2025-11-28 10:01:47.075 279685 DEBUG nova.compute.manager [req-fe34f289-5840-4fcb-aae1-c6fea9e23f9a req-9cecaca3-b598-4b80-bb4c-f0e6e3f3a95a 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received event network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Nov 28 05:01:47 localhost nova_compute[279673]: 2025-11-28 10:01:47.076 279685 DEBUG oslo_concurrency.lockutils [req-fe34f289-5840-4fcb-aae1-c6fea9e23f9a req-9cecaca3-b598-4b80-bb4c-f0e6e3f3a95a 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:01:47 localhost nova_compute[279673]: 2025-11-28 10:01:47.076 279685 DEBUG oslo_concurrency.lockutils [req-fe34f289-5840-4fcb-aae1-c6fea9e23f9a req-9cecaca3-b598-4b80-bb4c-f0e6e3f3a95a 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:01:47 localhost 
nova_compute[279673]: 2025-11-28 10:01:47.077 279685 DEBUG oslo_concurrency.lockutils [req-fe34f289-5840-4fcb-aae1-c6fea9e23f9a req-9cecaca3-b598-4b80-bb4c-f0e6e3f3a95a 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:01:47 localhost nova_compute[279673]: 2025-11-28 10:01:47.077 279685 DEBUG nova.compute.manager [req-fe34f289-5840-4fcb-aae1-c6fea9e23f9a req-9cecaca3-b598-4b80-bb4c-f0e6e3f3a95a 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] No waiting events found dispatching network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Nov 28 05:01:47 localhost nova_compute[279673]: 2025-11-28 10:01:47.078 279685 WARNING nova.compute.manager [req-fe34f289-5840-4fcb-aae1-c6fea9e23f9a req-9cecaca3-b598-4b80-bb4c-f0e6e3f3a95a 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received unexpected event network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 for instance with vm_state active and task_state migrating.#033[00m Nov 28 05:01:47 localhost nova_compute[279673]: 2025-11-28 10:01:47.200 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", 
"type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 05:01:47 localhost nova_compute[279673]: 2025-11-28 10:01:47.224 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 05:01:47 localhost nova_compute[279673]: 2025-11-28 10:01:47.224 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 28 05:01:48 localhost openstack_network_exporter[240658]: ERROR 10:01:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 05:01:48 localhost openstack_network_exporter[240658]: ERROR 10:01:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:01:48 localhost 
openstack_network_exporter[240658]: ERROR 10:01:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:01:48 localhost openstack_network_exporter[240658]: ERROR 10:01:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 05:01:48 localhost openstack_network_exporter[240658]: Nov 28 05:01:48 localhost openstack_network_exporter[240658]: ERROR 10:01:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 05:01:48 localhost openstack_network_exporter[240658]: Nov 28 05:01:48 localhost nova_compute[279673]: 2025-11-28 10:01:48.400 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 05:01:48 localhost podman[310444]: 2025-11-28 10:01:48.845612646 +0000 UTC m=+0.084692068 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, 
container_name=podman_exporter) Nov 28 05:01:48 localhost podman[310444]: 2025-11-28 10:01:48.855056706 +0000 UTC m=+0.094136128 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 05:01:48 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. 
Nov 28 05:01:49 localhost nova_compute[279673]: 2025-11-28 10:01:49.211 279685 DEBUG nova.compute.manager [req-79b26847-8288-4a75-84b9-cfb0ab3a3c84 req-909ec57b-1e3a-4128-b351-55bf34dc3c34 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received event network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Nov 28 05:01:49 localhost nova_compute[279673]: 2025-11-28 10:01:49.212 279685 DEBUG oslo_concurrency.lockutils [req-79b26847-8288-4a75-84b9-cfb0ab3a3c84 req-909ec57b-1e3a-4128-b351-55bf34dc3c34 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Acquiring lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:01:49 localhost nova_compute[279673]: 2025-11-28 10:01:49.212 279685 DEBUG oslo_concurrency.lockutils [req-79b26847-8288-4a75-84b9-cfb0ab3a3c84 req-909ec57b-1e3a-4128-b351-55bf34dc3c34 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:01:49 localhost nova_compute[279673]: 2025-11-28 10:01:49.213 279685 DEBUG oslo_concurrency.lockutils [req-79b26847-8288-4a75-84b9-cfb0ab3a3c84 req-909ec57b-1e3a-4128-b351-55bf34dc3c34 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:01:49 localhost nova_compute[279673]: 2025-11-28 10:01:49.213 279685 DEBUG nova.compute.manager [req-79b26847-8288-4a75-84b9-cfb0ab3a3c84 req-909ec57b-1e3a-4128-b351-55bf34dc3c34 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] No waiting events found dispatching network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Nov 28 05:01:49 localhost nova_compute[279673]: 2025-11-28 10:01:49.213 279685 WARNING nova.compute.manager [req-79b26847-8288-4a75-84b9-cfb0ab3a3c84 req-909ec57b-1e3a-4128-b351-55bf34dc3c34 0d543a6dcb564de5b39062ca08440499 e2c76e4d27554fd5a4f85cce208b136f - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Received unexpected event network-vif-plugged-62b8533f-b250-4475-80c2-28c4543536b5 for instance with vm_state active and task_state migrating.#033[00m Nov 28 05:01:49 localhost nova_compute[279673]: 2025-11-28 10:01:49.836 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:49 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:01:50 localhost nova_compute[279673]: 2025-11-28 10:01:50.005 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:50 localhost nova_compute[279673]: 2025-11-28 10:01:50.482 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m 
Nov 28 05:01:50 localhost nova_compute[279673]: 2025-11-28 10:01:50.520 279685 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Acquiring lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:01:50 localhost nova_compute[279673]: 2025-11-28 10:01:50.521 279685 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:01:50 localhost nova_compute[279673]: 2025-11-28 10:01:50.521 279685 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Lock "c06e2ffc-a8af-41b6-ab88-680ef1f6fe50-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:01:50 localhost nova_compute[279673]: 2025-11-28 10:01:50.540 279685 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:01:50 localhost nova_compute[279673]: 2025-11-28 10:01:50.541 279685 DEBUG oslo_concurrency.lockutils [None 
req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:01:50 localhost nova_compute[279673]: 2025-11-28 10:01:50.542 279685 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:01:50 localhost nova_compute[279673]: 2025-11-28 10:01:50.542 279685 DEBUG nova.compute.resource_tracker [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 05:01:50 localhost nova_compute[279673]: 2025-11-28 10:01:50.543 279685 DEBUG oslo_concurrency.processutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:01:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:50.841 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m 
Nov 28 05:01:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:50.842 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:01:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:50.843 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:01:50 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:01:50 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3559263610' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:01:50 localhost nova_compute[279673]: 2025-11-28 10:01:50.997 279685 DEBUG oslo_concurrency.processutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:01:51 localhost nova_compute[279673]: 2025-11-28 10:01:51.052 279685 DEBUG nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 05:01:51 localhost nova_compute[279673]: 2025-11-28 10:01:51.053 279685 DEBUG nova.virt.libvirt.driver [None 
req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 05:01:51 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:51.076 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:01:51 localhost nova_compute[279673]: 2025-11-28 10:01:51.217 279685 WARNING nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 05:01:51 localhost nova_compute[279673]: 2025-11-28 10:01:51.218 279685 DEBUG nova.compute.resource_tracker [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11214MB free_disk=41.700828552246094GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": 
"type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 05:01:51 localhost nova_compute[279673]: 2025-11-28 10:01:51.219 279685 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:01:51 localhost nova_compute[279673]: 2025-11-28 10:01:51.219 279685 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:01:51 localhost nova_compute[279673]: 2025-11-28 10:01:51.264 279685 DEBUG nova.compute.resource_tracker [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Migration for instance c06e2ffc-a8af-41b6-ab88-680ef1f6fe50 refers to another host's instance! 
_pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m Nov 28 05:01:51 localhost nova_compute[279673]: 2025-11-28 10:01:51.284 279685 DEBUG nova.compute.resource_tracker [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m Nov 28 05:01:51 localhost nova_compute[279673]: 2025-11-28 10:01:51.330 279685 DEBUG nova.compute.resource_tracker [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 05:01:51 localhost nova_compute[279673]: 2025-11-28 10:01:51.331 279685 DEBUG nova.compute.resource_tracker [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Migration 62fb7f70-bf44-4fcf-8c08-e096ee66cd99 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m Nov 28 05:01:51 localhost nova_compute[279673]: 2025-11-28 10:01:51.331 279685 DEBUG nova.compute.resource_tracker [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 05:01:51 localhost nova_compute[279673]: 2025-11-28 10:01:51.332 279685 DEBUG nova.compute.resource_tracker [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 05:01:51 localhost nova_compute[279673]: 2025-11-28 10:01:51.400 279685 DEBUG oslo_concurrency.processutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:01:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 05:01:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 05:01:51 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:01:51 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/1128492412' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:01:51 localhost podman[310508]: 2025-11-28 10:01:51.857012142 +0000 UTC m=+0.091646630 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS) Nov 28 05:01:51 localhost nova_compute[279673]: 2025-11-28 10:01:51.858 279685 DEBUG oslo_concurrency.processutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:01:51 localhost nova_compute[279673]: 2025-11-28 10:01:51.868 279685 DEBUG nova.compute.provider_tree [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 05:01:51 localhost nova_compute[279673]: 2025-11-28 10:01:51.900 279685 DEBUG nova.scheduler.client.report [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 05:01:51 localhost podman[310509]: 2025-11-28 10:01:51.916724804 +0000 UTC m=+0.148556976 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, 
config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:01:51 localhost podman[310508]: 2025-11-28 10:01:51.925352699 +0000 UTC m=+0.159987127 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller) Nov 28 05:01:51 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. Nov 28 05:01:51 localhost nova_compute[279673]: 2025-11-28 10:01:51.946 279685 DEBUG nova.compute.resource_tracker [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 05:01:51 localhost nova_compute[279673]: 2025-11-28 10:01:51.946 279685 DEBUG oslo_concurrency.lockutils [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.727s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:01:51 localhost podman[310509]: 2025-11-28 10:01:51.949505379 +0000 UTC m=+0.181337541 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 
(image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:01:51 localhost nova_compute[279673]: 2025-11-28 10:01:51.955 279685 INFO nova.compute.manager [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] 
Migrating instance to np0005538515.localdomain finished successfully.#033[00m Nov 28 05:01:51 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:01:51.958 261084 INFO neutron.agent.linux.ip_lib [None req-0fb38773-599a-4dc4-8224-09918295a826 - - - - - -] Device tap516917c4-99 cannot be used as it has no MAC address#033[00m Nov 28 05:01:51 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 05:01:51 localhost nova_compute[279673]: 2025-11-28 10:01:51.982 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:51 localhost kernel: device tap516917c4-99 entered promiscuous mode Nov 28 05:01:51 localhost NetworkManager[5967]: [1764324111.9889] manager: (tap516917c4-99): new Generic device (/org/freedesktop/NetworkManager/Devices/28) Nov 28 05:01:52 localhost ovn_controller[152322]: 2025-11-28T10:01:51Z|00143|binding|INFO|Claiming lport 516917c4-995e-4297-af25-c4f8499fcc7d for this chassis. Nov 28 05:01:52 localhost nova_compute[279673]: 2025-11-28 10:01:51.989 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:52 localhost ovn_controller[152322]: 2025-11-28T10:01:51Z|00144|binding|INFO|516917c4-995e-4297-af25-c4f8499fcc7d: Claiming unknown Nov 28 05:01:52 localhost systemd-udevd[310558]: Network interface NamePolicy= disabled on kernel command line. 
Nov 28 05:01:52 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:52.006 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1f9b84b894e641c4bee3ebcd1409ad9f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4106ac0-e782-4268-8bb4-37fc3096f0bc, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=516917c4-995e-4297-af25-c4f8499fcc7d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:01:52 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:52.012 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 516917c4-995e-4297-af25-c4f8499fcc7d in datapath b1696f4c-80ce-491f-ad1c-cc7f5b6700ba bound to our chassis#033[00m Nov 28 05:01:52 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:52.013 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b1696f4c-80ce-491f-ad1c-cc7f5b6700ba or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:01:52 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:52.013 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[f81a9356-1f1a-49f6-a305-e3c46a738cc2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:01:52 localhost journal[227875]: ethtool ioctl error on tap516917c4-99: No such device Nov 28 05:01:52 localhost ovn_controller[152322]: 2025-11-28T10:01:52Z|00145|binding|INFO|Setting lport 516917c4-995e-4297-af25-c4f8499fcc7d ovn-installed in OVS Nov 28 05:01:52 localhost ovn_controller[152322]: 2025-11-28T10:01:52Z|00146|binding|INFO|Setting lport 516917c4-995e-4297-af25-c4f8499fcc7d up in Southbound Nov 28 05:01:52 localhost journal[227875]: ethtool ioctl error on tap516917c4-99: No such device Nov 28 05:01:52 localhost nova_compute[279673]: 2025-11-28 10:01:52.032 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:52 localhost journal[227875]: ethtool ioctl error on tap516917c4-99: No such device Nov 28 05:01:52 localhost journal[227875]: ethtool ioctl error on tap516917c4-99: No such device Nov 28 05:01:52 localhost journal[227875]: ethtool ioctl error on tap516917c4-99: No such device Nov 28 05:01:52 localhost nova_compute[279673]: 2025-11-28 10:01:52.052 279685 INFO nova.scheduler.client.report [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] Deleted allocation for migration 62fb7f70-bf44-4fcf-8c08-e096ee66cd99#033[00m Nov 28 05:01:52 localhost nova_compute[279673]: 2025-11-28 10:01:52.052 279685 DEBUG nova.virt.libvirt.driver [None req-d01cd8c7-9a11-4b90-a050-431eb8c7d5f5 13de913d83d648919edb1da3601d106c f1bee3918a2345388c202f74e60af9c5 - - default default] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Live migration monitoring 
is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m Nov 28 05:01:52 localhost journal[227875]: ethtool ioctl error on tap516917c4-99: No such device Nov 28 05:01:52 localhost journal[227875]: ethtool ioctl error on tap516917c4-99: No such device Nov 28 05:01:52 localhost nova_compute[279673]: 2025-11-28 10:01:52.062 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:52 localhost journal[227875]: ethtool ioctl error on tap516917c4-99: No such device Nov 28 05:01:52 localhost nova_compute[279673]: 2025-11-28 10:01:52.119 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:52 localhost podman[310629]: Nov 28 05:01:52 localhost podman[310629]: 2025-11-28 10:01:52.988080458 +0000 UTC m=+0.104603049 container create 804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:01:53 localhost podman[310629]: 2025-11-28 10:01:52.936929099 +0000 UTC m=+0.053451720 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:01:53 localhost systemd[1]: Started libpod-conmon-804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd.scope. Nov 28 05:01:53 localhost systemd[1]: Started libcrun container. 
Nov 28 05:01:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e03c2285c42fffa1cd27962b49feefea5575696a1d40702567a3737442c3ea1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:01:53 localhost podman[310629]: 2025-11-28 10:01:53.074470847 +0000 UTC m=+0.190993428 container init 804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2) Nov 28 05:01:53 localhost podman[310629]: 2025-11-28 10:01:53.081240614 +0000 UTC m=+0.197763195 container start 804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS) Nov 28 05:01:53 localhost dnsmasq[310647]: started, version 2.85 cachesize 150 Nov 28 05:01:53 localhost dnsmasq[310647]: DNS service limited to local subnets Nov 28 05:01:53 localhost dnsmasq[310647]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:01:53 localhost dnsmasq[310647]: warning: no upstream servers 
configured Nov 28 05:01:53 localhost dnsmasq-dhcp[310647]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 28 05:01:53 localhost dnsmasq[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/addn_hosts - 0 addresses Nov 28 05:01:53 localhost dnsmasq-dhcp[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/host Nov 28 05:01:53 localhost dnsmasq-dhcp[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/opts Nov 28 05:01:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:01:53.260 261084 INFO neutron.agent.dhcp.agent [None req-a0457f08-cd61-4e70-9710-d84cc62f8162 - - - - - -] DHCP configuration for ports {'d866b7da-b4ec-4c1e-9c58-e58b19fd6a55'} is completed#033[00m Nov 28 05:01:53 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e101 do_prune osdmap full prune enabled Nov 28 05:01:53 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e102 e102: 6 total, 6 up, 6 in Nov 28 05:01:53 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e102: 6 total, 6 up, 6 in Nov 28 05:01:54 localhost nova_compute[279673]: 2025-11-28 10:01:54.736 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. 
Nov 28 05:01:54 localhost podman[310648]: 2025-11-28 10:01:54.82111265 +0000 UTC m=+0.057260827 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 28 05:01:54 localhost podman[310648]: 2025-11-28 10:01:54.833428348 +0000 UTC m=+0.069576575 container exec_died 
1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 28 05:01:54 localhost nova_compute[279673]: 2025-11-28 10:01:54.837 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:54 localhost systemd[1]: 
1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. Nov 28 05:01:54 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:01:55 localhost nova_compute[279673]: 2025-11-28 10:01:55.008 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:55 localhost nova_compute[279673]: 2025-11-28 10:01:55.366 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:01:55 localhost nova_compute[279673]: 2025-11-28 10:01:55.618 279685 DEBUG nova.virt.driver [-] Emitting event Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 28 05:01:55 localhost nova_compute[279673]: 2025-11-28 10:01:55.619 279685 INFO nova.compute.manager [-] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] VM Stopped (Lifecycle Event)#033[00m Nov 28 05:01:55 localhost nova_compute[279673]: 2025-11-28 10:01:55.643 279685 DEBUG nova.compute.manager [None req-5639471a-f41a-4d05-8dbe-5f39c4734ce6 - - - - - -] [instance: c06e2ffc-a8af-41b6-ab88-680ef1f6fe50] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 28 05:01:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:01:56.079 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:01:55Z, description=, device_id=ff13b2c3-ffbb-486b-ba3a-fa0f2960342d, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=38f26b93-3884-4247-b638-2104f92bdcaf, 
ip_allocation=immediate, mac_address=fa:16:3e:58:21:6e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:01:49Z, description=, dns_domain=, id=b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersV294TestFqdnHostnames-101229426-network, port_security_enabled=True, project_id=1f9b84b894e641c4bee3ebcd1409ad9f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62044, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=730, status=ACTIVE, subnets=['bac26e98-4c61-47a1-b281-0a4613971f3f'], tags=[], tenant_id=1f9b84b894e641c4bee3ebcd1409ad9f, updated_at=2025-11-28T10:01:50Z, vlan_transparent=None, network_id=b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, port_security_enabled=False, project_id=1f9b84b894e641c4bee3ebcd1409ad9f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=767, status=DOWN, tags=[], tenant_id=1f9b84b894e641c4bee3ebcd1409ad9f, updated_at=2025-11-28T10:01:55Z on network b1696f4c-80ce-491f-ad1c-cc7f5b6700ba#033[00m Nov 28 05:01:56 localhost dnsmasq[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/addn_hosts - 1 addresses Nov 28 05:01:56 localhost dnsmasq-dhcp[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/host Nov 28 05:01:56 localhost dnsmasq-dhcp[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/opts Nov 28 05:01:56 localhost podman[310685]: 2025-11-28 10:01:56.33814094 +0000 UTC m=+0.071692350 container kill 804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Nov 28 05:01:56 localhost neutron_sriov_agent[254147]: 2025-11-28 10:01:56.412 2 INFO neutron.agent.securitygroups_rpc [req-49cc58ff-e4e8-45be-b0c0-595b2c881c34 req-2439bbec-210c-4eb9-989c-4cbc137e5d8d 76caaf04f9e5427ca10e0bb020dbffa2 6fec370fed684ed6ba04de00336f61ee - - default default] Security group rule updated ['4f7b9341-d4bb-4bbc-a8bf-917ce0b68881']#033[00m Nov 28 05:01:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:01:56.531 261084 INFO neutron.agent.dhcp.agent [None req-abacf628-896f-4f36-ae8c-95ee3fcf07c0 - - - - - -] DHCP configuration for ports {'38f26b93-3884-4247-b638-2104f92bdcaf'} is completed#033[00m Nov 28 05:01:56 localhost neutron_sriov_agent[254147]: 2025-11-28 10:01:56.954 2 INFO neutron.agent.securitygroups_rpc [req-d6a9207a-32bb-417d-a1a8-33a725f0d00f req-76509a4e-6eff-4420-ad31-f2903ff65806 76caaf04f9e5427ca10e0bb020dbffa2 6fec370fed684ed6ba04de00336f61ee - - default default] Security group rule updated ['4f7b9341-d4bb-4bbc-a8bf-917ce0b68881']#033[00m Nov 28 05:01:57 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:01:57.143 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:01:55Z, description=, device_id=ff13b2c3-ffbb-486b-ba3a-fa0f2960342d, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=38f26b93-3884-4247-b638-2104f92bdcaf, ip_allocation=immediate, mac_address=fa:16:3e:58:21:6e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], 
created_at=2025-11-28T10:01:49Z, description=, dns_domain=, id=b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersV294TestFqdnHostnames-101229426-network, port_security_enabled=True, project_id=1f9b84b894e641c4bee3ebcd1409ad9f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62044, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=730, status=ACTIVE, subnets=['bac26e98-4c61-47a1-b281-0a4613971f3f'], tags=[], tenant_id=1f9b84b894e641c4bee3ebcd1409ad9f, updated_at=2025-11-28T10:01:50Z, vlan_transparent=None, network_id=b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, port_security_enabled=False, project_id=1f9b84b894e641c4bee3ebcd1409ad9f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=767, status=DOWN, tags=[], tenant_id=1f9b84b894e641c4bee3ebcd1409ad9f, updated_at=2025-11-28T10:01:55Z on network b1696f4c-80ce-491f-ad1c-cc7f5b6700ba#033[00m Nov 28 05:01:57 localhost dnsmasq[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/addn_hosts - 1 addresses Nov 28 05:01:57 localhost dnsmasq-dhcp[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/host Nov 28 05:01:57 localhost podman[310721]: 2025-11-28 10:01:57.360302714 +0000 UTC m=+0.070829192 container kill 804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3) 
Nov 28 05:01:57 localhost dnsmasq-dhcp[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/opts
Nov 28 05:01:57 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:01:57.593 261084 INFO neutron.agent.dhcp.agent [None req-511587f2-b9ab-455a-89d5-73247e9b2efc - - - - - -] DHCP configuration for ports {'38f26b93-3884-4247-b638-2104f92bdcaf'} is completed
Nov 28 05:01:57 localhost neutron_sriov_agent[254147]: 2025-11-28 10:01:57.738 2 INFO neutron.agent.securitygroups_rpc [None req-e0232781-0774-46d7-9ff8-6308f0f3831b 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Security group member updated ['8cd6a72f-0cb3-42f5-95bb-7d1b962c8a1e']
Nov 28 05:01:58 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:01:58.125 261084 INFO neutron.agent.linux.ip_lib [None req-11028aa0-241c-4cba-b68c-25c21bc3bb21 - - - - - -] Device tap54867331-d2 cannot be used as it has no MAC address
Nov 28 05:01:58 localhost nova_compute[279673]: 2025-11-28 10:01:58.181 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:01:58 localhost kernel: device tap54867331-d2 entered promiscuous mode
Nov 28 05:01:58 localhost NetworkManager[5967]: [1764324118.1897] manager: (tap54867331-d2): new Generic device (/org/freedesktop/NetworkManager/Devices/29)
Nov 28 05:01:58 localhost ovn_controller[152322]: 2025-11-28T10:01:58Z|00147|binding|INFO|Claiming lport 54867331-d2d2-4007-8751-6825f0370005 for this chassis.
Nov 28 05:01:58 localhost ovn_controller[152322]: 2025-11-28T10:01:58Z|00148|binding|INFO|54867331-d2d2-4007-8751-6825f0370005: Claiming unknown
Nov 28 05:01:58 localhost nova_compute[279673]: 2025-11-28 10:01:58.190 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:01:58 localhost systemd-udevd[310751]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 05:01:58 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:58.203 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-291dc1ac-5414-4421-8e5e-126d810812c9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-291dc1ac-5414-4421-8e5e-126d810812c9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b9d5d0d5dc28445f854288051977b3d1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46ecb4cc-6f9f-41cb-ba67-522f7eda61f5, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=54867331-d2d2-4007-8751-6825f0370005) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 05:01:58 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:58.205 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 54867331-d2d2-4007-8751-6825f0370005 in datapath 291dc1ac-5414-4421-8e5e-126d810812c9 bound to our chassis
Nov 28 05:01:58 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:58.207 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 291dc1ac-5414-4421-8e5e-126d810812c9 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 05:01:58 localhost ovn_metadata_agent[158125]: 2025-11-28 10:01:58.208 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[68a719c5-0bf9-42f1-9756-0a6bf202d3b0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 05:01:58 localhost journal[227875]: ethtool ioctl error on tap54867331-d2: No such device
Nov 28 05:01:58 localhost journal[227875]: ethtool ioctl error on tap54867331-d2: No such device
Nov 28 05:01:58 localhost ovn_controller[152322]: 2025-11-28T10:01:58Z|00149|binding|INFO|Setting lport 54867331-d2d2-4007-8751-6825f0370005 ovn-installed in OVS
Nov 28 05:01:58 localhost ovn_controller[152322]: 2025-11-28T10:01:58Z|00150|binding|INFO|Setting lport 54867331-d2d2-4007-8751-6825f0370005 up in Southbound
Nov 28 05:01:58 localhost journal[227875]: ethtool ioctl error on tap54867331-d2: No such device
Nov 28 05:01:58 localhost nova_compute[279673]: 2025-11-28 10:01:58.230 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:01:58 localhost journal[227875]: ethtool ioctl error on tap54867331-d2: No such device
Nov 28 05:01:58 localhost journal[227875]: ethtool ioctl error on tap54867331-d2: No such device
Nov 28 05:01:58 localhost journal[227875]: ethtool ioctl error on tap54867331-d2: No such device
Nov 28 05:01:58 localhost journal[227875]: ethtool ioctl error on tap54867331-d2: No such device
Nov 28 05:01:58 localhost journal[227875]: ethtool ioctl error on tap54867331-d2: No such device
Nov 28 05:01:58 localhost nova_compute[279673]: 2025-11-28 10:01:58.263 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:01:58 localhost nova_compute[279673]: 2025-11-28 10:01:58.289 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:01:58 localhost snmpd[66832]: empty variable list in _query
Nov 28 05:01:58 localhost snmpd[66832]: empty variable list in _query
Nov 28 05:01:58 localhost snmpd[66832]: empty variable list in _query
Nov 28 05:01:58 localhost snmpd[66832]: empty variable list in _query
Nov 28 05:01:58 localhost snmpd[66832]: empty variable list in _query
Nov 28 05:01:58 localhost snmpd[66832]: empty variable list in _query
Nov 28 05:01:59 localhost podman[310822]:
Nov 28 05:01:59 localhost podman[310822]: 2025-11-28 10:01:59.139832024 +0000 UTC m=+0.089997971 container create ce8cb25d7aeec767c77ed01628b5dd608b7c8735b49714382793dbad2af6e381 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-291dc1ac-5414-4421-8e5e-126d810812c9, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 05:01:59 localhost systemd[1]: Started libpod-conmon-ce8cb25d7aeec767c77ed01628b5dd608b7c8735b49714382793dbad2af6e381.scope.
Nov 28 05:01:59 localhost podman[310822]: 2025-11-28 10:01:59.09795526 +0000 UTC m=+0.048121257 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 05:01:59 localhost systemd[1]: Started libcrun container.
Nov 28 05:01:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e086de59da47ef628d8f8ee81d1bcf4e96528abb7fd65b1cc4ec3d44ec3ea5b1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 05:01:59 localhost podman[310822]: 2025-11-28 10:01:59.218395753 +0000 UTC m=+0.168561690 container init ce8cb25d7aeec767c77ed01628b5dd608b7c8735b49714382793dbad2af6e381 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-291dc1ac-5414-4421-8e5e-126d810812c9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Nov 28 05:01:59 localhost podman[310822]: 2025-11-28 10:01:59.228105592 +0000 UTC m=+0.178271529 container start ce8cb25d7aeec767c77ed01628b5dd608b7c8735b49714382793dbad2af6e381 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-291dc1ac-5414-4421-8e5e-126d810812c9, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 05:01:59 localhost dnsmasq[310840]: started, version 2.85 cachesize 150
Nov 28 05:01:59 localhost dnsmasq[310840]: DNS service limited to local subnets
Nov 28 05:01:59 localhost dnsmasq[310840]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 05:01:59 localhost dnsmasq[310840]: warning: no upstream servers configured
Nov 28 05:01:59 localhost dnsmasq-dhcp[310840]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 05:01:59 localhost dnsmasq[310840]: read /var/lib/neutron/dhcp/291dc1ac-5414-4421-8e5e-126d810812c9/addn_hosts - 0 addresses
Nov 28 05:01:59 localhost dnsmasq-dhcp[310840]: read /var/lib/neutron/dhcp/291dc1ac-5414-4421-8e5e-126d810812c9/host
Nov 28 05:01:59 localhost dnsmasq-dhcp[310840]: read /var/lib/neutron/dhcp/291dc1ac-5414-4421-8e5e-126d810812c9/opts
Nov 28 05:01:59 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:01:59.352 261084 INFO neutron.agent.dhcp.agent [None req-a3adf305-0a5b-4653-82d7-cee7b7a2063d - - - - - -] DHCP configuration for ports {'b5a6badf-b758-4c4f-b162-a463f94ddb2e'} is completed
Nov 28 05:01:59 localhost nova_compute[279673]: 2025-11-28 10:01:59.840 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:01:59 localhost neutron_sriov_agent[254147]: 2025-11-28 10:01:59.857 2 INFO neutron.agent.securitygroups_rpc [None req-34f90ada-ae7e-4d6e-90c9-94029146836e 318114281cb649bc9eeed12ecdc7273f 310745a04bd441169ff77f55ccf6bd7b - - default default] Security group member updated ['8cd6a72f-0cb3-42f5-95bb-7d1b962c8a1e']
Nov 28 05:01:59 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 05:01:59 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e102 do_prune osdmap full prune enabled
Nov 28 05:01:59 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e103 e103: 6 total, 6 up, 6 in
Nov 28 05:01:59 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e103: 6 total, 6 up, 6 in
Nov 28 05:02:00 localhost nova_compute[279673]: 2025-11-28 10:02:00.010 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.675 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.677 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.682 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a6fefa1-5605-4469-acf2-a10536ee15c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:02:00.677589', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '48531d60-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.849510782, 'message_signature': 'ed0b8275d3b9de0f4cfb71eefc8d699f4fe5b7a46334681c67dc041ed07fd16d'}]}, 'timestamp': '2025-11-28 10:02:00.683552', '_unique_id': 'ce27b9d1b2d5488fa93e03e94e29bbcb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.685 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.686 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.699 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.700 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '850f14f7-0321-4e9e-a395-5d73e66157c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:02:00.686713', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4855a9fe-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.858632422, 'message_signature': 'a31a3b2a28d6925eeb19d5b8a60498456ee900c24abc19ce444a50eb9fec1bf3'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:02:00.686713', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4855c2a4-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.858632422, 'message_signature': '023f22cd9346db0800a8dce97952ff063acabcfadf3607daa4c494463d1af78b'}]}, 'timestamp': '2025-11-28 10:02:00.700890', '_unique_id': 'bdd444813d794476b533f59ae6e816ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging   File
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:02:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.702 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.703 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.703 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.703 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.733 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.734 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ce5b4000-2618-44b5-978f-8ccf11a9ea5d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:02:00.703795', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '485ae798-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.875718105, 'message_signature': '4ed2c1710d03b58f78fd3af1022f94114abe929b8a9d021addd2de1e606471ae'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:02:00.703795', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '485b0106-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.875718105, 'message_signature': '8a441c4feeef2f1ec7f9856af711b967e24cd7ac4d001ecba32f3bd8b99e26c1'}]}, 'timestamp': '2025-11-28 10:02:00.735270', '_unique_id': '8aeca1093fcf4f91a1f1590c2c53b78c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:02:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:02:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.736 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.737 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.753 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 51.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '69264395-95e8-4f92-aae2-e58e6eb5b6e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.671875, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:02:00.737980', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '485dee34-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.925610406, 'message_signature': '996191eb67ea52d85432ac08196c0ac4ce64b5ac87559586a51bd15b78a90957'}]}, 'timestamp': '2025-11-28 10:02:00.754410', '_unique_id': '65c5679ac41a45d685406ef6cca41952'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 
05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:02:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.755 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.756 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 28 05:02:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.756 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.757 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eae21a89-472b-4903-af50-95bb8514d6df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:02:00.756780', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 
'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '485e5e64-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.858632422, 'message_signature': '64ca1c18b4b967d3098ac7eff89af00088f873caf6290e4a2e9c9dbcc5626acc'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:02:00.756780', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '485e700c-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.858632422, 'message_signature': '9c19714837de4b89c8264f64f95c5cc543b59526c7d9d76a70ca47db42b92cdd'}]}, 'timestamp': '2025-11-28 10:02:00.757655', '_unique_id': '578f3b15746648a7b956e20cc796a112'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:02:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.758 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.759 12 DEBUG ceilometer.polling.manager [-] Skip pollster 
disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.759 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.759 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 1124743927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.760 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 22218411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'bc2e27a0-dc5b-4955-bbca-206a737f3bb0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1124743927, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:02:00.759924', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '485edaa6-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.875718105, 'message_signature': '1bb2fd7c6b2b989c22488dec2e5c0f02f3092b6c3aad4b6bb25bad81b13909dd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22218411, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:02:00.759924', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '485eeb54-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.875718105, 'message_signature': '00544548394dd27c67e0f1a399ba94306e5b3dc4f3b2d6cc0c7f6dce53ffacfa'}]}, 'timestamp': '2025-11-28 10:02:00.760812', '_unique_id': '14428ccd932b4e7cabe44ea582127024'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:02:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:02:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.761 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.762 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.762 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'fc4f37cb-ea4a-4ca2-b760-20eeb28de1dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:02:00.762922', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '485f5026-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.849510782, 'message_signature': 'ea1bb469332eac19f16040247826fa82e0cdacfeb531d5191d5e014c67de4c84'}]}, 'timestamp': '2025-11-28 10:02:00.763435', '_unique_id': '06147871029d4d30ad928ba040add18e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.764 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.765 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.765 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '923e3cfa-18d9-4c18-a645-82e9204c6cfe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:02:00.765518', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '485fb3b8-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.849510782, 'message_signature': '378dc273f4e7634ccfa973409214e2ba23e43ea10baf69367cbebda6b28a1655'}]}, 'timestamp': '2025-11-28 10:02:00.765970', '_unique_id': '4a104847a46d4372be37d8cb2ad0ade6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.766 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.767 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.768 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a833ccf2-fe7b-4074-beb0-dd2a0ba07a97', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:02:00.768068', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '48601786-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.849510782, 'message_signature': 'edee6ffb91e1497cdaf6d5884c746c505b7cbf0f75c9ce706c2fa4eab4c883b7'}]}, 'timestamp': '2025-11-28 10:02:00.768551', '_unique_id': '4565083ce1904707883a2451fb31abdb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.769 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.770 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.770 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.771 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8fee6ff-ffc8-4044-9c11-b67dde5a18d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:02:00.770609', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '48607a46-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.858632422, 'message_signature': 'be9987b1301a77906dfc1d4d4b6b652d0fa6d216c6428031b2f35ec6b31fbafe'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:02:00.770609', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '48608b8a-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.858632422, 'message_signature': '29a1a7c1346ee0553297d55a5f7aef2d9958df23da1eae3a0c2a6f90273fb4d5'}]}, 'timestamp': '2025-11-28 10:02:00.771468', '_unique_id': 'fcf7db19477440d08c9e95e5277cf484'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:02:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:02:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.772 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.773 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.773 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '12500514-64de-4a70-a57d-4ab72b5b3e0b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:02:00.773607', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '4860ef94-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.849510782, 'message_signature': 'e66ce4b549e9e754799c15eeb9b9b9c820b4781116630e6e69ea490c9d4c5670'}]}, 'timestamp': '2025-11-28 10:02:00.774088', '_unique_id': 'c10008e8862f4b92a02ce57d0f97333c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:02:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:02:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:02:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.774 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.775 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.776 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.776 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'af9f3725-9ab0-4d76-ad42-1636ec547845', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:02:00.776143', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '48615344-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.875718105, 'message_signature': 'efd1534974420a710e84d24d57b753963e1172e6b44d59dafd4208295ef262c4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:02:00.776143', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '48616302-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.875718105, 'message_signature': 'e5c1117bfad0b121efe60bf62201d2bb8b7996d978a3ab35a0ec520b36389124'}]}, 'timestamp': '2025-11-28 10:02:00.776981', '_unique_id': '685ec28cb69543fcb9546fdfcf01dcbf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:02:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:02:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.777 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.778 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.779 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '64be9e07-33f1-4b2d-ac9c-10b0eeb6973d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:02:00.779107', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '4861c6b2-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.849510782, 'message_signature': '23794143cb9be5e477e59eaea27e328b628b7b197a552456f75239baf0c08abc'}]}, 'timestamp': '2025-11-28 10:02:00.779561', '_unique_id': '2b8fea4bf6114392b33b59642c22c2ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:02:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:02:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.780 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.781 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.781 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5cf1600c-325e-4e21-85cf-7678831d8ad0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:02:00.781637', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '4862292c-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.849510782, 'message_signature': '91e0b2c1fa6d0e40b37fb083b7dfbdda8ee69fcf4971e2e4bdd79c8f0aa20671'}]}, 'timestamp': '2025-11-28 10:02:00.782118', '_unique_id': '88a292825e1d4d9ea9827ea367b8c66a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:02:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 
05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.782 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.784 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.784 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1439344424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.784 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 63315481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b6a2d9f2-d22b-4bef-bf80-857f730d07b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1439344424, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:02:00.784172', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '48628d22-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.875718105, 'message_signature': '433d20ee13b167b567f0c178338c2611c14ab7fe085f2d015666aafe45d1e25d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63315481, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:02:00.784172', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '48629ce0-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.875718105, 'message_signature': '5478a5d9b8e2c383318551a5423b4a393663f06400229807ef6b15486712c39e'}]}, 'timestamp': '2025-11-28 10:02:00.785045', '_unique_id': '51216bc176f84401b2b62a95c61ca4fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:02:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:02:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.785 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.786 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.787 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.787 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:02:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b1d97f35-11f5-4bfa-a3e7-9f56facbfb53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:02:00.787284', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '48630612-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.849510782, 'message_signature': '7d68f65c7bb12e79559855b1d9186e008adcf0f180a52407cead0784ec3f8c0d'}]}, 'timestamp': '2025-11-28 10:02:00.787739', '_unique_id': '4c87b5bebceb4ba59a8e7fda64727f30'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:02:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:02:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 
05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.788 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.789 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.789 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '100fd371-2f7c-4f24-a970-b01e98a5955e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:02:00.789823', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '486369fe-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.849510782, 'message_signature': '77ad94ddd83dd072ac94f45a7ef0ab820df81c5b513b6cc04f4f49542c0a2130'}]}, 'timestamp': '2025-11-28 10:02:00.790300', '_unique_id': 'a003b889ff124fb99fdcf1843f830599'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.791 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.792 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.792 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 15730000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9e244453-4044-4467-ae2c-fc059b9c4263', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15730000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:02:00.792376', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '4863cc82-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.925610406, 'message_signature': '1254a75d90f1eb619a4ae58ccd7bd56dde4454d3f9f8e7c765f70b6dd922a8a3'}]}, 'timestamp': '2025-11-28 10:02:00.792805', '_unique_id': '74a420ead7ba489dabfe5007f99200a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.793 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.794 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.794 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.794 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '86caf319-d10e-4b98-8352-506082c4a869', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:02:00.794352', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4864169c-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.875718105, 'message_signature': '07827867b9c37b8da78c0c771ec0990274c277f3538a90acd5467e1daf590bc9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:02:00.794352', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '48642074-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.875718105, 'message_signature': '555fdfa734c6ebcd3ee8f0465af0998b6a78ecc9f467f471b1c5fde27d2853ff'}]}, 'timestamp': '2025-11-28 10:02:00.794863', '_unique_id': '65418ba3c5394c57a0d65daa92af1c07'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.795 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.796 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.796 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.796 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15c3754d-069d-41f3-880b-961922c30441', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:02:00.796167', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '48645e2c-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.875718105, 'message_signature': '7fc2155ca9654eafab178b28d9776fdffd69932b730d6facb16577b5043b61c0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:02:00.796167', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '486467fa-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.875718105, 'message_signature': 'b74500b808f06bb6ad38581905715726861ec08f40d7ed44529319328dfaa679'}]}, 'timestamp': '2025-11-28 10:02:00.796695', '_unique_id': '2e8445f210784649aad56aa3f06dadfa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent
call last): Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging Nov 
28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 
605, in _get_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in 
ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 ERROR oslo_messaging.notify.messaging Nov 28 
05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.797 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '40622aa4-440c-4ade-a4c1-2326bc5d4a3a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:02:00.797977', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 
'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '4864a5c6-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 11954.849510782, 'message_signature': 'd5d528b35fbb3bbf05b5803a80441de58a62c4532a21da7ce4335b582c4f08ea'}]}, 'timestamp': '2025-11-28 10:02:00.798298', '_unique_id': '58fc1e9104cb4af49925933a570afd28'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:02:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:02:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:02:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:02:00.798 12 ERROR oslo_messaging.notify.messaging Nov 28 05:02:00 localhost neutron_sriov_agent[254147]: 2025-11-28 10:02:00.865 2 INFO neutron.agent.securitygroups_rpc [None req-42bd8e77-bdc1-4bfe-abe6-7d585fdf99bb 75ac6a26227c40ba81e61e610018d23f 1f9b84b894e641c4bee3ebcd1409ad9f - - default default] Security group rule updated ['6deb8732-9203-448a-b0a5-cf6a0375d009']#033[00m Nov 28 05:02:01 localhost nova_compute[279673]: 2025-11-28 10:02:01.024 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:01 localhost neutron_sriov_agent[254147]: 2025-11-28 10:02:01.497 2 INFO neutron.agent.securitygroups_rpc [None req-3458faa2-903e-46ff-96c1-5776090af93b 75ac6a26227c40ba81e61e610018d23f 1f9b84b894e641c4bee3ebcd1409ad9f - - default default] Security group rule updated ['6deb8732-9203-448a-b0a5-cf6a0375d009']#033[00m Nov 28 05:02:02 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:02:02.707 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:02:02Z, description=, device_id=8d6dcd20-92ab-47ad-ac9d-52244fd1b9b4, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], 
fixed_ips=[], id=f270f680-72b3-4958-a0e3-4e2fbae9a975, ip_allocation=immediate, mac_address=fa:16:3e:eb:a0:b1, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:01:56Z, description=, dns_domain=, id=291dc1ac-5414-4421-8e5e-126d810812c9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupsNegativeTestJSON-1342624790-network, port_security_enabled=True, project_id=b9d5d0d5dc28445f854288051977b3d1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=39438, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=777, status=ACTIVE, subnets=['72948917-b3da-47be-87d8-60087f12ee07'], tags=[], tenant_id=b9d5d0d5dc28445f854288051977b3d1, updated_at=2025-11-28T10:01:57Z, vlan_transparent=None, network_id=291dc1ac-5414-4421-8e5e-126d810812c9, port_security_enabled=False, project_id=b9d5d0d5dc28445f854288051977b3d1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=832, status=DOWN, tags=[], tenant_id=b9d5d0d5dc28445f854288051977b3d1, updated_at=2025-11-28T10:02:02Z on network 291dc1ac-5414-4421-8e5e-126d810812c9#033[00m Nov 28 05:02:02 localhost systemd[1]: tmp-crun.ux8WEm.mount: Deactivated successfully. 
Nov 28 05:02:02 localhost dnsmasq[310840]: read /var/lib/neutron/dhcp/291dc1ac-5414-4421-8e5e-126d810812c9/addn_hosts - 1 addresses Nov 28 05:02:02 localhost dnsmasq-dhcp[310840]: read /var/lib/neutron/dhcp/291dc1ac-5414-4421-8e5e-126d810812c9/host Nov 28 05:02:02 localhost podman[310859]: 2025-11-28 10:02:02.936057007 +0000 UTC m=+0.068516911 container kill ce8cb25d7aeec767c77ed01628b5dd608b7c8735b49714382793dbad2af6e381 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-291dc1ac-5414-4421-8e5e-126d810812c9, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:02:02 localhost dnsmasq-dhcp[310840]: read /var/lib/neutron/dhcp/291dc1ac-5414-4421-8e5e-126d810812c9/opts Nov 28 05:02:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:02:03.104 261084 INFO neutron.agent.dhcp.agent [None req-377252fe-49a6-4699-ab57-168d7f3b0adb - - - - - -] DHCP configuration for ports {'f270f680-72b3-4958-a0e3-4e2fbae9a975'} is completed#033[00m Nov 28 05:02:04 localhost ovn_controller[152322]: 2025-11-28T10:02:04Z|00151|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:02:04 localhost nova_compute[279673]: 2025-11-28 10:02:04.808 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:04 localhost nova_compute[279673]: 2025-11-28 10:02:04.842 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:04 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e103 _set_new_cache_sizes 
cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:02:05 localhost nova_compute[279673]: 2025-11-28 10:02:05.012 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:05 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:02:05.082 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:02:02Z, description=, device_id=8d6dcd20-92ab-47ad-ac9d-52244fd1b9b4, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=f270f680-72b3-4958-a0e3-4e2fbae9a975, ip_allocation=immediate, mac_address=fa:16:3e:eb:a0:b1, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:01:56Z, description=, dns_domain=, id=291dc1ac-5414-4421-8e5e-126d810812c9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupsNegativeTestJSON-1342624790-network, port_security_enabled=True, project_id=b9d5d0d5dc28445f854288051977b3d1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=39438, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=777, status=ACTIVE, subnets=['72948917-b3da-47be-87d8-60087f12ee07'], tags=[], tenant_id=b9d5d0d5dc28445f854288051977b3d1, updated_at=2025-11-28T10:01:57Z, vlan_transparent=None, network_id=291dc1ac-5414-4421-8e5e-126d810812c9, port_security_enabled=False, project_id=b9d5d0d5dc28445f854288051977b3d1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=832, status=DOWN, tags=[], 
tenant_id=b9d5d0d5dc28445f854288051977b3d1, updated_at=2025-11-28T10:02:02Z on network 291dc1ac-5414-4421-8e5e-126d810812c9#033[00m Nov 28 05:02:05 localhost dnsmasq[310840]: read /var/lib/neutron/dhcp/291dc1ac-5414-4421-8e5e-126d810812c9/addn_hosts - 1 addresses Nov 28 05:02:05 localhost dnsmasq-dhcp[310840]: read /var/lib/neutron/dhcp/291dc1ac-5414-4421-8e5e-126d810812c9/host Nov 28 05:02:05 localhost dnsmasq-dhcp[310840]: read /var/lib/neutron/dhcp/291dc1ac-5414-4421-8e5e-126d810812c9/opts Nov 28 05:02:05 localhost podman[310896]: 2025-11-28 10:02:05.334159217 +0000 UTC m=+0.067952994 container kill ce8cb25d7aeec767c77ed01628b5dd608b7c8735b49714382793dbad2af6e381 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-291dc1ac-5414-4421-8e5e-126d810812c9, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:02:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 05:02:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. 
Nov 28 05:02:05 localhost podman[310908]: 2025-11-28 10:02:05.47609731 +0000 UTC m=+0.111895112 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 05:02:05 localhost podman[310909]: 2025-11-28 10:02:05.491976338 +0000 UTC m=+0.120388673 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible) Nov 28 05:02:05 localhost podman[310909]: 2025-11-28 10:02:05.50968099 +0000 UTC m=+0.138093325 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Nov 28 05:02:05 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 05:02:05 localhost podman[310908]: 2025-11-28 10:02:05.560108866 +0000 UTC m=+0.195906658 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 05:02:05 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:02:05.565 261084 INFO neutron.agent.dhcp.agent [None req-dc780ffc-b30d-4974-8415-da373f47dbb8 - - - - - -] DHCP configuration for ports {'f270f680-72b3-4958-a0e3-4e2fbae9a975'} is completed#033[00m Nov 28 05:02:05 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 05:02:06 localhost ovn_controller[152322]: 2025-11-28T10:02:06Z|00152|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:02:06 localhost nova_compute[279673]: 2025-11-28 10:02:06.276 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:06 localhost systemd[1]: tmp-crun.GC5j3p.mount: Deactivated successfully. Nov 28 05:02:08 localhost neutron_sriov_agent[254147]: 2025-11-28 10:02:08.256 2 INFO neutron.agent.securitygroups_rpc [req-6bffedb9-405b-4a40-9982-68d686e88a5f req-5df2fd06-5333-4972-81c1-a0ccb5870973 75ac6a26227c40ba81e61e610018d23f 1f9b84b894e641c4bee3ebcd1409ad9f - - default default] Security group member updated ['6deb8732-9203-448a-b0a5-cf6a0375d009']#033[00m Nov 28 05:02:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:02:08.320 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:02:07Z, description=, device_id=bbf7ad79-0406-4158-8a09-075ba873c1fd, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=72569922-3c02-4d13-b171-27f6f957e54c, ip_allocation=immediate, mac_address=fa:16:3e:23:23:a2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:01:49Z, description=, dns_domain=, id=b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersV294TestFqdnHostnames-101229426-network, port_security_enabled=True, project_id=1f9b84b894e641c4bee3ebcd1409ad9f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62044, qos_policy_id=None, revision_number=2, router:external=False, 
shared=False, standard_attr_id=730, status=ACTIVE, subnets=['bac26e98-4c61-47a1-b281-0a4613971f3f'], tags=[], tenant_id=1f9b84b894e641c4bee3ebcd1409ad9f, updated_at=2025-11-28T10:01:50Z, vlan_transparent=None, network_id=b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, port_security_enabled=True, project_id=1f9b84b894e641c4bee3ebcd1409ad9f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['6deb8732-9203-448a-b0a5-cf6a0375d009'], standard_attr_id=864, status=DOWN, tags=[], tenant_id=1f9b84b894e641c4bee3ebcd1409ad9f, updated_at=2025-11-28T10:02:07Z on network b1696f4c-80ce-491f-ad1c-cc7f5b6700ba#033[00m Nov 28 05:02:08 localhost dnsmasq[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/addn_hosts - 2 addresses Nov 28 05:02:08 localhost dnsmasq-dhcp[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/host Nov 28 05:02:08 localhost podman[310972]: 2025-11-28 10:02:08.543169313 +0000 UTC m=+0.067901933 container kill 804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:02:08 localhost dnsmasq-dhcp[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/opts Nov 28 05:02:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:02:08.883 261084 INFO neutron.agent.dhcp.agent [None req-c3d929fd-8152-4f5c-8f84-cd745f2000df - - - - - -] DHCP configuration for ports {'72569922-3c02-4d13-b171-27f6f957e54c'} is completed#033[00m Nov 28 05:02:09 localhost 
neutron_dhcp_agent[261080]: 2025-11-28 10:02:09.271 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=np0005538514.localdomain, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:02:07Z, description=, device_id=bbf7ad79-0406-4158-8a09-075ba873c1fd, device_owner=compute:nova, dns_assignment=[], dns_domain=, dns_name=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com, extra_dhcp_opts=[], fixed_ips=[], id=72569922-3c02-4d13-b171-27f6f957e54c, ip_allocation=immediate, mac_address=fa:16:3e:23:23:a2, name=, network_id=b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, port_security_enabled=True, project_id=1f9b84b894e641c4bee3ebcd1409ad9f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['6deb8732-9203-448a-b0a5-cf6a0375d009'], standard_attr_id=864, status=DOWN, tags=[], tenant_id=1f9b84b894e641c4bee3ebcd1409ad9f, updated_at=2025-11-28T10:02:08Z on network b1696f4c-80ce-491f-ad1c-cc7f5b6700ba#033[00m Nov 28 05:02:09 localhost dnsmasq[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/addn_hosts - 2 addresses Nov 28 05:02:09 localhost dnsmasq-dhcp[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/host Nov 28 05:02:09 localhost podman[311009]: 2025-11-28 10:02:09.500754368 +0000 UTC m=+0.061086304 container kill 804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3) Nov 28 05:02:09 localhost dnsmasq-dhcp[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/opts Nov 28 05:02:09 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:02:09.715 261084 INFO neutron.agent.dhcp.agent [None req-a706aba6-e5b7-4b45-acd4-353416be1706 - - - - - -] DHCP configuration for ports {'72569922-3c02-4d13-b171-27f6f957e54c'} is completed#033[00m Nov 28 05:02:09 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e103 do_prune osdmap full prune enabled Nov 28 05:02:09 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e104 e104: 6 total, 6 up, 6 in Nov 28 05:02:09 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e104: 6 total, 6 up, 6 in Nov 28 05:02:09 localhost nova_compute[279673]: 2025-11-28 10:02:09.869 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:09 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e104 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:02:10 localhost nova_compute[279673]: 2025-11-28 10:02:10.016 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:10 localhost podman[238687]: time="2025-11-28T10:02:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:02:10 localhost podman[238687]: @ - - [28/Nov/2025:10:02:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159336 "" "Go-http-client/1.1" Nov 28 05:02:10 localhost podman[238687]: @ - - [28/Nov/2025:10:02:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20212 "" "Go-http-client/1.1" Nov 28 
05:02:10 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e104 do_prune osdmap full prune enabled Nov 28 05:02:10 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e105 e105: 6 total, 6 up, 6 in Nov 28 05:02:10 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e105: 6 total, 6 up, 6 in Nov 28 05:02:10 localhost dnsmasq[310840]: read /var/lib/neutron/dhcp/291dc1ac-5414-4421-8e5e-126d810812c9/addn_hosts - 0 addresses Nov 28 05:02:10 localhost dnsmasq-dhcp[310840]: read /var/lib/neutron/dhcp/291dc1ac-5414-4421-8e5e-126d810812c9/host Nov 28 05:02:10 localhost dnsmasq-dhcp[310840]: read /var/lib/neutron/dhcp/291dc1ac-5414-4421-8e5e-126d810812c9/opts Nov 28 05:02:10 localhost podman[311049]: 2025-11-28 10:02:10.979232451 +0000 UTC m=+0.063592392 container kill ce8cb25d7aeec767c77ed01628b5dd608b7c8735b49714382793dbad2af6e381 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-291dc1ac-5414-4421-8e5e-126d810812c9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:02:11 localhost kernel: device tap54867331-d2 left promiscuous mode Nov 28 05:02:11 localhost nova_compute[279673]: 2025-11-28 10:02:11.187 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:11 localhost ovn_controller[152322]: 2025-11-28T10:02:11Z|00153|binding|INFO|Releasing lport 54867331-d2d2-4007-8751-6825f0370005 from this chassis (sb_readonly=0) Nov 28 05:02:11 localhost ovn_controller[152322]: 2025-11-28T10:02:11Z|00154|binding|INFO|Setting lport 54867331-d2d2-4007-8751-6825f0370005 down in Southbound Nov 28 05:02:11 
localhost ovn_metadata_agent[158125]: 2025-11-28 10:02:11.204 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-291dc1ac-5414-4421-8e5e-126d810812c9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-291dc1ac-5414-4421-8e5e-126d810812c9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b9d5d0d5dc28445f854288051977b3d1', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46ecb4cc-6f9f-41cb-ba67-522f7eda61f5, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=54867331-d2d2-4007-8751-6825f0370005) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:02:11 localhost nova_compute[279673]: 2025-11-28 10:02:11.206 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:11 localhost ovn_metadata_agent[158125]: 2025-11-28 10:02:11.210 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 54867331-d2d2-4007-8751-6825f0370005 in datapath 291dc1ac-5414-4421-8e5e-126d810812c9 unbound from our chassis#033[00m Nov 28 05:02:11 localhost ovn_metadata_agent[158125]: 
2025-11-28 10:02:11.215 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 291dc1ac-5414-4421-8e5e-126d810812c9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:02:11 localhost ovn_metadata_agent[158125]: 2025-11-28 10:02:11.216 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[5db0e8f2-4ce7-4faf-9a0d-3ad76f444a59]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:02:11 localhost ovn_controller[152322]: 2025-11-28T10:02:11Z|00155|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:02:11 localhost nova_compute[279673]: 2025-11-28 10:02:11.891 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:11 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e105 do_prune osdmap full prune enabled Nov 28 05:02:11 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e106 e106: 6 total, 6 up, 6 in Nov 28 05:02:11 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e106: 6 total, 6 up, 6 in Nov 28 05:02:13 localhost ovn_controller[152322]: 2025-11-28T10:02:13Z|00156|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:02:13 localhost nova_compute[279673]: 2025-11-28 10:02:13.531 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:14 localhost dnsmasq[310840]: exiting on receipt of SIGTERM Nov 28 05:02:14 localhost systemd[1]: tmp-crun.LLf7Wk.mount: Deactivated successfully. 
Nov 28 05:02:14 localhost podman[311090]: 2025-11-28 10:02:14.088291998 +0000 UTC m=+0.074838866 container kill ce8cb25d7aeec767c77ed01628b5dd608b7c8735b49714382793dbad2af6e381 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-291dc1ac-5414-4421-8e5e-126d810812c9, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 28 05:02:14 localhost systemd[1]: libpod-ce8cb25d7aeec767c77ed01628b5dd608b7c8735b49714382793dbad2af6e381.scope: Deactivated successfully. Nov 28 05:02:14 localhost podman[311102]: 2025-11-28 10:02:14.158096168 +0000 UTC m=+0.056306804 container died ce8cb25d7aeec767c77ed01628b5dd608b7c8735b49714382793dbad2af6e381 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-291dc1ac-5414-4421-8e5e-126d810812c9, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 28 05:02:14 localhost podman[311102]: 2025-11-28 10:02:14.195296394 +0000 UTC m=+0.093506960 container cleanup ce8cb25d7aeec767c77ed01628b5dd608b7c8735b49714382793dbad2af6e381 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-291dc1ac-5414-4421-8e5e-126d810812c9, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:02:14 localhost systemd[1]: libpod-conmon-ce8cb25d7aeec767c77ed01628b5dd608b7c8735b49714382793dbad2af6e381.scope: Deactivated successfully. Nov 28 05:02:14 localhost podman[311108]: 2025-11-28 10:02:14.221472305 +0000 UTC m=+0.106948076 container remove ce8cb25d7aeec767c77ed01628b5dd608b7c8735b49714382793dbad2af6e381 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-291dc1ac-5414-4421-8e5e-126d810812c9, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:02:14 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:02:14.473 261084 INFO neutron.agent.dhcp.agent [None req-350b1190-06e0-4de2-b6f3-8032ed6b7c95 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:02:14 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:02:14.515 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:02:14 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:02:14.835 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:02:14 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e106 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:02:14 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e106 do_prune osdmap full prune enabled Nov 28 05:02:14 localhost nova_compute[279673]: 2025-11-28 10:02:14.904 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog 
[-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:14 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e107 e107: 6 total, 6 up, 6 in Nov 28 05:02:14 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e107: 6 total, 6 up, 6 in Nov 28 05:02:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 05:02:15 localhost nova_compute[279673]: 2025-11-28 10:02:15.024 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:15 localhost systemd[1]: var-lib-containers-storage-overlay-e086de59da47ef628d8f8ee81d1bcf4e96528abb7fd65b1cc4ec3d44ec3ea5b1-merged.mount: Deactivated successfully. Nov 28 05:02:15 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ce8cb25d7aeec767c77ed01628b5dd608b7c8735b49714382793dbad2af6e381-userdata-shm.mount: Deactivated successfully. Nov 28 05:02:15 localhost systemd[1]: run-netns-qdhcp\x2d291dc1ac\x2d5414\x2d4421\x2d8e5e\x2d126d810812c9.mount: Deactivated successfully. 
Nov 28 05:02:15 localhost podman[311132]: 2025-11-28 10:02:15.115703748 +0000 UTC m=+0.103037254 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., release=1755695350, distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) 
Nov 28 05:02:15 localhost podman[311132]: 2025-11-28 10:02:15.129241256 +0000 UTC m=+0.116574782 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, version=9.6, architecture=x86_64, managed_by=edpm_ansible, container_name=openstack_network_exporter, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 
Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, config_id=edpm) Nov 28 05:02:15 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. Nov 28 05:02:15 localhost nova_compute[279673]: 2025-11-28 10:02:15.186 279685 DEBUG oslo_concurrency.lockutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Acquiring lock "7292509e-f294-4159-96e5-22d4712df2a0" by "nova.compute.manager.ComputeManager.unshelve_instance..do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:02:15 localhost nova_compute[279673]: 2025-11-28 10:02:15.187 279685 DEBUG oslo_concurrency.lockutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lock "7292509e-f294-4159-96e5-22d4712df2a0" acquired by "nova.compute.manager.ComputeManager.unshelve_instance..do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:02:15 localhost nova_compute[279673]: 2025-11-28 10:02:15.188 279685 INFO nova.compute.manager [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 
7292509e-f294-4159-96e5-22d4712df2a0] Unshelving#033[00m Nov 28 05:02:15 localhost nova_compute[279673]: 2025-11-28 10:02:15.296 279685 DEBUG oslo_concurrency.lockutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:02:15 localhost nova_compute[279673]: 2025-11-28 10:02:15.297 279685 DEBUG oslo_concurrency.lockutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:02:15 localhost nova_compute[279673]: 2025-11-28 10:02:15.302 279685 DEBUG nova.objects.instance [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lazy-loading 'pci_requests' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 05:02:15 localhost nova_compute[279673]: 2025-11-28 10:02:15.316 279685 DEBUG nova.objects.instance [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lazy-loading 'numa_topology' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 05:02:15 localhost nova_compute[279673]: 2025-11-28 10:02:15.327 279685 DEBUG nova.virt.hardware [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Require both a 
host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m Nov 28 05:02:15 localhost nova_compute[279673]: 2025-11-28 10:02:15.328 279685 INFO nova.compute.claims [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Claim successful on node np0005538513.localdomain#033[00m Nov 28 05:02:15 localhost nova_compute[279673]: 2025-11-28 10:02:15.446 279685 DEBUG oslo_concurrency.processutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:02:15 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:02:15.733 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:02:15 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:02:15 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/2145567881' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:02:15 localhost nova_compute[279673]: 2025-11-28 10:02:15.973 279685 DEBUG oslo_concurrency.processutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.527s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:02:15 localhost nova_compute[279673]: 2025-11-28 10:02:15.983 279685 DEBUG nova.compute.provider_tree [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 05:02:16 localhost nova_compute[279673]: 2025-11-28 10:02:16.005 279685 DEBUG nova.scheduler.client.report [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0. 
Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.023925) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43 Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324136024088, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 887, "num_deletes": 254, "total_data_size": 689676, "memory_usage": 705848, "flush_reason": "Manual Compaction"} Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started Nov 28 05:02:16 localhost nova_compute[279673]: 2025-11-28 10:02:16.027 279685 DEBUG oslo_concurrency.lockutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.730s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324136032934, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 674308, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 24734, "largest_seqno": 25620, "table_properties": {"data_size": 670184, "index_size": 1787, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 10053, "raw_average_key_size": 20, "raw_value_size": 661649, "raw_average_value_size": 1358, "num_data_blocks": 78, "num_entries": 487, "num_filter_entries": 487, 
"num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324089, "oldest_key_time": 1764324089, "file_creation_time": 1764324136, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}} Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 9114 microseconds, and 4803 cpu microseconds. Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.033004) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 674308 bytes OK Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.033087) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.035342) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.035366) EVENT_LOG_v1 {"time_micros": 1764324136035358, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.035399) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 685289, prev total WAL file size 685289, number of live WAL files 2. Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.036100) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131353436' seq:72057594037927935, type:22 .. 
'7061786F73003131373938' seq:0, type:0; will stop at (end) Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(658KB)], [42(17MB)] Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324136036168, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 18755063, "oldest_snapshot_seqno": -1} Nov 28 05:02:16 localhost nova_compute[279673]: 2025-11-28 10:02:16.066 279685 DEBUG oslo_concurrency.lockutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Acquiring lock "refresh_cache-7292509e-f294-4159-96e5-22d4712df2a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 05:02:16 localhost nova_compute[279673]: 2025-11-28 10:02:16.067 279685 DEBUG oslo_concurrency.lockutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Acquired lock "refresh_cache-7292509e-f294-4159-96e5-22d4712df2a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 05:02:16 localhost nova_compute[279673]: 2025-11-28 10:02:16.067 279685 DEBUG nova.network.neutron [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m Nov 28 05:02:16 localhost nova_compute[279673]: 2025-11-28 10:02:16.115 279685 DEBUG 
nova.network.neutron [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 12045 keys, 16120567 bytes, temperature: kUnknown Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324136128247, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 16120567, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16054212, "index_size": 35150, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30149, "raw_key_size": 323922, "raw_average_key_size": 26, "raw_value_size": 15851387, "raw_average_value_size": 1316, "num_data_blocks": 1327, "num_entries": 12045, "num_filter_entries": 12045, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764324136, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": 
"MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}} Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.128783) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 16120567 bytes Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.130879) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 203.5 rd, 174.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 17.2 +0.0 blob) out(15.4 +0.0 blob), read-write-amplify(51.7) write-amplify(23.9) OK, records in: 12570, records dropped: 525 output_compression: NoCompression Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.130911) EVENT_LOG_v1 {"time_micros": 1764324136130898, "job": 24, "event": "compaction_finished", "compaction_time_micros": 92182, "compaction_time_cpu_micros": 41068, "output_level": 6, "num_output_files": 1, "total_output_size": 16120567, "num_input_records": 12570, "num_output_records": 12045, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324136131146, "job": 24, "event": "table_file_deletion", "file_number": 44} Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] 
Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324136132931, "job": 24, "event": "table_file_deletion", "file_number": 42} Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.035957) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.133046) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.133055) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.133059) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.133062) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:02:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:02:16.133066) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:02:16 localhost nova_compute[279673]: 2025-11-28 10:02:16.283 279685 DEBUG nova.network.neutron [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 05:02:16 localhost nova_compute[279673]: 2025-11-28 10:02:16.298 279685 DEBUG oslo_concurrency.lockutils [None 
req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Releasing lock "refresh_cache-7292509e-f294-4159-96e5-22d4712df2a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 05:02:16 localhost nova_compute[279673]: 2025-11-28 10:02:16.300 279685 DEBUG nova.virt.libvirt.driver [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m Nov 28 05:02:16 localhost nova_compute[279673]: 2025-11-28 10:02:16.301 279685 INFO nova.virt.libvirt.driver [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Creating image(s)#033[00m Nov 28 05:02:16 localhost nova_compute[279673]: 2025-11-28 10:02:16.342 279685 DEBUG nova.storage.rbd_utils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] rbd image 7292509e-f294-4159-96e5-22d4712df2a0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 28 05:02:16 localhost nova_compute[279673]: 2025-11-28 10:02:16.348 279685 DEBUG nova.objects.instance [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 05:02:16 localhost nova_compute[279673]: 2025-11-28 10:02:16.398 279685 DEBUG nova.storage.rbd_utils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 
cee9624834b94e80a7bbe35b2f1a1739 - - default default] rbd image 7292509e-f294-4159-96e5-22d4712df2a0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 28 05:02:16 localhost nova_compute[279673]: 2025-11-28 10:02:16.441 279685 DEBUG nova.storage.rbd_utils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] rbd image 7292509e-f294-4159-96e5-22d4712df2a0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 28 05:02:16 localhost nova_compute[279673]: 2025-11-28 10:02:16.447 279685 DEBUG oslo_concurrency.lockutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Acquiring lock "e38b87c2132e46140fd3315a4a63ceb8d23e3d74" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:02:16 localhost nova_compute[279673]: 2025-11-28 10:02:16.448 279685 DEBUG oslo_concurrency.lockutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lock "e38b87c2132e46140fd3315a4a63ceb8d23e3d74" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:02:16 localhost nova_compute[279673]: 2025-11-28 10:02:16.505 279685 DEBUG nova.virt.libvirt.imagebackend [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Image locations are: [{'url': 'rbd://2c5417c9-00eb-57d5-a565-ddecbc7995c1/images/a2def208-be38-4da4-a3f2-d5c5045455ca/snap', 'metadata': {'store': 'default_backend'}}, {'url': 
'rbd://2c5417c9-00eb-57d5-a565-ddecbc7995c1/images/a2def208-be38-4da4-a3f2-d5c5045455ca/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m Nov 28 05:02:16 localhost nova_compute[279673]: 2025-11-28 10:02:16.596 279685 DEBUG nova.virt.libvirt.imagebackend [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Selected location: {'url': 'rbd://2c5417c9-00eb-57d5-a565-ddecbc7995c1/images/a2def208-be38-4da4-a3f2-d5c5045455ca/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m Nov 28 05:02:16 localhost nova_compute[279673]: 2025-11-28 10:02:16.597 279685 DEBUG nova.storage.rbd_utils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] cloning images/a2def208-be38-4da4-a3f2-d5c5045455ca@snap to None/7292509e-f294-4159-96e5-22d4712df2a0_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m Nov 28 05:02:16 localhost nova_compute[279673]: 2025-11-28 10:02:16.824 279685 DEBUG oslo_concurrency.lockutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lock "e38b87c2132e46140fd3315a4a63ceb8d23e3d74" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.375s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:02:17 localhost nova_compute[279673]: 2025-11-28 10:02:17.101 279685 DEBUG nova.objects.instance [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lazy-loading 'migration_context' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr 
/usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 05:02:17 localhost nova_compute[279673]: 2025-11-28 10:02:17.203 279685 DEBUG nova.storage.rbd_utils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] flattening vms/7292509e-f294-4159-96e5-22d4712df2a0_disk flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m Nov 28 05:02:17 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Nov 28 05:02:17 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:02:17 localhost nova_compute[279673]: 2025-11-28 10:02:17.948 279685 DEBUG nova.virt.libvirt.driver [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Image rbd:vms/7292509e-f294-4159-96e5-22d4712df2a0_disk:id=openstack:conf=/etc/ceph/ceph.conf flattened successfully while unshelving instance. 
_try_fetch_image_cache /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11007#033[00m Nov 28 05:02:17 localhost nova_compute[279673]: 2025-11-28 10:02:17.948 279685 DEBUG nova.virt.libvirt.driver [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m Nov 28 05:02:17 localhost nova_compute[279673]: 2025-11-28 10:02:17.949 279685 DEBUG nova.virt.libvirt.driver [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Ensure instance console log exists: /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m Nov 28 05:02:17 localhost nova_compute[279673]: 2025-11-28 10:02:17.949 279685 DEBUG oslo_concurrency.lockutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:02:17 localhost nova_compute[279673]: 2025-11-28 10:02:17.950 279685 DEBUG oslo_concurrency.lockutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:02:17 localhost nova_compute[279673]: 2025-11-28 10:02:17.950 279685 DEBUG oslo_concurrency.lockutils [None 
req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:02:17 localhost nova_compute[279673]: 2025-11-28 10:02:17.953 279685 DEBUG nova.virt.libvirt.driver [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='',container_format='bare',created_at=2025-11-28T10:01:56Z,direct_url=,disk_format='raw',id=a2def208-be38-4da4-a3f2-d5c5045455ca,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-650509197-shelved',owner='a30386ba68ee46f4a1bac43cf415f3a4',properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=2025-11-28T10:02:12Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'encryption_options': None, 'guest_format': None, 'encryption_secret_uuid': None, 'boot_index': 0, 'device_type': 'disk', 'encrypted': False, 'image_id': '85968a96-5a0e-43a4-9c04-3954f640a7ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m Nov 28 05:02:17 localhost nova_compute[279673]: 2025-11-28 10:02:17.960 279685 WARNING nova.virt.libvirt.driver [None 
req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 05:02:17 localhost nova_compute[279673]: 2025-11-28 10:02:17.963 279685 DEBUG nova.virt.libvirt.host [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Searching host: 'np0005538513.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m Nov 28 05:02:17 localhost nova_compute[279673]: 2025-11-28 10:02:17.964 279685 DEBUG nova.virt.libvirt.host [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m Nov 28 05:02:17 localhost nova_compute[279673]: 2025-11-28 10:02:17.966 279685 DEBUG nova.virt.libvirt.host [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Searching host: 'np0005538513.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m Nov 28 05:02:17 localhost nova_compute[279673]: 2025-11-28 10:02:17.967 279685 DEBUG nova.virt.libvirt.host [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] CPU controller found on host. 
_has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m Nov 28 05:02:17 localhost nova_compute[279673]: 2025-11-28 10:02:17.967 279685 DEBUG nova.virt.libvirt.driver [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Nov 28 05:02:17 localhost nova_compute[279673]: 2025-11-28 10:02:17.968 279685 DEBUG nova.virt.hardware [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T09:59:43Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='98f289d4-5c06-4ab5-9089-7b580870d676',id=5,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2025-11-28T10:01:56Z,direct_url=,disk_format='raw',id=a2def208-be38-4da4-a3f2-d5c5045455ca,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-650509197-shelved',owner='a30386ba68ee46f4a1bac43cf415f3a4',properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=2025-11-28T10:02:12Z,virtual_size=,visibility=), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m Nov 28 05:02:17 localhost nova_compute[279673]: 2025-11-28 10:02:17.968 279685 DEBUG nova.virt.hardware [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints 
/usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m Nov 28 05:02:17 localhost nova_compute[279673]: 2025-11-28 10:02:17.969 279685 DEBUG nova.virt.hardware [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m Nov 28 05:02:17 localhost nova_compute[279673]: 2025-11-28 10:02:17.969 279685 DEBUG nova.virt.hardware [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m Nov 28 05:02:17 localhost nova_compute[279673]: 2025-11-28 10:02:17.970 279685 DEBUG nova.virt.hardware [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m Nov 28 05:02:17 localhost nova_compute[279673]: 2025-11-28 10:02:17.970 279685 DEBUG nova.virt.hardware [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m Nov 28 05:02:17 localhost nova_compute[279673]: 2025-11-28 10:02:17.970 279685 DEBUG nova.virt.hardware [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies 
/usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m Nov 28 05:02:17 localhost nova_compute[279673]: 2025-11-28 10:02:17.971 279685 DEBUG nova.virt.hardware [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m Nov 28 05:02:17 localhost nova_compute[279673]: 2025-11-28 10:02:17.971 279685 DEBUG nova.virt.hardware [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m Nov 28 05:02:17 localhost nova_compute[279673]: 2025-11-28 10:02:17.972 279685 DEBUG nova.virt.hardware [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m Nov 28 05:02:17 localhost nova_compute[279673]: 2025-11-28 10:02:17.972 279685 DEBUG nova.virt.hardware [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m Nov 28 05:02:17 localhost nova_compute[279673]: 2025-11-28 10:02:17.973 279685 DEBUG nova.objects.instance [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr 
/usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 05:02:17 localhost nova_compute[279673]: 2025-11-28 10:02:17.998 279685 DEBUG oslo_concurrency.processutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:02:18 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 05:02:18 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:02:18 localhost openstack_network_exporter[240658]: ERROR 10:02:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 05:02:18 localhost openstack_network_exporter[240658]: ERROR 10:02:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:02:18 localhost openstack_network_exporter[240658]: ERROR 10:02:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:02:18 localhost openstack_network_exporter[240658]: ERROR 10:02:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 05:02:18 localhost openstack_network_exporter[240658]: Nov 28 05:02:18 localhost openstack_network_exporter[240658]: ERROR 10:02:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 05:02:18 localhost openstack_network_exporter[240658]: Nov 28 05:02:18 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 28 05:02:18 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/3129734523' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 28 05:02:18 localhost nova_compute[279673]: 2025-11-28 10:02:18.521 279685 DEBUG oslo_concurrency.processutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.522s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:02:18 localhost nova_compute[279673]: 2025-11-28 10:02:18.566 279685 DEBUG nova.storage.rbd_utils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] rbd image 7292509e-f294-4159-96e5-22d4712df2a0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 28 05:02:18 localhost nova_compute[279673]: 2025-11-28 10:02:18.573 279685 DEBUG oslo_concurrency.processutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:02:19 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 28 05:02:19 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/3253647983' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 28 05:02:19 localhost nova_compute[279673]: 2025-11-28 10:02:19.036 279685 DEBUG oslo_concurrency.processutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:02:19 localhost nova_compute[279673]: 2025-11-28 10:02:19.039 279685 DEBUG nova.objects.instance [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 05:02:19 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e107 do_prune osdmap full prune enabled Nov 28 05:02:19 localhost nova_compute[279673]: 2025-11-28 10:02:19.063 279685 DEBUG nova.virt.libvirt.driver [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] End _get_guest_xml xml= [libvirt guest domain XML elided: the XML markup was stripped during log capture, leaving only repeated empty log prefixes; surviving fragments include uuid 7292509e-f294-4159-96e5-22d4712df2a0, name instance-00000007, 131072 KiB memory / 1 vCPU, display name tempest-UnshelveToHostMultiNodesTest-server-650509197, creation time 2025-11-28 10:02:17, project tempest-UnshelveToHostMultiNodesTest-426973173, sysinfo RDO / OpenStack Compute / 27.5.2-0.20250829104910.6f8decf.el9 / Virtual Machine, os type hvm, and RNG backend /dev/urandom] _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m Nov 28 05:02:19 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e108 e108: 6 total, 6 up, 6 in Nov 28 05:02:19 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e108: 6 total, 6 up, 6 in Nov 28 05:02:19 localhost nova_compute[279673]: 2025-11-28 10:02:19.134 279685 DEBUG nova.virt.libvirt.driver [None
req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m Nov 28 05:02:19 localhost nova_compute[279673]: 2025-11-28 10:02:19.135 279685 DEBUG nova.virt.libvirt.driver [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m Nov 28 05:02:19 localhost nova_compute[279673]: 2025-11-28 10:02:19.136 279685 INFO nova.virt.libvirt.driver [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Using config drive#033[00m Nov 28 05:02:19 localhost nova_compute[279673]: 2025-11-28 10:02:19.177 279685 DEBUG nova.storage.rbd_utils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] rbd image 7292509e-f294-4159-96e5-22d4712df2a0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 28 05:02:19 localhost nova_compute[279673]: 2025-11-28 10:02:19.210 279685 DEBUG nova.objects.instance [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 05:02:19 localhost nova_compute[279673]: 2025-11-28 10:02:19.258 279685 DEBUG nova.objects.instance [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 
cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lazy-loading 'keypairs' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 05:02:19 localhost nova_compute[279673]: 2025-11-28 10:02:19.324 279685 INFO nova.virt.libvirt.driver [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Creating config drive at /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0/disk.config#033[00m Nov 28 05:02:19 localhost nova_compute[279673]: 2025-11-28 10:02:19.333 279685 DEBUG oslo_concurrency.processutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwx8dh7be execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:02:19 localhost nova_compute[279673]: 2025-11-28 10:02:19.467 279685 DEBUG oslo_concurrency.processutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwx8dh7be" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:02:19 localhost nova_compute[279673]: 2025-11-28 10:02:19.514 279685 DEBUG nova.storage.rbd_utils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 
734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] rbd image 7292509e-f294-4159-96e5-22d4712df2a0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Nov 28 05:02:19 localhost nova_compute[279673]: 2025-11-28 10:02:19.519 279685 DEBUG oslo_concurrency.processutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0/disk.config 7292509e-f294-4159-96e5-22d4712df2a0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:02:19 localhost nova_compute[279673]: 2025-11-28 10:02:19.741 279685 DEBUG oslo_concurrency.processutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0/disk.config 7292509e-f294-4159-96e5-22d4712df2a0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.222s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:02:19 localhost nova_compute[279673]: 2025-11-28 10:02:19.742 279685 INFO nova.virt.libvirt.driver [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Deleting local config drive /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0/disk.config because it was imported into RBD.#033[00m Nov 28 05:02:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. 
Nov 28 05:02:19 localhost systemd-machined[83422]: New machine qemu-5-instance-00000007. Nov 28 05:02:19 localhost systemd[1]: Started Virtual Machine qemu-5-instance-00000007. Nov 28 05:02:19 localhost podman[311593]: 2025-11-28 10:02:19.864095777 +0000 UTC m=+0.097645859 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 28 05:02:19 localhost podman[311593]: 2025-11-28 10:02:19.879498488 +0000 UTC m=+0.113048600 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 
'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 05:02:19 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:02:19 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 05:02:19 localhost nova_compute[279673]: 2025-11-28 10:02:19.941 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:20 localhost nova_compute[279673]: 2025-11-28 10:02:20.027 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:20 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e108 do_prune osdmap full prune enabled Nov 28 05:02:20 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e109 e109: 6 total, 6 up, 6 in Nov 28 05:02:20 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e109: 6 total, 6 up, 6 in Nov 28 05:02:20 localhost nova_compute[279673]: 2025-11-28 10:02:20.213 279685 DEBUG nova.virt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Emitting event Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 28 05:02:20 localhost nova_compute[279673]: 2025-11-28 10:02:20.213 279685 INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] VM Resumed (Lifecycle Event)#033[00m Nov 28 05:02:20 localhost nova_compute[279673]: 2025-11-28 10:02:20.217 279685 DEBUG nova.compute.manager 
[None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance event wait completed in 0 seconds for wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m Nov 28 05:02:20 localhost nova_compute[279673]: 2025-11-28 10:02:20.218 279685 DEBUG nova.virt.libvirt.driver [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m Nov 28 05:02:20 localhost nova_compute[279673]: 2025-11-28 10:02:20.224 279685 INFO nova.virt.libvirt.driver [-] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance spawned successfully.#033[00m Nov 28 05:02:20 localhost nova_compute[279673]: 2025-11-28 10:02:20.238 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 28 05:02:20 localhost nova_compute[279673]: 2025-11-28 10:02:20.242 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Nov 28 05:02:20 localhost nova_compute[279673]: 2025-11-28 10:02:20.270 279685 INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] During sync_power_state the instance has a pending task 
(spawning). Skip.#033[00m Nov 28 05:02:20 localhost nova_compute[279673]: 2025-11-28 10:02:20.271 279685 DEBUG nova.virt.driver [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] Emitting event Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 28 05:02:20 localhost nova_compute[279673]: 2025-11-28 10:02:20.271 279685 INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] VM Started (Lifecycle Event)#033[00m Nov 28 05:02:20 localhost nova_compute[279673]: 2025-11-28 10:02:20.292 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 28 05:02:20 localhost nova_compute[279673]: 2025-11-28 10:02:20.297 279685 DEBUG nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Nov 28 05:02:20 localhost nova_compute[279673]: 2025-11-28 10:02:20.318 279685 INFO nova.compute.manager [None req-adfb225f-53ba-484d-ab6e-0fa6d1783373 - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] During sync_power_state the instance has a pending task (spawning). 
Skip.#033[00m Nov 28 05:02:20 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:02:20.442 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:02:20 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Nov 28 05:02:20 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:02:21 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e109 do_prune osdmap full prune enabled Nov 28 05:02:21 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e110 e110: 6 total, 6 up, 6 in Nov 28 05:02:21 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e110: 6 total, 6 up, 6 in Nov 28 05:02:21 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:02:21 localhost nova_compute[279673]: 2025-11-28 10:02:21.731 279685 DEBUG nova.compute.manager [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 28 05:02:21 localhost nova_compute[279673]: 2025-11-28 10:02:21.812 279685 DEBUG oslo_concurrency.lockutils [None req-ef7f9168-d2f1-48b0-ae22-eb775bb7590a 734ca2d323084815a7ec954d4e95f7c1 cee9624834b94e80a7bbe35b2f1a1739 - - default default] Lock "7292509e-f294-4159-96e5-22d4712df2a0" "released" by "nova.compute.manager.ComputeManager.unshelve_instance..do_unshelve_instance" :: held 6.624s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:02:21 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:02:21.908 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:02:22 localhost 
ceph-mon[292954]: mon.np0005538513@0(leader).osd e110 do_prune osdmap full prune enabled Nov 28 05:02:22 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e111 e111: 6 total, 6 up, 6 in Nov 28 05:02:22 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e111: 6 total, 6 up, 6 in Nov 28 05:02:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 05:02:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 05:02:22 localhost podman[311673]: 2025-11-28 10:02:22.847169824 +0000 UTC m=+0.072381715 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS) Nov 28 05:02:22 localhost systemd[1]: tmp-crun.wVsfFi.mount: Deactivated successfully. Nov 28 05:02:22 localhost podman[311674]: 2025-11-28 10:02:22.956394454 +0000 UTC m=+0.182396177 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125) Nov 28 05:02:22 localhost podman[311673]: 2025-11-28 10:02:22.990662116 +0000 UTC m=+0.215874057 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 28 05:02:23 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 05:02:23 localhost podman[311674]: 2025-11-28 10:02:23.043450188 +0000 UTC m=+0.269451961 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:02:23 localhost systemd[1]: 
ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 05:02:23 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e111 do_prune osdmap full prune enabled Nov 28 05:02:23 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e112 e112: 6 total, 6 up, 6 in Nov 28 05:02:23 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e112: 6 total, 6 up, 6 in Nov 28 05:02:23 localhost nova_compute[279673]: 2025-11-28 10:02:23.203 279685 DEBUG oslo_concurrency.lockutils [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Acquiring lock "7292509e-f294-4159-96e5-22d4712df2a0" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:02:23 localhost nova_compute[279673]: 2025-11-28 10:02:23.204 279685 DEBUG oslo_concurrency.lockutils [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lock "7292509e-f294-4159-96e5-22d4712df2a0" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:02:23 localhost nova_compute[279673]: 2025-11-28 10:02:23.204 279685 DEBUG oslo_concurrency.lockutils [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Acquiring lock "7292509e-f294-4159-96e5-22d4712df2a0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:02:23 localhost nova_compute[279673]: 2025-11-28 10:02:23.205 279685 DEBUG oslo_concurrency.lockutils [None 
req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lock "7292509e-f294-4159-96e5-22d4712df2a0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:02:23 localhost nova_compute[279673]: 2025-11-28 10:02:23.205 279685 DEBUG oslo_concurrency.lockutils [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lock "7292509e-f294-4159-96e5-22d4712df2a0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:02:23 localhost nova_compute[279673]: 2025-11-28 10:02:23.208 279685 INFO nova.compute.manager [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Terminating instance#033[00m Nov 28 05:02:23 localhost nova_compute[279673]: 2025-11-28 10:02:23.209 279685 DEBUG oslo_concurrency.lockutils [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Acquiring lock "refresh_cache-7292509e-f294-4159-96e5-22d4712df2a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 05:02:23 localhost nova_compute[279673]: 2025-11-28 10:02:23.210 279685 DEBUG oslo_concurrency.lockutils [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Acquired lock "refresh_cache-7292509e-f294-4159-96e5-22d4712df2a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 05:02:23 localhost 
nova_compute[279673]: 2025-11-28 10:02:23.211 279685 DEBUG nova.network.neutron [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m Nov 28 05:02:23 localhost nova_compute[279673]: 2025-11-28 10:02:23.433 279685 DEBUG nova.network.neutron [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m Nov 28 05:02:23 localhost nova_compute[279673]: 2025-11-28 10:02:23.607 279685 DEBUG nova.network.neutron [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 05:02:23 localhost nova_compute[279673]: 2025-11-28 10:02:23.641 279685 DEBUG oslo_concurrency.lockutils [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Releasing lock "refresh_cache-7292509e-f294-4159-96e5-22d4712df2a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 05:02:23 localhost nova_compute[279673]: 2025-11-28 10:02:23.642 279685 DEBUG nova.compute.manager [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Start destroying the instance on the hypervisor. 
_shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m Nov 28 05:02:23 localhost systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000007.scope: Deactivated successfully. Nov 28 05:02:23 localhost systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000007.scope: Consumed 4.092s CPU time. Nov 28 05:02:23 localhost systemd-machined[83422]: Machine qemu-5-instance-00000007 terminated. Nov 28 05:02:23 localhost nova_compute[279673]: 2025-11-28 10:02:23.866 279685 INFO nova.virt.libvirt.driver [-] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance destroyed successfully.#033[00m Nov 28 05:02:23 localhost nova_compute[279673]: 2025-11-28 10:02:23.867 279685 DEBUG nova.objects.instance [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lazy-loading 'resources' on Instance uuid 7292509e-f294-4159-96e5-22d4712df2a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 05:02:24 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e112 do_prune osdmap full prune enabled Nov 28 05:02:24 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e113 e113: 6 total, 6 up, 6 in Nov 28 05:02:24 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e113: 6 total, 6 up, 6 in Nov 28 05:02:24 localhost nova_compute[279673]: 2025-11-28 10:02:24.878 279685 INFO nova.virt.libvirt.driver [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Deleting instance files /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0_del#033[00m Nov 28 05:02:24 localhost nova_compute[279673]: 2025-11-28 10:02:24.879 279685 INFO nova.virt.libvirt.driver [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 
7292509e-f294-4159-96e5-22d4712df2a0] Deletion of /var/lib/nova/instances/7292509e-f294-4159-96e5-22d4712df2a0_del complete#033[00m Nov 28 05:02:24 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e113 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:02:24 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e113 do_prune osdmap full prune enabled Nov 28 05:02:24 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e114 e114: 6 total, 6 up, 6 in Nov 28 05:02:24 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e114: 6 total, 6 up, 6 in Nov 28 05:02:24 localhost nova_compute[279673]: 2025-11-28 10:02:24.932 279685 INFO nova.compute.manager [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Took 1.29 seconds to destroy the instance on the hypervisor.#033[00m Nov 28 05:02:24 localhost nova_compute[279673]: 2025-11-28 10:02:24.933 279685 DEBUG oslo.service.loopingcall [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m Nov 28 05:02:24 localhost nova_compute[279673]: 2025-11-28 10:02:24.933 279685 DEBUG nova.compute.manager [-] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m Nov 28 05:02:24 localhost nova_compute[279673]: 2025-11-28 10:02:24.933 279685 DEBUG nova.network.neutron [-] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m Nov 28 05:02:24 localhost nova_compute[279673]: 2025-11-28 10:02:24.969 279685 DEBUG nova.network.neutron [-] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m Nov 28 05:02:24 localhost nova_compute[279673]: 2025-11-28 10:02:24.978 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:24 localhost nova_compute[279673]: 2025-11-28 10:02:24.985 279685 DEBUG nova.network.neutron [-] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 05:02:25 localhost nova_compute[279673]: 2025-11-28 10:02:25.002 279685 INFO nova.compute.manager [-] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Took 0.07 seconds to deallocate network for instance.#033[00m Nov 28 05:02:25 localhost nova_compute[279673]: 2025-11-28 10:02:25.028 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:25 localhost nova_compute[279673]: 2025-11-28 10:02:25.052 279685 DEBUG oslo_concurrency.lockutils 
[None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:02:25 localhost nova_compute[279673]: 2025-11-28 10:02:25.053 279685 DEBUG oslo_concurrency.lockutils [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:02:25 localhost nova_compute[279673]: 2025-11-28 10:02:25.174 279685 DEBUG oslo_concurrency.processutils [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:02:25 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:02:25 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/2043476067' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:02:25 localhost nova_compute[279673]: 2025-11-28 10:02:25.628 279685 DEBUG oslo_concurrency.processutils [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:02:25 localhost nova_compute[279673]: 2025-11-28 10:02:25.636 279685 DEBUG nova.compute.provider_tree [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 05:02:25 localhost nova_compute[279673]: 2025-11-28 10:02:25.652 279685 DEBUG nova.scheduler.client.report [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 05:02:25 localhost nova_compute[279673]: 2025-11-28 10:02:25.674 279685 DEBUG oslo_concurrency.lockutils [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default 
default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.621s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:02:25 localhost nova_compute[279673]: 2025-11-28 10:02:25.710 279685 INFO nova.scheduler.client.report [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Deleted allocations for instance 7292509e-f294-4159-96e5-22d4712df2a0#033[00m Nov 28 05:02:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 05:02:25 localhost nova_compute[279673]: 2025-11-28 10:02:25.787 279685 DEBUG oslo_concurrency.lockutils [None req-ee4f872f-a4b9-432b-828c-ce36e53f2d72 28578129c91d407a92af609ba8bac430 a30386ba68ee46f4a1bac43cf415f3a4 - - default default] Lock "7292509e-f294-4159-96e5-22d4712df2a0" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.583s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:02:25 localhost podman[311759]: 2025-11-28 10:02:25.853878828 +0000 UTC m=+0.092943745 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': 
{'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 28 05:02:25 localhost podman[311759]: 2025-11-28 10:02:25.889808877 +0000 UTC m=+0.128873794 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', 
'/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible) Nov 28 05:02:25 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 05:02:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e114 do_prune osdmap full prune enabled Nov 28 05:02:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e115 e115: 6 total, 6 up, 6 in Nov 28 05:02:26 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e115: 6 total, 6 up, 6 in Nov 28 05:02:27 localhost nova_compute[279673]: 2025-11-28 10:02:27.023 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:27 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e115 do_prune osdmap full prune enabled Nov 28 05:02:27 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e116 e116: 6 total, 6 up, 6 in Nov 28 05:02:27 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e116: 6 total, 6 up, 6 in Nov 28 05:02:28 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e116 do_prune osdmap full prune enabled Nov 28 05:02:28 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e117 e117: 6 total, 6 up, 6 in Nov 28 05:02:28 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e117: 6 total, 6 up, 6 in Nov 28 05:02:29 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e117 do_prune osdmap full prune enabled Nov 28 05:02:29 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e118 e118: 6 total, 6 up, 6 in Nov 28 05:02:29 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e118: 6 total, 6 up, 6 in Nov 28 05:02:29 localhost neutron_sriov_agent[254147]: 2025-11-28 10:02:29.863 2 INFO neutron.agent.securitygroups_rpc [None req-163713b6-af4d-4d16-9097-b3cd54a25f68 078cec78b66d44acb2dcf304e572f2cf dba68040958c4e4c89f84cd27a771cd2 - - default default] Security group member updated ['a0bf5ab5-c355-48ac-a40e-9473d4858766']#033[00m Nov 28 05:02:29 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e118 _set_new_cache_sizes cache_size:1020054731 
inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:02:29 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e118 do_prune osdmap full prune enabled Nov 28 05:02:29 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e119 e119: 6 total, 6 up, 6 in Nov 28 05:02:29 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e119: 6 total, 6 up, 6 in Nov 28 05:02:30 localhost nova_compute[279673]: 2025-11-28 10:02:30.024 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:30 localhost nova_compute[279673]: 2025-11-28 10:02:30.032 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:30 localhost neutron_sriov_agent[254147]: 2025-11-28 10:02:30.382 2 INFO neutron.agent.securitygroups_rpc [None req-59eaff10-1680-4aeb-97dc-49cab4063acc 078cec78b66d44acb2dcf304e572f2cf dba68040958c4e4c89f84cd27a771cd2 - - default default] Security group member updated ['a0bf5ab5-c355-48ac-a40e-9473d4858766']#033[00m Nov 28 05:02:30 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:02:30.411 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:02:30 localhost nova_compute[279673]: 2025-11-28 10:02:30.415 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:30 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e119 do_prune osdmap full prune enabled Nov 28 05:02:30 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e120 e120: 6 total, 6 up, 6 in Nov 28 05:02:30 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e120: 6 total, 6 up, 6 in Nov 28 05:02:32 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e120 do_prune osdmap full prune enabled Nov 
28 05:02:32 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e121 e121: 6 total, 6 up, 6 in Nov 28 05:02:32 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e121: 6 total, 6 up, 6 in Nov 28 05:02:33 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:02:33.965 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:02:34 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:02:34.690 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:02:34 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:02:34 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e121 do_prune osdmap full prune enabled Nov 28 05:02:34 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e122 e122: 6 total, 6 up, 6 in Nov 28 05:02:34 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e122: 6 total, 6 up, 6 in Nov 28 05:02:35 localhost nova_compute[279673]: 2025-11-28 10:02:35.063 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:35 localhost nova_compute[279673]: 2025-11-28 10:02:35.067 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:35 localhost ovn_controller[152322]: 2025-11-28T10:02:35Z|00157|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:02:35 localhost nova_compute[279673]: 2025-11-28 10:02:35.538 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 05:02:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 05:02:35 localhost podman[311778]: 2025-11-28 10:02:35.858519463 +0000 UTC m=+0.092535552 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 05:02:35 localhost podman[311778]: 2025-11-28 10:02:35.893835376 +0000 UTC m=+0.127851485 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 28 05:02:35 localhost podman[311779]: 2025-11-28 10:02:35.907326393 +0000 UTC m=+0.137271115 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125) Nov 28 05:02:35 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 05:02:35 localhost podman[311779]: 2025-11-28 10:02:35.922537608 +0000 UTC m=+0.152482370 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd) Nov 28 05:02:35 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 05:02:36 localhost nova_compute[279673]: 2025-11-28 10:02:36.357 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:37 localhost nova_compute[279673]: 2025-11-28 10:02:37.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:02:37 localhost nova_compute[279673]: 2025-11-28 10:02:37.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:02:37 localhost nova_compute[279673]: 2025-11-28 10:02:37.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:02:37 localhost nova_compute[279673]: 2025-11-28 10:02:37.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 28 05:02:38 localhost nova_compute[279673]: 2025-11-28 10:02:38.864 279685 DEBUG nova.virt.driver [-] Emitting event Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Nov 28 05:02:38 localhost nova_compute[279673]: 2025-11-28 10:02:38.864 279685 INFO nova.compute.manager [-] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] VM Stopped (Lifecycle Event)#033[00m Nov 28 05:02:38 localhost nova_compute[279673]: 2025-11-28 10:02:38.888 279685 DEBUG nova.compute.manager [None req-92031a13-6d60-4444-b1e9-b0c375630dd1 - - - - - -] [instance: 7292509e-f294-4159-96e5-22d4712df2a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Nov 28 05:02:39 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 28 05:02:39 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.0 total, 600.0 interval#012Cumulative writes: 3084 writes, 26K keys, 3084 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.08 MB/s#012Cumulative WAL: 3084 writes, 3084 syncs, 1.00 writes per sync, written: 0.05 GB, 0.08 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 3084 writes, 26K keys, 3084 commit groups, 1.0 writes per commit group, ingest: 46.97 MB, 0.08 MB/s#012Interval WAL: 3084 writes, 3084 syncs, 1.00 writes per sync, written: 0.05 GB, 0.08 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 136.2 0.25 0.09 12 0.021 0 0 0.0 0.0#012 L6 1/0 15.37 MB 0.0 0.2 0.0 0.2 0.2 0.0 0.0 5.3 168.3 152.5 1.19 0.47 11 0.108 128K 5632 0.0 0.0#012 Sum 1/0 15.37 MB 0.0 0.2 0.0 0.2 0.2 0.0 0.0 6.3 139.0 149.7 1.44 0.56 23 0.062 128K 5632 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.2 0.0 0.2 0.2 0.0 0.0 6.3 139.4 150.1 1.43 0.56 22 0.065 128K 5632 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low 0/0 0.00 KB 0.0 0.2 0.0 0.2 0.2 0.0 0.0 0.0 168.3 152.5 1.19 0.47 11 0.108 128K 5632 0.0 0.0#012High 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 138.6 0.25 0.09 11 0.022 0 0 0.0 0.0#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.4 0.00 0.00 1 0.004 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.0 total, 600.0 interval#012Flush(GB): cumulative 0.033, interval 0.033#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.21 GB write, 0.36 MB/s write, 0.19 GB read, 0.33 MB/s read, 1.4 seconds#012Interval compaction: 0.21 GB write, 0.36 MB/s write, 0.19 GB read, 0.33 MB/s read, 1.4 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 
level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55b5bb679350#2 capacity: 308.00 MB usage: 46.03 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 0.000342 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3114,45.19 MB,14.6732%) FilterBlock(23,374.42 KB,0.118716%) IndexBlock(23,485.39 KB,0.153901%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] ** Nov 28 05:02:39 localhost nova_compute[279673]: 2025-11-28 10:02:39.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:02:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e122 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:02:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e122 do_prune osdmap full prune enabled Nov 28 05:02:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e123 e123: 6 total, 6 up, 6 in Nov 28 05:02:39 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e123: 6 total, 6 up, 6 in Nov 28 05:02:39 localhost neutron_sriov_agent[254147]: 2025-11-28 10:02:39.942 2 INFO neutron.agent.securitygroups_rpc [None req-c410e527-579f-4d7d-bb14-04bb4c79dd9f b97430f38d544448bcb1f84d60affd50 f23b7feb8db740db9eea6302444ed3a8 - - default default] Security group member updated ['84bc6ad8-56a1-4678-950f-738b55ff6708']#033[00m Nov 28 05:02:40 localhost podman[238687]: time="2025-11-28T10:02:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:02:40 localhost nova_compute[279673]: 2025-11-28 
10:02:40.089 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:40 localhost podman[238687]: @ - - [28/Nov/2025:10:02:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157512 "" "Go-http-client/1.1" Nov 28 05:02:40 localhost podman[238687]: @ - - [28/Nov/2025:10:02:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19749 "" "Go-http-client/1.1" Nov 28 05:02:40 localhost nova_compute[279673]: 2025-11-28 10:02:40.747 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:40 localhost nova_compute[279673]: 2025-11-28 10:02:40.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:02:41 localhost nova_compute[279673]: 2025-11-28 10:02:41.767 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:02:42 localhost nova_compute[279673]: 2025-11-28 10:02:42.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:02:42 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e123 do_prune osdmap full prune enabled Nov 28 05:02:42 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e124 e124: 6 total, 6 up, 6 in Nov 28 
05:02:42 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e124: 6 total, 6 up, 6 in Nov 28 05:02:43 localhost nova_compute[279673]: 2025-11-28 10:02:43.256 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:43 localhost nova_compute[279673]: 2025-11-28 10:02:43.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:02:43 localhost nova_compute[279673]: 2025-11-28 10:02:43.794 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:02:43 localhost nova_compute[279673]: 2025-11-28 10:02:43.794 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:02:43 localhost nova_compute[279673]: 2025-11-28 10:02:43.795 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:02:43 localhost nova_compute[279673]: 2025-11-28 10:02:43.795 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for 
np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 05:02:43 localhost nova_compute[279673]: 2025-11-28 10:02:43.796 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:02:44 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:02:44 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3290241926' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:02:44 localhost nova_compute[279673]: 2025-11-28 10:02:44.257 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:02:44 localhost nova_compute[279673]: 2025-11-28 10:02:44.335 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 05:02:44 localhost nova_compute[279673]: 2025-11-28 10:02:44.335 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 05:02:44 localhost nova_compute[279673]: 2025-11-28 10:02:44.546 279685 WARNING 
nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 05:02:44 localhost nova_compute[279673]: 2025-11-28 10:02:44.548 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11201MB free_disk=41.700096130371094GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", 
"numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 05:02:44 localhost nova_compute[279673]: 2025-11-28 10:02:44.549 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:02:44 localhost nova_compute[279673]: 2025-11-28 10:02:44.549 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:02:44 localhost nova_compute[279673]: 2025-11-28 10:02:44.598 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:44 localhost ovn_metadata_agent[158125]: 2025-11-28 10:02:44.599 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 
'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:02:44 localhost ovn_metadata_agent[158125]: 2025-11-28 10:02:44.600 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 28 05:02:44 localhost nova_compute[279673]: 2025-11-28 10:02:44.701 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 05:02:44 localhost nova_compute[279673]: 2025-11-28 10:02:44.702 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 05:02:44 localhost nova_compute[279673]: 2025-11-28 10:02:44.703 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 05:02:44 localhost nova_compute[279673]: 2025-11-28 10:02:44.759 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:02:44 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e124 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:02:45 localhost nova_compute[279673]: 2025-11-28 10:02:45.092 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:45 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:02:45 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1391681796' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:02:45 localhost nova_compute[279673]: 2025-11-28 10:02:45.180 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:02:45 localhost nova_compute[279673]: 2025-11-28 10:02:45.186 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 05:02:45 localhost nova_compute[279673]: 2025-11-28 10:02:45.212 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 
1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 05:02:45 localhost nova_compute[279673]: 2025-11-28 10:02:45.247 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 05:02:45 localhost nova_compute[279673]: 2025-11-28 10:02:45.248 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.698s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:02:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 05:02:45 localhost neutron_sriov_agent[254147]: 2025-11-28 10:02:45.846 2 INFO neutron.agent.securitygroups_rpc [None req-7370f7f5-c105-405f-816d-670eb41986b4 b97430f38d544448bcb1f84d60affd50 f23b7feb8db740db9eea6302444ed3a8 - - default default] Security group member updated ['84bc6ad8-56a1-4678-950f-738b55ff6708']#033[00m Nov 28 05:02:45 localhost systemd[1]: tmp-crun.jbaE0o.mount: Deactivated successfully. 
Nov 28 05:02:45 localhost podman[311866]: 2025-11-28 10:02:45.874282158 +0000 UTC m=+0.101472490 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) 
Nov 28 05:02:45 localhost podman[311866]: 2025-11-28 10:02:45.916644501 +0000 UTC m=+0.143834863 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, container_name=openstack_network_exporter, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.component=ubi9-minimal-container) Nov 28 05:02:45 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 05:02:46 localhost ovn_controller[152322]: 2025-11-28T10:02:46Z|00158|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:02:46 localhost nova_compute[279673]: 2025-11-28 10:02:46.578 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:46 localhost nova_compute[279673]: 2025-11-28 10:02:46.979 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:48 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e124 do_prune osdmap full prune enabled Nov 28 05:02:48 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e125 e125: 6 total, 6 up, 6 in Nov 28 05:02:48 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e125: 6 total, 6 up, 6 in Nov 28 05:02:48 localhost openstack_network_exporter[240658]: ERROR 10:02:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:02:48 localhost openstack_network_exporter[240658]: ERROR 10:02:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:02:48 localhost openstack_network_exporter[240658]: ERROR 10:02:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 05:02:48 localhost openstack_network_exporter[240658]: ERROR 10:02:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 05:02:48 localhost openstack_network_exporter[240658]: Nov 28 05:02:48 localhost openstack_network_exporter[240658]: ERROR 10:02:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 05:02:48 localhost openstack_network_exporter[240658]: Nov 28 05:02:48 localhost nova_compute[279673]: 2025-11-28 10:02:48.249 279685 DEBUG 
oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:02:48 localhost nova_compute[279673]: 2025-11-28 10:02:48.250 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 28 05:02:48 localhost nova_compute[279673]: 2025-11-28 10:02:48.302 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m Nov 28 05:02:49 localhost ovn_controller[152322]: 2025-11-28T10:02:49Z|00159|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:02:49 localhost nova_compute[279673]: 2025-11-28 10:02:49.498 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:49 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e125 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:02:50 localhost nova_compute[279673]: 2025-11-28 10:02:50.096 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:50 localhost ovn_controller[152322]: 2025-11-28T10:02:50Z|00160|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:02:50 localhost nova_compute[279673]: 2025-11-28 10:02:50.391 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 05:02:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:02:50.841 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:02:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:02:50.842 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:02:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:02:50.842 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:02:50 localhost systemd[1]: tmp-crun.7UaIji.mount: Deactivated successfully. 
Nov 28 05:02:50 localhost podman[311885]: 2025-11-28 10:02:50.849153167 +0000 UTC m=+0.085515032 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 28 05:02:50 localhost podman[311885]: 2025-11-28 10:02:50.88453118 +0000 UTC m=+0.120893035 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 05:02:50 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 05:02:51 localhost neutron_sriov_agent[254147]: 2025-11-28 10:02:51.013 2 INFO neutron.agent.securitygroups_rpc [req-bb7f0ac8-504e-4783-80de-f00563b1098a req-aad0b688-0986-452a-b92d-7d53ff4d1361 75ac6a26227c40ba81e61e610018d23f 1f9b84b894e641c4bee3ebcd1409ad9f - - default default] Security group member updated ['6deb8732-9203-448a-b0a5-cf6a0375d009']#033[00m Nov 28 05:02:51 localhost dnsmasq[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/addn_hosts - 1 addresses Nov 28 05:02:51 localhost dnsmasq-dhcp[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/host Nov 28 05:02:51 localhost dnsmasq-dhcp[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/opts Nov 28 05:02:51 localhost podman[311925]: 2025-11-28 10:02:51.29074383 +0000 UTC m=+0.048290515 container kill 804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:02:52 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:02:52.574 261084 INFO neutron.agent.linux.ip_lib [None req-03bc2737-23b5-4d2f-80f4-f2b9089fd4e7 - - - - - -] Device tap8c96f24c-a8 cannot be used as it has no MAC address#033[00m Nov 28 05:02:52 localhost nova_compute[279673]: 2025-11-28 10:02:52.632 279685 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:52 localhost kernel: device tap8c96f24c-a8 entered promiscuous mode Nov 28 05:02:52 localhost NetworkManager[5967]: [1764324172.6392] manager: (tap8c96f24c-a8): new Generic device (/org/freedesktop/NetworkManager/Devices/30) Nov 28 05:02:52 localhost ovn_controller[152322]: 2025-11-28T10:02:52Z|00161|binding|INFO|Claiming lport 8c96f24c-a809-498c-a368-b04d504c0694 for this chassis. Nov 28 05:02:52 localhost ovn_controller[152322]: 2025-11-28T10:02:52Z|00162|binding|INFO|8c96f24c-a809-498c-a368-b04d504c0694: Claiming unknown Nov 28 05:02:52 localhost nova_compute[279673]: 2025-11-28 10:02:52.639 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:52 localhost systemd-udevd[311956]: Network interface NamePolicy= disabled on kernel command line. Nov 28 05:02:52 localhost journal[227875]: ethtool ioctl error on tap8c96f24c-a8: No such device Nov 28 05:02:52 localhost ovn_controller[152322]: 2025-11-28T10:02:52Z|00163|binding|INFO|Setting lport 8c96f24c-a809-498c-a368-b04d504c0694 ovn-installed in OVS Nov 28 05:02:52 localhost nova_compute[279673]: 2025-11-28 10:02:52.678 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:52 localhost journal[227875]: ethtool ioctl error on tap8c96f24c-a8: No such device Nov 28 05:02:52 localhost journal[227875]: ethtool ioctl error on tap8c96f24c-a8: No such device Nov 28 05:02:52 localhost journal[227875]: ethtool ioctl error on tap8c96f24c-a8: No such device Nov 28 05:02:52 localhost journal[227875]: ethtool ioctl error on tap8c96f24c-a8: No such device Nov 28 05:02:52 localhost ovn_controller[152322]: 2025-11-28T10:02:52Z|00164|binding|INFO|Setting lport 8c96f24c-a809-498c-a368-b04d504c0694 up 
in Southbound Nov 28 05:02:52 localhost ovn_metadata_agent[158125]: 2025-11-28 10:02:52.701 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-2df2b9d7-92bb-4c3f-a2c7-b313541a7942', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2df2b9d7-92bb-4c3f-a2c7-b313541a7942', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2cfa67078b8440d0bf985b2a5e0e5558', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c3a1ad44-5623-4936-b8c1-f0d1c2dea95d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=8c96f24c-a809-498c-a368-b04d504c0694) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:02:52 localhost journal[227875]: ethtool ioctl error on tap8c96f24c-a8: No such device Nov 28 05:02:52 localhost ovn_metadata_agent[158125]: 2025-11-28 10:02:52.704 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 8c96f24c-a809-498c-a368-b04d504c0694 in datapath 2df2b9d7-92bb-4c3f-a2c7-b313541a7942 bound to our chassis#033[00m Nov 28 05:02:52 localhost ovn_metadata_agent[158125]: 2025-11-28 10:02:52.706 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2df2b9d7-92bb-4c3f-a2c7-b313541a7942 or it 
has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:02:52 localhost ovn_metadata_agent[158125]: 2025-11-28 10:02:52.707 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[40788f2e-889a-407c-9606-c131620f6e42]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:02:52 localhost journal[227875]: ethtool ioctl error on tap8c96f24c-a8: No such device Nov 28 05:02:52 localhost journal[227875]: ethtool ioctl error on tap8c96f24c-a8: No such device Nov 28 05:02:52 localhost nova_compute[279673]: 2025-11-28 10:02:52.716 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:52 localhost nova_compute[279673]: 2025-11-28 10:02:52.733 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:53 localhost podman[312027]: Nov 28 05:02:53 localhost podman[312027]: 2025-11-28 10:02:53.598393163 +0000 UTC m=+0.093867012 container create 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Nov 28 05:02:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. 
Nov 28 05:02:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 05:02:53 localhost systemd[1]: Started libpod-conmon-61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31.scope. Nov 28 05:02:53 localhost systemd[1]: tmp-crun.inI0BC.mount: Deactivated successfully. Nov 28 05:02:53 localhost podman[312027]: 2025-11-28 10:02:53.551744106 +0000 UTC m=+0.047217995 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:02:53 localhost systemd[1]: Started libcrun container. Nov 28 05:02:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b372ebb8673178fa80e5f637587aa00b946a13d3ac306690ce5e3c373a5ccdef/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:02:53 localhost podman[312041]: 2025-11-28 10:02:53.725429732 +0000 UTC m=+0.090675649 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller) Nov 28 05:02:53 localhost podman[312027]: 2025-11-28 10:02:53.733066241 +0000 UTC m=+0.228540090 container init 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:02:53 localhost podman[312027]: 2025-11-28 10:02:53.746884707 +0000 UTC m=+0.242358546 container start 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3) Nov 28 05:02:53 localhost dnsmasq[312077]: started, version 2.85 cachesize 150 Nov 28 05:02:53 localhost dnsmasq[312077]: DNS service limited to local subnets Nov 28 05:02:53 localhost dnsmasq[312077]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect 
inotify dumpfile Nov 28 05:02:53 localhost dnsmasq[312077]: warning: no upstream servers configured Nov 28 05:02:53 localhost dnsmasq-dhcp[312077]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 28 05:02:53 localhost dnsmasq[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/addn_hosts - 0 addresses Nov 28 05:02:53 localhost dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/host Nov 28 05:02:53 localhost dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/opts Nov 28 05:02:53 localhost podman[312042]: 2025-11-28 10:02:53.806897577 +0000 UTC m=+0.171715371 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Nov 28 05:02:53 localhost podman[312041]: 2025-11-28 10:02:53.840418037 +0000 UTC m=+0.205663944 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller) Nov 28 05:02:53 localhost systemd[1]: 
9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. Nov 28 05:02:53 localhost podman[312042]: 2025-11-28 10:02:53.891057718 +0000 UTC m=+0.255875462 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 28 
05:02:53 localhost neutron_sriov_agent[254147]: 2025-11-28 10:02:53.890 2 INFO neutron.agent.securitygroups_rpc [None req-ca5b8c5c-4a7b-4773-b7e8-8e9eb8c79737 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']#033[00m Nov 28 05:02:53 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 05:02:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:02:53.903 261084 INFO neutron.agent.dhcp.agent [None req-093128e4-5820-4fa4-94fb-6d9383a381b9 - - - - - -] DHCP configuration for ports {'a9422cdc-c436-4a76-bbaf-9159623fa972'} is completed#033[00m Nov 28 05:02:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:02:53.917 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:02:53Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=d6c2b851-701e-4d45-b5a9-391ca9c93d44, ip_allocation=immediate, mac_address=fa:16:3e:4b:a6:76, name=tempest-AllowedAddressPairTestJSON-485609447, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:02:50Z, description=, dns_domain=, id=2df2b9d7-92bb-4c3f-a2c7-b313541a7942, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-984695291, port_security_enabled=True, project_id=2cfa67078b8440d0bf985b2a5e0e5558, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=63025, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1126, status=ACTIVE, subnets=['b57966e7-777d-4ac7-b284-15bb4a0b37f9'], 
tags=[], tenant_id=2cfa67078b8440d0bf985b2a5e0e5558, updated_at=2025-11-28T10:02:51Z, vlan_transparent=None, network_id=2df2b9d7-92bb-4c3f-a2c7-b313541a7942, port_security_enabled=True, project_id=2cfa67078b8440d0bf985b2a5e0e5558, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d'], standard_attr_id=1154, status=DOWN, tags=[], tenant_id=2cfa67078b8440d0bf985b2a5e0e5558, updated_at=2025-11-28T10:02:53Z on network 2df2b9d7-92bb-4c3f-a2c7-b313541a7942#033[00m Nov 28 05:02:54 localhost dnsmasq[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/addn_hosts - 1 addresses Nov 28 05:02:54 localhost dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/host Nov 28 05:02:54 localhost dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/opts Nov 28 05:02:54 localhost podman[312106]: 2025-11-28 10:02:54.125476256 +0000 UTC m=+0.070756679 container kill 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:02:54 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:02:54.337 261084 INFO neutron.agent.dhcp.agent [None req-13c0a229-aff3-4670-960a-93731a131398 - - - - - -] DHCP configuration for ports {'d6c2b851-701e-4d45-b5a9-391ca9c93d44'} is completed#033[00m Nov 28 05:02:54 localhost ovn_metadata_agent[158125]: 2025-11-28 10:02:54.604 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 
command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:02:54 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e125 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:02:54 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e125 do_prune osdmap full prune enabled Nov 28 05:02:54 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e126 e126: 6 total, 6 up, 6 in Nov 28 05:02:54 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e126: 6 total, 6 up, 6 in Nov 28 05:02:55 localhost nova_compute[279673]: 2025-11-28 10:02:55.128 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:55 localhost dnsmasq[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/addn_hosts - 0 addresses Nov 28 05:02:55 localhost podman[312143]: 2025-11-28 10:02:55.41617684 +0000 UTC m=+0.065287122 container kill 804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125) Nov 28 05:02:55 localhost dnsmasq-dhcp[310647]: read /var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/host Nov 28 05:02:55 localhost dnsmasq-dhcp[310647]: read 
/var/lib/neutron/dhcp/b1696f4c-80ce-491f-ad1c-cc7f5b6700ba/opts Nov 28 05:02:55 localhost nova_compute[279673]: 2025-11-28 10:02:55.723 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:55 localhost kernel: device tap516917c4-99 left promiscuous mode Nov 28 05:02:55 localhost ovn_controller[152322]: 2025-11-28T10:02:55Z|00165|binding|INFO|Releasing lport 516917c4-995e-4297-af25-c4f8499fcc7d from this chassis (sb_readonly=0) Nov 28 05:02:55 localhost ovn_controller[152322]: 2025-11-28T10:02:55Z|00166|binding|INFO|Setting lport 516917c4-995e-4297-af25-c4f8499fcc7d down in Southbound Nov 28 05:02:55 localhost ovn_metadata_agent[158125]: 2025-11-28 10:02:55.738 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1f9b84b894e641c4bee3ebcd1409ad9f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4106ac0-e782-4268-8bb4-37fc3096f0bc, chassis=[], tunnel_key=2, gateway_chassis=[], 
requested_chassis=[], logical_port=516917c4-995e-4297-af25-c4f8499fcc7d) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:02:55 localhost ovn_metadata_agent[158125]: 2025-11-28 10:02:55.740 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 516917c4-995e-4297-af25-c4f8499fcc7d in datapath b1696f4c-80ce-491f-ad1c-cc7f5b6700ba unbound from our chassis#033[00m Nov 28 05:02:55 localhost ovn_metadata_agent[158125]: 2025-11-28 10:02:55.742 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:02:55 localhost nova_compute[279673]: 2025-11-28 10:02:55.743 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:02:55 localhost ovn_metadata_agent[158125]: 2025-11-28 10:02:55.745 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[80c990ad-9bc1-4866-b35d-79c4592f8261]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:02:55 localhost neutron_sriov_agent[254147]: 2025-11-28 10:02:55.789 2 INFO neutron.agent.securitygroups_rpc [None req-0a1122e3-48a9-4fdd-9791-f33fb613b799 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']#033[00m Nov 28 05:02:55 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:02:55.837 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:02:55Z, description=, device_id=, device_owner=, 
dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=6391e909-e95a-4e76-b60a-9bc64a9f9f1b, ip_allocation=immediate, mac_address=fa:16:3e:b5:5b:ef, name=tempest-AllowedAddressPairTestJSON-1597087428, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:02:50Z, description=, dns_domain=, id=2df2b9d7-92bb-4c3f-a2c7-b313541a7942, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-984695291, port_security_enabled=True, project_id=2cfa67078b8440d0bf985b2a5e0e5558, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=63025, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1126, status=ACTIVE, subnets=['b57966e7-777d-4ac7-b284-15bb4a0b37f9'], tags=[], tenant_id=2cfa67078b8440d0bf985b2a5e0e5558, updated_at=2025-11-28T10:02:51Z, vlan_transparent=None, network_id=2df2b9d7-92bb-4c3f-a2c7-b313541a7942, port_security_enabled=True, project_id=2cfa67078b8440d0bf985b2a5e0e5558, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d'], standard_attr_id=1175, status=DOWN, tags=[], tenant_id=2cfa67078b8440d0bf985b2a5e0e5558, updated_at=2025-11-28T10:02:55Z on network 2df2b9d7-92bb-4c3f-a2c7-b313541a7942#033[00m Nov 28 05:02:56 localhost podman[312184]: 2025-11-28 10:02:56.075528852 +0000 UTC m=+0.062667836 container kill 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:02:56 localhost dnsmasq[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/addn_hosts - 2 addresses Nov 28 05:02:56 localhost dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/host Nov 28 05:02:56 localhost dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/opts Nov 28 05:02:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 05:02:56 localhost podman[312197]: 2025-11-28 10:02:56.190355843 +0000 UTC m=+0.086587733 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', 
'/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125) Nov 28 05:02:56 localhost podman[312197]: 2025-11-28 10:02:56.206571358 +0000 UTC m=+0.102803248 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Nov 28 05:02:56 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. Nov 28 05:02:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:02:56.625 261084 INFO neutron.agent.dhcp.agent [None req-77bcac77-fb24-4edb-b698-1e97c40ced18 - - - - - -] DHCP configuration for ports {'6391e909-e95a-4e76-b60a-9bc64a9f9f1b'} is completed#033[00m Nov 28 05:02:57 localhost neutron_sriov_agent[254147]: 2025-11-28 10:02:57.780 2 INFO neutron.agent.securitygroups_rpc [None req-2d11ad2b-bc0b-4803-8bd7-bbf5b227318c 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']#033[00m Nov 28 05:02:58 localhost dnsmasq[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/addn_hosts - 1 addresses Nov 28 05:02:58 localhost dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/host Nov 28 05:02:58 localhost dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/opts Nov 28 05:02:58 localhost podman[312239]: 2025-11-28 10:02:58.114718513 +0000 UTC m=+0.061068921 container kill 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 28 05:02:58 localhost neutron_sriov_agent[254147]: 2025-11-28 10:02:58.910 2 INFO neutron.agent.securitygroups_rpc [None req-15306174-a853-47d1-9333-4213f5fad357 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']#033[00m Nov 28 05:02:59 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:02:59.035 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:02:58Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=d566639b-ef8e-4d02-a570-43c5e19e05b4, ip_allocation=immediate, mac_address=fa:16:3e:84:93:df, name=tempest-AllowedAddressPairTestJSON-2119547774, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:02:50Z, description=, dns_domain=, id=2df2b9d7-92bb-4c3f-a2c7-b313541a7942, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-984695291, port_security_enabled=True, project_id=2cfa67078b8440d0bf985b2a5e0e5558, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=63025, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1126, status=ACTIVE, subnets=['b57966e7-777d-4ac7-b284-15bb4a0b37f9'], tags=[], tenant_id=2cfa67078b8440d0bf985b2a5e0e5558, updated_at=2025-11-28T10:02:51Z, vlan_transparent=None, network_id=2df2b9d7-92bb-4c3f-a2c7-b313541a7942, port_security_enabled=True, project_id=2cfa67078b8440d0bf985b2a5e0e5558, 
qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d'], standard_attr_id=1179, status=DOWN, tags=[], tenant_id=2cfa67078b8440d0bf985b2a5e0e5558, updated_at=2025-11-28T10:02:58Z on network 2df2b9d7-92bb-4c3f-a2c7-b313541a7942#033[00m Nov 28 05:02:59 localhost dnsmasq[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/addn_hosts - 2 addresses Nov 28 05:02:59 localhost dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/host Nov 28 05:02:59 localhost dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/opts Nov 28 05:02:59 localhost podman[312277]: 2025-11-28 10:02:59.246537094 +0000 UTC m=+0.059107115 container kill 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2) Nov 28 05:02:59 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:02:59.499 261084 INFO neutron.agent.dhcp.agent [None req-6d8c023e-265a-4d6d-bb6b-9ffba3d27396 - - - - - -] DHCP configuration for ports {'d566639b-ef8e-4d02-a570-43c5e19e05b4'} is completed#033[00m Nov 28 05:02:59 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:03:00 localhost nova_compute[279673]: 2025-11-28 10:03:00.161 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:00 localhost neutron_sriov_agent[254147]: 2025-11-28 10:03:00.206 2 INFO neutron.agent.securitygroups_rpc [None req-f1e38bd4-3201-4ca6-aca5-e6cf8d3e47ff 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']#033[00m Nov 28 05:03:00 localhost ovn_controller[152322]: 2025-11-28T10:03:00Z|00167|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:03:00 localhost nova_compute[279673]: 2025-11-28 10:03:00.498 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:00 localhost dnsmasq[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/addn_hosts - 1 addresses Nov 28 05:03:00 localhost podman[312314]: 2025-11-28 10:03:00.550928359 +0000 UTC m=+0.068696470 container kill 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Nov 28 05:03:00 localhost dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/host Nov 28 05:03:00 localhost dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/opts Nov 28 05:03:00 localhost nova_compute[279673]: 2025-11-28 10:03:00.975 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:01 localhost dnsmasq[310647]: exiting on receipt of SIGTERM Nov 28 05:03:01 localhost podman[312352]: 2025-11-28 10:03:01.443953958 +0000 UTC m=+0.062399390 container kill 804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 28 05:03:01 localhost systemd[1]: libpod-804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd.scope: Deactivated successfully. Nov 28 05:03:01 localhost neutron_sriov_agent[254147]: 2025-11-28 10:03:01.468 2 INFO neutron.agent.securitygroups_rpc [None req-06213f27-8bbf-4f60-8df9-0ce6274952ed 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']#033[00m Nov 28 05:03:01 localhost podman[312365]: 2025-11-28 10:03:01.507118967 +0000 UTC m=+0.047297716 container died 804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:03:01 localhost neutron_dhcp_agent[261080]: 2025-11-28 
10:03:01.557 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:00Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=234a27f0-462c-462d-8b5e-a906ee88990b, ip_allocation=immediate, mac_address=fa:16:3e:4a:e2:fc, name=tempest-AllowedAddressPairTestJSON-632019140, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:02:50Z, description=, dns_domain=, id=2df2b9d7-92bb-4c3f-a2c7-b313541a7942, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-984695291, port_security_enabled=True, project_id=2cfa67078b8440d0bf985b2a5e0e5558, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=63025, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1126, status=ACTIVE, subnets=['b57966e7-777d-4ac7-b284-15bb4a0b37f9'], tags=[], tenant_id=2cfa67078b8440d0bf985b2a5e0e5558, updated_at=2025-11-28T10:02:51Z, vlan_transparent=None, network_id=2df2b9d7-92bb-4c3f-a2c7-b313541a7942, port_security_enabled=True, project_id=2cfa67078b8440d0bf985b2a5e0e5558, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d'], standard_attr_id=1184, status=DOWN, tags=[], tenant_id=2cfa67078b8440d0bf985b2a5e0e5558, updated_at=2025-11-28T10:03:01Z on network 2df2b9d7-92bb-4c3f-a2c7-b313541a7942#033[00m Nov 28 05:03:01 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd-userdata-shm.mount: Deactivated successfully. 
Nov 28 05:03:01 localhost systemd[1]: var-lib-containers-storage-overlay-7e03c2285c42fffa1cd27962b49feefea5575696a1d40702567a3737442c3ea1-merged.mount: Deactivated successfully. Nov 28 05:03:01 localhost podman[312365]: 2025-11-28 10:03:01.593479572 +0000 UTC m=+0.133658271 container cleanup 804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 28 05:03:01 localhost systemd[1]: libpod-conmon-804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd.scope: Deactivated successfully. Nov 28 05:03:01 localhost podman[312366]: 2025-11-28 10:03:01.624145261 +0000 UTC m=+0.157534295 container remove 804f30d65b97a841097825fe92768ab967c658a2f6c565925eb954ff8bd671fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b1696f4c-80ce-491f-ad1c-cc7f5b6700ba, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:03:01 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:01.652 261084 INFO neutron.agent.dhcp.agent [None req-59d44550-eaf9-4c77-8e92-857e51298a03 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:03:01 localhost dnsmasq[312077]: read 
/var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/addn_hosts - 2 addresses Nov 28 05:03:01 localhost podman[312409]: 2025-11-28 10:03:01.779126961 +0000 UTC m=+0.059203617 container kill 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 28 05:03:01 localhost dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/host Nov 28 05:03:01 localhost dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/opts Nov 28 05:03:02 localhost systemd[1]: run-netns-qdhcp\x2db1696f4c\x2d80ce\x2d491f\x2dad1c\x2dcc7f5b6700ba.mount: Deactivated successfully. 
Nov 28 05:03:02 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:02.754 261084 INFO neutron.agent.dhcp.agent [None req-4568dc87-1e38-4410-879d-5eb6d4a332bc - - - - - -] DHCP configuration for ports {'234a27f0-462c-462d-8b5e-a906ee88990b'} is completed#033[00m Nov 28 05:03:02 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:02.778 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:03:03 localhost neutron_sriov_agent[254147]: 2025-11-28 10:03:03.310 2 INFO neutron.agent.securitygroups_rpc [None req-58637b77-ae6c-405f-99c5-e20fa41f4923 f6a5516f43fb48ebaa16e4040dd82b84 cd7c5d213d924d3d9d4428db9d286082 - - default default] Security group member updated ['9f37640b-8d78-40f1-9b7c-3ec3fef04776']#033[00m Nov 28 05:03:04 localhost neutron_sriov_agent[254147]: 2025-11-28 10:03:04.504 2 INFO neutron.agent.securitygroups_rpc [None req-9701a6f5-02eb-46da-bd51-76f4153e4e2b db6b2950549b47a2a2693ecffb5083c4 cd9b97e6d04840f3a546b260a8ee9b24 - - default default] Security group member updated ['7c5e1d73-494f-47ff-9f16-a2cff6e79638']#033[00m Nov 28 05:03:04 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:03:05 localhost nova_compute[279673]: 2025-11-28 10:03:05.197 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:06 localhost neutron_sriov_agent[254147]: 2025-11-28 10:03:06.147 2 INFO neutron.agent.securitygroups_rpc [None req-42b0499f-37f4-4061-a4df-d49e7a70a2c4 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']#033[00m Nov 28 05:03:06 localhost nova_compute[279673]: 2025-11-28 10:03:06.634 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:06 localhost dnsmasq[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/addn_hosts - 1 addresses Nov 28 05:03:06 localhost podman[312444]: 2025-11-28 10:03:06.657109586 +0000 UTC m=+0.047601795 container kill 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125) Nov 28 05:03:06 localhost dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/host Nov 28 05:03:06 localhost dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/opts Nov 28 05:03:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 05:03:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 05:03:06 localhost systemd[1]: tmp-crun.ex1TCr.mount: Deactivated successfully. 
Nov 28 05:03:06 localhost podman[312461]: 2025-11-28 10:03:06.782352345 +0000 UTC m=+0.095091436 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=multipathd) Nov 28 05:03:06 localhost podman[312461]: 2025-11-28 10:03:06.822197446 +0000 UTC m=+0.134936537 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2) Nov 28 05:03:06 localhost podman[312459]: 2025-11-28 10:03:06.832649186 +0000 UTC m=+0.146027825 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, 
name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 05:03:06 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 05:03:06 localhost podman[312459]: 2025-11-28 10:03:06.846535184 +0000 UTC m=+0.159913823 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 05:03:06 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 05:03:07 localhost neutron_sriov_agent[254147]: 2025-11-28 10:03:07.276 2 INFO neutron.agent.securitygroups_rpc [None req-374ec1da-a6ee-43ec-aeb4-2a3037224eb2 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']#033[00m Nov 28 05:03:07 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:07.352 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:06Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=031c62e9-b76b-49bd-ad97-981393fcbd5a, ip_allocation=immediate, mac_address=fa:16:3e:91:ad:d2, name=tempest-AllowedAddressPairTestJSON-431195025, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:02:50Z, description=, dns_domain=, id=2df2b9d7-92bb-4c3f-a2c7-b313541a7942, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-984695291, port_security_enabled=True, project_id=2cfa67078b8440d0bf985b2a5e0e5558, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=63025, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1126, status=ACTIVE, subnets=['b57966e7-777d-4ac7-b284-15bb4a0b37f9'], tags=[], tenant_id=2cfa67078b8440d0bf985b2a5e0e5558, updated_at=2025-11-28T10:02:51Z, vlan_transparent=None, network_id=2df2b9d7-92bb-4c3f-a2c7-b313541a7942, port_security_enabled=True, project_id=2cfa67078b8440d0bf985b2a5e0e5558, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d'], standard_attr_id=1212, 
status=DOWN, tags=[], tenant_id=2cfa67078b8440d0bf985b2a5e0e5558, updated_at=2025-11-28T10:03:07Z on network 2df2b9d7-92bb-4c3f-a2c7-b313541a7942#033[00m Nov 28 05:03:07 localhost neutron_sriov_agent[254147]: 2025-11-28 10:03:07.567 2 INFO neutron.agent.securitygroups_rpc [None req-5f1d0dc7-c78c-4e13-8de3-56bbcc932539 db6b2950549b47a2a2693ecffb5083c4 cd9b97e6d04840f3a546b260a8ee9b24 - - default default] Security group member updated ['7c5e1d73-494f-47ff-9f16-a2cff6e79638']#033[00m Nov 28 05:03:07 localhost dnsmasq[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/addn_hosts - 2 addresses Nov 28 05:03:07 localhost dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/host Nov 28 05:03:07 localhost dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/opts Nov 28 05:03:07 localhost podman[312520]: 2025-11-28 10:03:07.581224846 +0000 UTC m=+0.063876521 container kill 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 28 05:03:07 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:07.840 261084 INFO neutron.agent.dhcp.agent [None req-af679853-d0d1-4a54-8ee3-c61a41514b87 - - - - - -] DHCP configuration for ports {'031c62e9-b76b-49bd-ad97-981393fcbd5a'} is completed#033[00m Nov 28 05:03:08 localhost neutron_sriov_agent[254147]: 2025-11-28 10:03:08.751 2 INFO neutron.agent.securitygroups_rpc [None req-be0492e2-ff74-4faa-8249-9d4640988efe f6a5516f43fb48ebaa16e4040dd82b84 
cd7c5d213d924d3d9d4428db9d286082 - - default default] Security group member updated ['9f37640b-8d78-40f1-9b7c-3ec3fef04776']#033[00m Nov 28 05:03:08 localhost neutron_sriov_agent[254147]: 2025-11-28 10:03:08.963 2 INFO neutron.agent.securitygroups_rpc [None req-f639edd5-343d-4ae3-8fa2-2054bebb498d 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']#033[00m Nov 28 05:03:09 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:09.020 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:08Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=028280bb-2b70-45cf-b0a6-9521d60732bb, ip_allocation=immediate, mac_address=fa:16:3e:91:43:ca, name=tempest-AllowedAddressPairTestJSON-1463569347, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:02:50Z, description=, dns_domain=, id=2df2b9d7-92bb-4c3f-a2c7-b313541a7942, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-984695291, port_security_enabled=True, project_id=2cfa67078b8440d0bf985b2a5e0e5558, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=63025, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1126, status=ACTIVE, subnets=['b57966e7-777d-4ac7-b284-15bb4a0b37f9'], tags=[], tenant_id=2cfa67078b8440d0bf985b2a5e0e5558, updated_at=2025-11-28T10:02:51Z, vlan_transparent=None, network_id=2df2b9d7-92bb-4c3f-a2c7-b313541a7942, port_security_enabled=True, project_id=2cfa67078b8440d0bf985b2a5e0e5558, qos_network_policy_id=None, 
qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d'], standard_attr_id=1229, status=DOWN, tags=[], tenant_id=2cfa67078b8440d0bf985b2a5e0e5558, updated_at=2025-11-28T10:03:08Z on network 2df2b9d7-92bb-4c3f-a2c7-b313541a7942#033[00m Nov 28 05:03:09 localhost dnsmasq[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/addn_hosts - 3 addresses Nov 28 05:03:09 localhost podman[312555]: 2025-11-28 10:03:09.250980971 +0000 UTC m=+0.061111331 container kill 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:03:09 localhost dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/host Nov 28 05:03:09 localhost dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/opts Nov 28 05:03:09 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:09.487 261084 INFO neutron.agent.dhcp.agent [None req-26475625-a8b9-4c3f-9d34-d8584b8cb346 - - - - - -] DHCP configuration for ports {'028280bb-2b70-45cf-b0a6-9521d60732bb'} is completed#033[00m Nov 28 05:03:09 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:03:10 localhost podman[238687]: time="2025-11-28T10:03:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:03:10 localhost podman[238687]: @ - - [28/Nov/2025:10:03:10 
+0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157512 "" "Go-http-client/1.1" Nov 28 05:03:10 localhost podman[238687]: @ - - [28/Nov/2025:10:03:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19739 "" "Go-http-client/1.1" Nov 28 05:03:10 localhost nova_compute[279673]: 2025-11-28 10:03:10.229 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:11 localhost neutron_sriov_agent[254147]: 2025-11-28 10:03:11.431 2 INFO neutron.agent.securitygroups_rpc [None req-cc15aeb8-86ce-4ade-b16e-7c5f404511cd 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']#033[00m Nov 28 05:03:11 localhost dnsmasq[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/addn_hosts - 2 addresses Nov 28 05:03:11 localhost dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/host Nov 28 05:03:11 localhost dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/opts Nov 28 05:03:11 localhost podman[312594]: 2025-11-28 10:03:11.684306366 +0000 UTC m=+0.059999630 container kill 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true) Nov 28 05:03:12 localhost neutron_sriov_agent[254147]: 2025-11-28 10:03:12.069 2 
INFO neutron.agent.securitygroups_rpc [None req-93eb68a4-7d7e-4f26-af38-fff447267025 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']#033[00m Nov 28 05:03:12 localhost podman[312631]: 2025-11-28 10:03:12.330941704 +0000 UTC m=+0.060001750 container kill 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:03:12 localhost dnsmasq[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/addn_hosts - 1 addresses Nov 28 05:03:12 localhost dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/host Nov 28 05:03:12 localhost dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/opts Nov 28 05:03:12 localhost neutron_sriov_agent[254147]: 2025-11-28 10:03:12.503 2 INFO neutron.agent.securitygroups_rpc [None req-0547c360-35fd-496e-9dbb-6212e2de25bb 7dd8e21cfc81423e88248cb5fa529d85 2cfa67078b8440d0bf985b2a5e0e5558 - - default default] Security group member updated ['99b6bdbb-5b32-49a7-af1f-7638d6a8bc5d']#033[00m Nov 28 05:03:12 localhost systemd[1]: tmp-crun.q2zIoU.mount: Deactivated successfully. 
Nov 28 05:03:12 localhost dnsmasq[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/addn_hosts - 0 addresses Nov 28 05:03:12 localhost dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/host Nov 28 05:03:12 localhost podman[312668]: 2025-11-28 10:03:12.776759269 +0000 UTC m=+0.072292743 container kill 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 28 05:03:12 localhost dnsmasq-dhcp[312077]: read /var/lib/neutron/dhcp/2df2b9d7-92bb-4c3f-a2c7-b313541a7942/opts Nov 28 05:03:13 localhost dnsmasq[312077]: exiting on receipt of SIGTERM Nov 28 05:03:13 localhost podman[312705]: 2025-11-28 10:03:13.744198749 +0000 UTC m=+0.056517360 container kill 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true) Nov 28 05:03:13 localhost systemd[1]: libpod-61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31.scope: Deactivated successfully. 
Nov 28 05:03:13 localhost podman[312718]: 2025-11-28 10:03:13.817925233 +0000 UTC m=+0.056576623 container died 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Nov 28 05:03:13 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31-userdata-shm.mount: Deactivated successfully. Nov 28 05:03:13 localhost podman[312718]: 2025-11-28 10:03:13.852814372 +0000 UTC m=+0.091465692 container cleanup 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Nov 28 05:03:13 localhost systemd[1]: var-lib-containers-storage-overlay-b372ebb8673178fa80e5f637587aa00b946a13d3ac306690ce5e3c373a5ccdef-merged.mount: Deactivated successfully. Nov 28 05:03:13 localhost systemd[1]: libpod-conmon-61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31.scope: Deactivated successfully. 
Nov 28 05:03:13 localhost podman[312719]: 2025-11-28 10:03:13.882312428 +0000 UTC m=+0.117352824 container remove 61bddb1823422e4136ce6ad58a76f4bec746cf5dd6ba8ab5cf0b926c56f74e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2df2b9d7-92bb-4c3f-a2c7-b313541a7942, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3) Nov 28 05:03:13 localhost ovn_controller[152322]: 2025-11-28T10:03:13Z|00168|binding|INFO|Releasing lport 8c96f24c-a809-498c-a368-b04d504c0694 from this chassis (sb_readonly=0) Nov 28 05:03:13 localhost kernel: device tap8c96f24c-a8 left promiscuous mode Nov 28 05:03:13 localhost ovn_controller[152322]: 2025-11-28T10:03:13Z|00169|binding|INFO|Setting lport 8c96f24c-a809-498c-a368-b04d504c0694 down in Southbound Nov 28 05:03:13 localhost nova_compute[279673]: 2025-11-28 10:03:13.928 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:13 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:13.936 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-2df2b9d7-92bb-4c3f-a2c7-b313541a7942', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 
'neutron:network_name': 'neutron-2df2b9d7-92bb-4c3f-a2c7-b313541a7942', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2cfa67078b8440d0bf985b2a5e0e5558', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c3a1ad44-5623-4936-b8c1-f0d1c2dea95d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=8c96f24c-a809-498c-a368-b04d504c0694) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:03:13 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:13.938 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 8c96f24c-a809-498c-a368-b04d504c0694 in datapath 2df2b9d7-92bb-4c3f-a2c7-b313541a7942 unbound from our chassis#033[00m Nov 28 05:03:13 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:13.940 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2df2b9d7-92bb-4c3f-a2c7-b313541a7942, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:03:13 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:13.941 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[4b00684d-e2bc-45b8-bee6-a6fda47a3011]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:03:13 localhost nova_compute[279673]: 2025-11-28 10:03:13.951 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:14 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:14.324 261084 INFO neutron.agent.dhcp.agent [None 
req-f79aedbe-2d65-46e2-a66f-a5f1d6e784b4 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:03:14 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:14.430 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:03:14 localhost systemd[1]: run-netns-qdhcp\x2d2df2b9d7\x2d92bb\x2d4c3f\x2da2c7\x2db313541a7942.mount: Deactivated successfully. Nov 28 05:03:14 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:03:15 localhost nova_compute[279673]: 2025-11-28 10:03:15.260 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:15 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:15.541 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:03:16 localhost ovn_controller[152322]: 2025-11-28T10:03:16Z|00170|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:03:16 localhost nova_compute[279673]: 2025-11-28 10:03:16.143 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 05:03:16 localhost systemd[1]: tmp-crun.EOGvtA.mount: Deactivated successfully. 
Nov 28 05:03:16 localhost podman[312748]: 2025-11-28 10:03:16.854502374 +0000 UTC m=+0.091830513 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, name=ubi9-minimal, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, version=9.6, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, maintainer=Red Hat, Inc.) 
Nov 28 05:03:16 localhost podman[312748]: 2025-11-28 10:03:16.866380474 +0000 UTC m=+0.103708583 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, build-date=2025-08-20T13:12:41, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, 
io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, container_name=openstack_network_exporter) Nov 28 05:03:16 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. Nov 28 05:03:17 localhost nova_compute[279673]: 2025-11-28 10:03:17.716 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:18 localhost openstack_network_exporter[240658]: ERROR 10:03:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:03:18 localhost openstack_network_exporter[240658]: ERROR 10:03:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 05:03:18 localhost openstack_network_exporter[240658]: ERROR 10:03:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:03:18 localhost openstack_network_exporter[240658]: ERROR 10:03:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 05:03:18 localhost openstack_network_exporter[240658]: Nov 28 05:03:18 localhost openstack_network_exporter[240658]: ERROR 10:03:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 05:03:18 localhost openstack_network_exporter[240658]: Nov 28 05:03:18 localhost 
ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0) Nov 28 05:03:18 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:03:18 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0) Nov 28 05:03:18 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0) Nov 28 05:03:18 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:03:18 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0) Nov 28 05:03:18 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:03:18 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:03:18 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0) Nov 28 05:03:18 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:03:18 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0) Nov 28 05:03:18 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:03:19 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:19.005 261084 INFO neutron.agent.linux.ip_lib [None 
req-91921294-1053-4956-b053-0520fb8e8ee7 - - - - - -] Device tap341ca857-e3 cannot be used as it has no MAC address#033[00m Nov 28 05:03:19 localhost nova_compute[279673]: 2025-11-28 10:03:19.032 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:19 localhost kernel: device tap341ca857-e3 entered promiscuous mode Nov 28 05:03:19 localhost NetworkManager[5967]: [1764324199.0412] manager: (tap341ca857-e3): new Generic device (/org/freedesktop/NetworkManager/Devices/31) Nov 28 05:03:19 localhost ovn_controller[152322]: 2025-11-28T10:03:19Z|00171|binding|INFO|Claiming lport 341ca857-e376-4066-bea3-5b6fff39b2b6 for this chassis. Nov 28 05:03:19 localhost ovn_controller[152322]: 2025-11-28T10:03:19Z|00172|binding|INFO|341ca857-e376-4066-bea3-5b6fff39b2b6: Claiming unknown Nov 28 05:03:19 localhost systemd-udevd[312872]: Network interface NamePolicy= disabled on kernel command line. Nov 28 05:03:19 localhost nova_compute[279673]: 2025-11-28 10:03:19.048 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:19 localhost ovn_controller[152322]: 2025-11-28T10:03:19Z|00173|binding|INFO|Setting lport 341ca857-e376-4066-bea3-5b6fff39b2b6 ovn-installed in OVS Nov 28 05:03:19 localhost ovn_controller[152322]: 2025-11-28T10:03:19Z|00174|binding|INFO|Setting lport 341ca857-e376-4066-bea3-5b6fff39b2b6 up in Southbound Nov 28 05:03:19 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:19.057 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], 
ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=341ca857-e376-4066-bea3-5b6fff39b2b6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:03:19 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:19.059 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 341ca857-e376-4066-bea3-5b6fff39b2b6 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis#033[00m Nov 28 05:03:19 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:19.061 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:03:19 localhost nova_compute[279673]: 2025-11-28 10:03:19.060 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:19 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:19.062 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[94afafac-8fde-413b-9794-7db6872abb6b]: (4, False) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:03:19 localhost journal[227875]: ethtool ioctl error on tap341ca857-e3: No such device Nov 28 05:03:19 localhost journal[227875]: ethtool ioctl error on tap341ca857-e3: No such device Nov 28 05:03:19 localhost nova_compute[279673]: 2025-11-28 10:03:19.093 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:19 localhost journal[227875]: ethtool ioctl error on tap341ca857-e3: No such device Nov 28 05:03:19 localhost journal[227875]: ethtool ioctl error on tap341ca857-e3: No such device Nov 28 05:03:19 localhost journal[227875]: ethtool ioctl error on tap341ca857-e3: No such device Nov 28 05:03:19 localhost journal[227875]: ethtool ioctl error on tap341ca857-e3: No such device Nov 28 05:03:19 localhost journal[227875]: ethtool ioctl error on tap341ca857-e3: No such device Nov 28 05:03:19 localhost journal[227875]: ethtool ioctl error on tap341ca857-e3: No such device Nov 28 05:03:19 localhost nova_compute[279673]: 2025-11-28 10:03:19.136 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:19 localhost nova_compute[279673]: 2025-11-28 10:03:19.167 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:19 localhost neutron_sriov_agent[254147]: 2025-11-28 10:03:19.300 2 INFO neutron.agent.securitygroups_rpc [None req-0bd438b8-b072-41d3-bddf-9588300a9670 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:03:19 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:03:19 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
Nov 28 05:03:19 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:03:19 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:03:19 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:03:19 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:03:19 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Nov 28 05:03:19 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:03:19 localhost ovn_controller[152322]: 2025-11-28T10:03:19Z|00175|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:03:19 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:03:19 localhost nova_compute[279673]: 2025-11-28 10:03:19.989 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:20 localhost podman[312993]: Nov 28 05:03:20 localhost podman[312993]: 2025-11-28 10:03:20.099609479 +0000 UTC m=+0.092314637 container create dd91ed26c83f618177359c94ce41fb4e3d2f0f188d9ef2908e7eb143b229b2d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0) Nov 28 05:03:20 localhost podman[312993]: 
2025-11-28 10:03:20.056981397 +0000 UTC m=+0.049686585 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:03:20 localhost systemd[1]: Started libpod-conmon-dd91ed26c83f618177359c94ce41fb4e3d2f0f188d9ef2908e7eb143b229b2d3.scope. Nov 28 05:03:20 localhost systemd[1]: Started libcrun container. Nov 28 05:03:20 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0ba7633cf2477f6d6fa8cda99e361b32589ee760afefe1d9142670b2275abf0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:03:20 localhost podman[312993]: 2025-11-28 10:03:20.190335948 +0000 UTC m=+0.183041106 container init dd91ed26c83f618177359c94ce41fb4e3d2f0f188d9ef2908e7eb143b229b2d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:03:20 localhost podman[312993]: 2025-11-28 10:03:20.202889178 +0000 UTC m=+0.195594336 container start dd91ed26c83f618177359c94ce41fb4e3d2f0f188d9ef2908e7eb143b229b2d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 28 05:03:20 localhost dnsmasq[313012]: started, version 2.85 cachesize 150 Nov 28 05:03:20 
localhost dnsmasq[313012]: DNS service limited to local subnets Nov 28 05:03:20 localhost dnsmasq[313012]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:03:20 localhost dnsmasq[313012]: warning: no upstream servers configured Nov 28 05:03:20 localhost dnsmasq-dhcp[313012]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 28 05:03:20 localhost dnsmasq[313012]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:03:20 localhost dnsmasq-dhcp[313012]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:03:20 localhost dnsmasq-dhcp[313012]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:03:20 localhost nova_compute[279673]: 2025-11-28 10:03:20.263 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:20 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:20.263 261084 INFO neutron.agent.dhcp.agent [None req-91921294-1053-4956-b053-0520fb8e8ee7 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:18Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=335213db-753a-4eae-b67f-8acb9db5d4f0, ip_allocation=immediate, mac_address=fa:16:3e:2c:2b:04, name=tempest-NetworksTestDHCPv6-67632547, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, 
name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['b9394be9-c4f7-4b2c-bf87-38897cbd99e1'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:16Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], standard_attr_id=1289, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:18Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1#033[00m Nov 28 05:03:20 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:20.445 261084 INFO neutron.agent.dhcp.agent [None req-2884d28d-33a4-43c7-8d2a-630c3fc49618 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f'} is completed#033[00m Nov 28 05:03:20 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 05:03:20 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:03:20 localhost dnsmasq[313012]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 1 addresses Nov 28 05:03:20 localhost dnsmasq-dhcp[313012]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:03:20 localhost dnsmasq-dhcp[313012]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:03:20 localhost podman[313031]: 2025-11-28 10:03:20.487989598 +0000 UTC m=+0.067005431 container kill 
dd91ed26c83f618177359c94ce41fb4e3d2f0f188d9ef2908e7eb143b229b2d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 28 05:03:20 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Nov 28 05:03:20 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:03:20 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:20.885 261084 INFO neutron.agent.dhcp.agent [None req-a452ed1e-c53a-443b-8aa1-76af381123c8 - - - - - -] DHCP configuration for ports {'335213db-753a-4eae-b67f-8acb9db5d4f0'} is completed#033[00m Nov 28 05:03:20 localhost neutron_sriov_agent[254147]: 2025-11-28 10:03:20.918 2 INFO neutron.agent.securitygroups_rpc [None req-cc447c81-1a1f-4f5d-aa14-abdbefdf4620 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:03:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 05:03:21 localhost systemd[1]: tmp-crun.sIlLsp.mount: Deactivated successfully. 
Nov 28 05:03:21 localhost podman[313066]: 2025-11-28 10:03:21.108791615 +0000 UTC m=+0.095235830 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 05:03:21 localhost dnsmasq[313012]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:03:21 localhost dnsmasq-dhcp[313012]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:03:21 localhost dnsmasq-dhcp[313012]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:03:21 localhost podman[313081]: 2025-11-28 10:03:21.1292177 +0000 UTC m=+0.072042344 container kill dd91ed26c83f618177359c94ce41fb4e3d2f0f188d9ef2908e7eb143b229b2d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:03:21 localhost podman[313066]: 2025-11-28 10:03:21.193554784 +0000 UTC m=+0.179999029 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 28 05:03:21 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. 
Nov 28 05:03:21 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:03:22 localhost dnsmasq[313012]: exiting on receipt of SIGTERM Nov 28 05:03:22 localhost podman[313130]: 2025-11-28 10:03:22.096925199 +0000 UTC m=+0.063316185 container kill dd91ed26c83f618177359c94ce41fb4e3d2f0f188d9ef2908e7eb143b229b2d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Nov 28 05:03:22 localhost systemd[1]: libpod-dd91ed26c83f618177359c94ce41fb4e3d2f0f188d9ef2908e7eb143b229b2d3.scope: Deactivated successfully. Nov 28 05:03:22 localhost podman[313143]: 2025-11-28 10:03:22.173349539 +0000 UTC m=+0.061496293 container died dd91ed26c83f618177359c94ce41fb4e3d2f0f188d9ef2908e7eb143b229b2d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Nov 28 05:03:22 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dd91ed26c83f618177359c94ce41fb4e3d2f0f188d9ef2908e7eb143b229b2d3-userdata-shm.mount: Deactivated successfully. 
Nov 28 05:03:22 localhost systemd[1]: var-lib-containers-storage-overlay-e0ba7633cf2477f6d6fa8cda99e361b32589ee760afefe1d9142670b2275abf0-merged.mount: Deactivated successfully. Nov 28 05:03:22 localhost podman[313143]: 2025-11-28 10:03:22.212558082 +0000 UTC m=+0.100704806 container cleanup dd91ed26c83f618177359c94ce41fb4e3d2f0f188d9ef2908e7eb143b229b2d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 28 05:03:22 localhost systemd[1]: libpod-conmon-dd91ed26c83f618177359c94ce41fb4e3d2f0f188d9ef2908e7eb143b229b2d3.scope: Deactivated successfully. Nov 28 05:03:22 localhost podman[313145]: 2025-11-28 10:03:22.253042793 +0000 UTC m=+0.133088815 container remove dd91ed26c83f618177359c94ce41fb4e3d2f0f188d9ef2908e7eb143b229b2d3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 28 05:03:22 localhost ovn_controller[152322]: 2025-11-28T10:03:22Z|00176|binding|INFO|Releasing lport 341ca857-e376-4066-bea3-5b6fff39b2b6 from this chassis (sb_readonly=0) Nov 28 05:03:22 localhost ovn_controller[152322]: 2025-11-28T10:03:22Z|00177|binding|INFO|Setting lport 341ca857-e376-4066-bea3-5b6fff39b2b6 down in Southbound Nov 28 
05:03:22 localhost nova_compute[279673]: 2025-11-28 10:03:22.267 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:22 localhost kernel: device tap341ca857-e3 left promiscuous mode Nov 28 05:03:22 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:22.275 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=341ca857-e376-4066-bea3-5b6fff39b2b6) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:03:22 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:22.276 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 341ca857-e376-4066-bea3-5b6fff39b2b6 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis#033[00m Nov 28 05:03:22 
localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:22.278 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:03:22 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:22.279 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[cb36e133-3a03-4847-aa2b-87137c0cc891]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:03:22 localhost nova_compute[279673]: 2025-11-28 10:03:22.285 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:22 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:22.564 261084 INFO neutron.agent.dhcp.agent [None req-aca12473-8b6c-4395-81c5-de30dda27e96 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:03:23 localhost systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully. 
Nov 28 05:03:23 localhost nova_compute[279673]: 2025-11-28 10:03:23.456 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:23 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:23.897 261084 INFO neutron.agent.linux.ip_lib [None req-c53502c5-ba38-467c-ac18-97b685a13bc8 - - - - - -] Device tapbfd23fad-b1 cannot be used as it has no MAC address#033[00m Nov 28 05:03:23 localhost nova_compute[279673]: 2025-11-28 10:03:23.926 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:23 localhost kernel: device tapbfd23fad-b1 entered promiscuous mode Nov 28 05:03:23 localhost NetworkManager[5967]: [1764324203.9360] manager: (tapbfd23fad-b1): new Generic device (/org/freedesktop/NetworkManager/Devices/32) Nov 28 05:03:23 localhost ovn_controller[152322]: 2025-11-28T10:03:23Z|00178|binding|INFO|Claiming lport bfd23fad-b10d-4e29-b498-7b05b354a75f for this chassis. Nov 28 05:03:23 localhost ovn_controller[152322]: 2025-11-28T10:03:23Z|00179|binding|INFO|bfd23fad-b10d-4e29-b498-7b05b354a75f: Claiming unknown Nov 28 05:03:23 localhost nova_compute[279673]: 2025-11-28 10:03:23.936 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:23 localhost systemd-udevd[313182]: Network interface NamePolicy= disabled on kernel command line. 
Nov 28 05:03:23 localhost neutron_sriov_agent[254147]: 2025-11-28 10:03:23.946 2 INFO neutron.agent.securitygroups_rpc [None req-8c468440-8245-4890-91bf-66327309dae3 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:03:23 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:23.947 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=bfd23fad-b10d-4e29-b498-7b05b354a75f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:03:23 localhost ovn_controller[152322]: 2025-11-28T10:03:23Z|00180|binding|INFO|Setting lport bfd23fad-b10d-4e29-b498-7b05b354a75f ovn-installed in OVS Nov 28 05:03:23 localhost ovn_controller[152322]: 
2025-11-28T10:03:23Z|00181|binding|INFO|Setting lport bfd23fad-b10d-4e29-b498-7b05b354a75f up in Southbound Nov 28 05:03:23 localhost nova_compute[279673]: 2025-11-28 10:03:23.949 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:23 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:23.950 158130 INFO neutron.agent.ovn.metadata.agent [-] Port bfd23fad-b10d-4e29-b498-7b05b354a75f in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis#033[00m Nov 28 05:03:23 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:23.952 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:03:23 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:23.953 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[93ddd13a-2063-46ca-8119-fa9c6fa230ad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:03:23 localhost nova_compute[279673]: 2025-11-28 10:03:23.955 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:23 localhost journal[227875]: ethtool ioctl error on tapbfd23fad-b1: No such device Nov 28 05:03:23 localhost nova_compute[279673]: 2025-11-28 10:03:23.974 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:23 localhost journal[227875]: ethtool ioctl error on tapbfd23fad-b1: No such device Nov 28 05:03:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. 
Nov 28 05:03:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 05:03:23 localhost journal[227875]: ethtool ioctl error on tapbfd23fad-b1: No such device Nov 28 05:03:23 localhost journal[227875]: ethtool ioctl error on tapbfd23fad-b1: No such device Nov 28 05:03:23 localhost journal[227875]: ethtool ioctl error on tapbfd23fad-b1: No such device Nov 28 05:03:24 localhost journal[227875]: ethtool ioctl error on tapbfd23fad-b1: No such device Nov 28 05:03:24 localhost journal[227875]: ethtool ioctl error on tapbfd23fad-b1: No such device Nov 28 05:03:24 localhost journal[227875]: ethtool ioctl error on tapbfd23fad-b1: No such device Nov 28 05:03:24 localhost nova_compute[279673]: 2025-11-28 10:03:24.030 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:24 localhost nova_compute[279673]: 2025-11-28 10:03:24.068 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:24 localhost systemd[1]: tmp-crun.Kjgbt3.mount: Deactivated successfully. 
Nov 28 05:03:24 localhost podman[313193]: 2025-11-28 10:03:24.158760069 +0000 UTC m=+0.157876854 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS) Nov 28 05:03:24 localhost podman[313190]: 2025-11-28 10:03:24.11377379 +0000 UTC 
m=+0.116274623 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Nov 28 05:03:24 localhost podman[313193]: 2025-11-28 10:03:24.192518417 +0000 UTC m=+0.191635162 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:03:24 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 05:03:24 localhost podman[313190]: 2025-11-28 10:03:24.245596538 +0000 UTC m=+0.248097401 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:03:24 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 05:03:24 localhost neutron_sriov_agent[254147]: 2025-11-28 10:03:24.620 2 INFO neutron.agent.securitygroups_rpc [None req-6563d2b7-ae08-45e8-8b76-40044d8bfa2e 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:03:24 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:24.920 261084 INFO neutron.agent.linux.ip_lib [None req-67323245-aba1-4149-8266-c1f3b685ea24 - - - - - -] Device tap79491b70-fe cannot be used as it has no MAC address#033[00m Nov 28 05:03:24 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:03:24 localhost nova_compute[279673]: 2025-11-28 10:03:24.989 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:25 localhost kernel: device tap79491b70-fe entered promiscuous mode Nov 28 05:03:25 localhost nova_compute[279673]: 2025-11-28 10:03:24.996 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:25 localhost NetworkManager[5967]: [1764324204.9994] manager: (tap79491b70-fe): new Generic device (/org/freedesktop/NetworkManager/Devices/33) Nov 28 05:03:25 localhost ovn_controller[152322]: 2025-11-28T10:03:25Z|00182|binding|INFO|Claiming lport 79491b70-fe82-4673-a612-1252578cdd84 for this chassis. 
Nov 28 05:03:25 localhost ovn_controller[152322]: 2025-11-28T10:03:25Z|00183|binding|INFO|79491b70-fe82-4673-a612-1252578cdd84: Claiming unknown Nov 28 05:03:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:25.013 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-54d19915-3dc0-4577-b573-72119a0c141d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-54d19915-3dc0-4577-b573-72119a0c141d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c66e098e4fb4a349dc2bb4293454135', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db21ade0-fc80-4871-bcd6-f4301708978d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=79491b70-fe82-4673-a612-1252578cdd84) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:03:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:25.016 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 79491b70-fe82-4673-a612-1252578cdd84 in datapath 54d19915-3dc0-4577-b573-72119a0c141d bound to our chassis#033[00m Nov 28 05:03:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:25.018 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 
54d19915-3dc0-4577-b573-72119a0c141d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:03:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:25.021 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[abb44dec-798d-4342-8f08-fa81ea2e1949]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:03:25 localhost ovn_controller[152322]: 2025-11-28T10:03:25Z|00184|binding|INFO|Setting lport 79491b70-fe82-4673-a612-1252578cdd84 ovn-installed in OVS Nov 28 05:03:25 localhost ovn_controller[152322]: 2025-11-28T10:03:25Z|00185|binding|INFO|Setting lport 79491b70-fe82-4673-a612-1252578cdd84 up in Southbound Nov 28 05:03:25 localhost nova_compute[279673]: 2025-11-28 10:03:25.026 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:25 localhost nova_compute[279673]: 2025-11-28 10:03:25.051 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:25 localhost podman[313300]: Nov 28 05:03:25 localhost podman[313300]: 2025-11-28 10:03:25.084117054 +0000 UTC m=+0.142873295 container create f971bb1c82067a732a118ee8686514659117bc1baaa55465d213913a9a561507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125) Nov 28 05:03:25 localhost nova_compute[279673]: 2025-11-28 10:03:25.119 
279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:25 localhost systemd[1]: Started libpod-conmon-f971bb1c82067a732a118ee8686514659117bc1baaa55465d213913a9a561507.scope. Nov 28 05:03:25 localhost podman[313300]: 2025-11-28 10:03:25.050800929 +0000 UTC m=+0.109557240 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:03:25 localhost systemd[1]: Started libcrun container. Nov 28 05:03:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be1b76aa3a048e62f5201b64518592855d4798bf23964acf53818c2f01e55cdb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:03:25 localhost nova_compute[279673]: 2025-11-28 10:03:25.171 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:25 localhost podman[313300]: 2025-11-28 10:03:25.176267655 +0000 UTC m=+0.235023926 container init f971bb1c82067a732a118ee8686514659117bc1baaa55465d213913a9a561507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Nov 28 05:03:25 localhost systemd[1]: tmp-crun.95Cw9m.mount: Deactivated successfully. 
Nov 28 05:03:25 localhost podman[313300]: 2025-11-28 10:03:25.188375001 +0000 UTC m=+0.247131272 container start f971bb1c82067a732a118ee8686514659117bc1baaa55465d213913a9a561507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:03:25 localhost dnsmasq[313328]: started, version 2.85 cachesize 150 Nov 28 05:03:25 localhost dnsmasq[313328]: DNS service limited to local subnets Nov 28 05:03:25 localhost dnsmasq[313328]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:03:25 localhost dnsmasq[313328]: warning: no upstream servers configured Nov 28 05:03:25 localhost dnsmasq[313328]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:03:25 localhost nova_compute[279673]: 2025-11-28 10:03:25.267 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:25 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:25.396 261084 INFO neutron.agent.dhcp.agent [None req-d85c7343-5333-4b96-8f67-8f95e25fdc86 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f'} is completed#033[00m Nov 28 05:03:25 localhost dnsmasq[313328]: exiting on receipt of SIGTERM Nov 28 05:03:25 localhost podman[313359]: 2025-11-28 10:03:25.572969312 +0000 UTC m=+0.065373115 container kill f971bb1c82067a732a118ee8686514659117bc1baaa55465d213913a9a561507 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Nov 28 05:03:25 localhost systemd[1]: libpod-f971bb1c82067a732a118ee8686514659117bc1baaa55465d213913a9a561507.scope: Deactivated successfully. Nov 28 05:03:25 localhost podman[313376]: 2025-11-28 10:03:25.651640826 +0000 UTC m=+0.066200818 container died f971bb1c82067a732a118ee8686514659117bc1baaa55465d213913a9a561507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0) Nov 28 05:03:25 localhost podman[313376]: 2025-11-28 10:03:25.687001659 +0000 UTC m=+0.101561611 container cleanup f971bb1c82067a732a118ee8686514659117bc1baaa55465d213913a9a561507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 28 05:03:25 localhost systemd[1]: 
libpod-conmon-f971bb1c82067a732a118ee8686514659117bc1baaa55465d213913a9a561507.scope: Deactivated successfully. Nov 28 05:03:25 localhost podman[313381]: 2025-11-28 10:03:25.711859981 +0000 UTC m=+0.113268966 container remove f971bb1c82067a732a118ee8686514659117bc1baaa55465d213913a9a561507 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:03:25 localhost ovn_controller[152322]: 2025-11-28T10:03:25Z|00186|binding|INFO|Releasing lport bfd23fad-b10d-4e29-b498-7b05b354a75f from this chassis (sb_readonly=0) Nov 28 05:03:25 localhost ovn_controller[152322]: 2025-11-28T10:03:25Z|00187|binding|INFO|Setting lport bfd23fad-b10d-4e29-b498-7b05b354a75f down in Southbound Nov 28 05:03:25 localhost nova_compute[279673]: 2025-11-28 10:03:25.727 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:25 localhost kernel: device tapbfd23fad-b1 left promiscuous mode Nov 28 05:03:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:25.737 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 
'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=bfd23fad-b10d-4e29-b498-7b05b354a75f) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:03:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:25.739 158130 INFO neutron.agent.ovn.metadata.agent [-] Port bfd23fad-b10d-4e29-b498-7b05b354a75f in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis#033[00m Nov 28 05:03:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:25.741 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:03:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:25.747 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[173e3637-49b9-43f2-affe-e7412bd540cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:03:25 localhost nova_compute[279673]: 2025-11-28 10:03:25.750 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:25 
localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e126 do_prune osdmap full prune enabled Nov 28 05:03:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e127 e127: 6 total, 6 up, 6 in Nov 28 05:03:26 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e127: 6 total, 6 up, 6 in Nov 28 05:03:26 localhost podman[313432]: Nov 28 05:03:26 localhost podman[313432]: 2025-11-28 10:03:26.162960667 +0000 UTC m=+0.098394090 container create 7781e9a0f25490b6569a0afa5b659e21b181524239b45537a5ee198c7bd75ad5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-54d19915-3dc0-4577-b573-72119a0c141d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true) Nov 28 05:03:26 localhost systemd[1]: var-lib-containers-storage-overlay-be1b76aa3a048e62f5201b64518592855d4798bf23964acf53818c2f01e55cdb-merged.mount: Deactivated successfully. Nov 28 05:03:26 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f971bb1c82067a732a118ee8686514659117bc1baaa55465d213913a9a561507-userdata-shm.mount: Deactivated successfully. Nov 28 05:03:26 localhost systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully. Nov 28 05:03:26 localhost systemd[1]: Started libpod-conmon-7781e9a0f25490b6569a0afa5b659e21b181524239b45537a5ee198c7bd75ad5.scope. Nov 28 05:03:26 localhost podman[313432]: 2025-11-28 10:03:26.119634546 +0000 UTC m=+0.055067999 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:03:26 localhost systemd[1]: Started libcrun container. 
Nov 28 05:03:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 05:03:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c2133ceac09aa4671885348478e6b1807140f8a826fd1cc3877c75256248e37/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:03:26 localhost podman[313432]: 2025-11-28 10:03:26.257761344 +0000 UTC m=+0.193194777 container init 7781e9a0f25490b6569a0afa5b659e21b181524239b45537a5ee198c7bd75ad5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-54d19915-3dc0-4577-b573-72119a0c141d, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true) Nov 28 05:03:26 localhost podman[313432]: 2025-11-28 10:03:26.26772324 +0000 UTC m=+0.203156673 container start 7781e9a0f25490b6569a0afa5b659e21b181524239b45537a5ee198c7bd75ad5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-54d19915-3dc0-4577-b573-72119a0c141d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true) Nov 28 05:03:26 localhost dnsmasq[313457]: started, version 2.85 cachesize 150 Nov 28 05:03:26 localhost dnsmasq[313457]: DNS service limited to local subnets Nov 28 05:03:26 localhost dnsmasq[313457]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP 
no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:03:26 localhost dnsmasq[313457]: warning: no upstream servers configured Nov 28 05:03:26 localhost dnsmasq-dhcp[313457]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 28 05:03:26 localhost dnsmasq[313457]: read /var/lib/neutron/dhcp/54d19915-3dc0-4577-b573-72119a0c141d/addn_hosts - 0 addresses Nov 28 05:03:26 localhost dnsmasq-dhcp[313457]: read /var/lib/neutron/dhcp/54d19915-3dc0-4577-b573-72119a0c141d/host Nov 28 05:03:26 localhost dnsmasq-dhcp[313457]: read /var/lib/neutron/dhcp/54d19915-3dc0-4577-b573-72119a0c141d/opts Nov 28 05:03:26 localhost podman[313450]: 2025-11-28 10:03:26.3497456 +0000 UTC m=+0.092380839 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 28 05:03:26 localhost podman[313450]: 2025-11-28 10:03:26.368507687 +0000 UTC m=+0.111142896 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible) Nov 28 05:03:26 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. Nov 28 05:03:26 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:26.450 261084 INFO neutron.agent.dhcp.agent [None req-9d58455f-b9e4-4772-8ea1-75e0fc7bbaf4 - - - - - -] DHCP configuration for ports {'4bca3778-f7fe-4f52-a319-4ddc2deb73f1'} is completed#033[00m Nov 28 05:03:26 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:26.719 261084 INFO neutron.agent.linux.ip_lib [None req-7a7440a7-77a7-4562-ac0b-ec783591dc65 - - - - - -] Device tap80c11714-73 cannot be used as it has no MAC address#033[00m Nov 28 05:03:26 localhost neutron_sriov_agent[254147]: 2025-11-28 10:03:26.725 2 INFO neutron.agent.securitygroups_rpc [None req-a6c40294-bcff-4fbb-89ad-bea0e8a1937c 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:03:26 localhost nova_compute[279673]: 2025-11-28 10:03:26.751 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:26 localhost kernel: device tap80c11714-73 entered promiscuous mode Nov 28 05:03:26 localhost NetworkManager[5967]: [1764324206.7603] manager: (tap80c11714-73): new Generic device (/org/freedesktop/NetworkManager/Devices/34) Nov 28 05:03:26 localhost nova_compute[279673]: 2025-11-28 10:03:26.760 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:26 localhost 
ovn_controller[152322]: 2025-11-28T10:03:26Z|00188|binding|INFO|Claiming lport 80c11714-7320-4ace-8aff-6149fd7ecd71 for this chassis.
Nov 28 05:03:26 localhost ovn_controller[152322]: 2025-11-28T10:03:26Z|00189|binding|INFO|80c11714-7320-4ace-8aff-6149fd7ecd71: Claiming unknown
Nov 28 05:03:26 localhost ovn_controller[152322]: 2025-11-28T10:03:26Z|00190|binding|INFO|Setting lport 80c11714-7320-4ace-8aff-6149fd7ecd71 ovn-installed in OVS
Nov 28 05:03:26 localhost ovn_controller[152322]: 2025-11-28T10:03:26Z|00191|binding|INFO|Setting lport 80c11714-7320-4ace-8aff-6149fd7ecd71 up in Southbound
Nov 28 05:03:26 localhost nova_compute[279673]: 2025-11-28 10:03:26.774 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:03:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:26.776 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=80c11714-7320-4ace-8aff-6149fd7ecd71) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 05:03:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:26.778 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 80c11714-7320-4ace-8aff-6149fd7ecd71 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis
Nov 28 05:03:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:26.780 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 05:03:26 localhost nova_compute[279673]: 2025-11-28 10:03:26.782 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:03:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:26.781 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[6e34120a-c3ce-4acb-9672-84a308b3f888]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 05:03:26 localhost nova_compute[279673]: 2025-11-28 10:03:26.805 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:03:26 localhost nova_compute[279673]: 2025-11-28 10:03:26.852 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:03:26 localhost nova_compute[279673]: 2025-11-28 10:03:26.892 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:03:27 localhost neutron_sriov_agent[254147]: 2025-11-28 10:03:27.492 2 INFO neutron.agent.securitygroups_rpc [None req-9bac993d-08a2-4a7a-9741-5f6e8a523396 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 05:03:27 localhost podman[313537]:
Nov 28 05:03:27 localhost podman[313537]: 2025-11-28 10:03:27.758487536 +0000 UTC m=+0.086678735 container create c81095c088dbbd0ccb72cdf8259fd0da9bd35dd09ba7b44742077a1ff324e069 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 05:03:27 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:27.774 261084 INFO neutron.agent.linux.ip_lib [None req-d55d30d6-980a-4c20-8ecb-8a8df164c5ef - - - - - -] Device tap3e54b4c6-84 cannot be used as it has no MAC address
Nov 28 05:03:27 localhost podman[313537]: 2025-11-28 10:03:27.720978381 +0000 UTC m=+0.049169580 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 05:03:27 localhost systemd[1]: Started libpod-conmon-c81095c088dbbd0ccb72cdf8259fd0da9bd35dd09ba7b44742077a1ff324e069.scope.
Nov 28 05:03:27 localhost nova_compute[279673]: 2025-11-28 10:03:27.847 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:03:27 localhost kernel: device tap3e54b4c6-84 entered promiscuous mode
Nov 28 05:03:27 localhost NetworkManager[5967]: [1764324207.8577] manager: (tap3e54b4c6-84): new Generic device (/org/freedesktop/NetworkManager/Devices/35)
Nov 28 05:03:27 localhost ovn_controller[152322]: 2025-11-28T10:03:27Z|00192|binding|INFO|Claiming lport 3e54b4c6-8462-4da0-9951-b922d57575cf for this chassis.
Nov 28 05:03:27 localhost ovn_controller[152322]: 2025-11-28T10:03:27Z|00193|binding|INFO|3e54b4c6-8462-4da0-9951-b922d57575cf: Claiming unknown
Nov 28 05:03:27 localhost nova_compute[279673]: 2025-11-28 10:03:27.858 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:03:27 localhost nova_compute[279673]: 2025-11-28 10:03:27.862 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:03:27 localhost ovn_controller[152322]: 2025-11-28T10:03:27Z|00194|binding|INFO|Setting lport 3e54b4c6-8462-4da0-9951-b922d57575cf ovn-installed in OVS
Nov 28 05:03:27 localhost ovn_controller[152322]: 2025-11-28T10:03:27Z|00195|binding|INFO|Setting lport 3e54b4c6-8462-4da0-9951-b922d57575cf up in Southbound
Nov 28 05:03:27 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:27.880 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-3f9a6f97-9109-45cc-b3d8-12edbd83a346', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f9a6f97-9109-45cc-b3d8-12edbd83a346', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c66e098e4fb4a349dc2bb4293454135', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=34d43fdc-65ac-42a6-8e26-177d541f3791, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3e54b4c6-8462-4da0-9951-b922d57575cf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 05:03:27 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:27.882 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 3e54b4c6-8462-4da0-9951-b922d57575cf in datapath 3f9a6f97-9109-45cc-b3d8-12edbd83a346 bound to our chassis
Nov 28 05:03:27 localhost nova_compute[279673]: 2025-11-28 10:03:27.881 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:03:27 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:27.884 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 3f9a6f97-9109-45cc-b3d8-12edbd83a346 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 05:03:27 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:27.885 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[04909683-1f8b-4a44-b351-9b1b7bd3eb52]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 05:03:27 localhost systemd[1]: Started libcrun container.
Nov 28 05:03:27 localhost nova_compute[279673]: 2025-11-28 10:03:27.892 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:03:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/615202a384b5901bf010b31341d8cceb3dc86a8f244a7a54a09f31e43589443f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 05:03:27 localhost podman[313537]: 2025-11-28 10:03:27.905267471 +0000 UTC m=+0.233458670 container init c81095c088dbbd0ccb72cdf8259fd0da9bd35dd09ba7b44742077a1ff324e069 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 05:03:27 localhost podman[313537]: 2025-11-28 10:03:27.919810499 +0000 UTC m=+0.248001698 container start c81095c088dbbd0ccb72cdf8259fd0da9bd35dd09ba7b44742077a1ff324e069 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 05:03:27 localhost dnsmasq[313569]: started, version 2.85 cachesize 150
Nov 28 05:03:27 localhost dnsmasq[313569]: DNS service limited to local subnets
Nov 28 05:03:27 localhost dnsmasq[313569]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 05:03:27 localhost dnsmasq[313569]: warning: no upstream servers configured
Nov 28 05:03:27 localhost dnsmasq-dhcp[313569]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 05:03:27 localhost dnsmasq[313569]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 05:03:27 localhost dnsmasq-dhcp[313569]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 05:03:27 localhost dnsmasq-dhcp[313569]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 05:03:27 localhost nova_compute[279673]: 2025-11-28 10:03:27.933 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:03:27 localhost nova_compute[279673]: 2025-11-28 10:03:27.964 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:03:28 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e127 do_prune osdmap full prune enabled
Nov 28 05:03:28 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e128 e128: 6 total, 6 up, 6 in
Nov 28 05:03:28 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e128: 6 total, 6 up, 6 in
Nov 28 05:03:28 localhost systemd[1]: tmp-crun.6CQfIu.mount: Deactivated successfully.
Nov 28 05:03:28 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:28.381 261084 INFO neutron.agent.dhcp.agent [None req-dcc1db14-efa7-425a-8704-492431afa33a - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f'} is completed
Nov 28 05:03:28 localhost dnsmasq[313569]: exiting on receipt of SIGTERM
Nov 28 05:03:28 localhost podman[313609]: 2025-11-28 10:03:28.567466817 +0000 UTC m=+0.061244487 container kill c81095c088dbbd0ccb72cdf8259fd0da9bd35dd09ba7b44742077a1ff324e069 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 28 05:03:28 localhost systemd[1]: libpod-c81095c088dbbd0ccb72cdf8259fd0da9bd35dd09ba7b44742077a1ff324e069.scope: Deactivated successfully.
Nov 28 05:03:28 localhost podman[313626]: 2025-11-28 10:03:28.641997461 +0000 UTC m=+0.049257641 container died c81095c088dbbd0ccb72cdf8259fd0da9bd35dd09ba7b44742077a1ff324e069 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 05:03:28 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c81095c088dbbd0ccb72cdf8259fd0da9bd35dd09ba7b44742077a1ff324e069-userdata-shm.mount: Deactivated successfully.
Nov 28 05:03:28 localhost systemd[1]: var-lib-containers-storage-overlay-615202a384b5901bf010b31341d8cceb3dc86a8f244a7a54a09f31e43589443f-merged.mount: Deactivated successfully.
Nov 28 05:03:28 localhost podman[313626]: 2025-11-28 10:03:28.693027874 +0000 UTC m=+0.100287944 container remove c81095c088dbbd0ccb72cdf8259fd0da9bd35dd09ba7b44742077a1ff324e069 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 05:03:28 localhost systemd[1]: libpod-conmon-c81095c088dbbd0ccb72cdf8259fd0da9bd35dd09ba7b44742077a1ff324e069.scope: Deactivated successfully.
Nov 28 05:03:28 localhost ovn_controller[152322]: 2025-11-28T10:03:28Z|00196|binding|INFO|Releasing lport 80c11714-7320-4ace-8aff-6149fd7ecd71 from this chassis (sb_readonly=0)
Nov 28 05:03:28 localhost ovn_controller[152322]: 2025-11-28T10:03:28Z|00197|binding|INFO|Setting lport 80c11714-7320-4ace-8aff-6149fd7ecd71 down in Southbound
Nov 28 05:03:28 localhost kernel: device tap80c11714-73 left promiscuous mode
Nov 28 05:03:28 localhost nova_compute[279673]: 2025-11-28 10:03:28.714 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:03:28 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:28.720 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=80c11714-7320-4ace-8aff-6149fd7ecd71) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 05:03:28 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:28.722 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 80c11714-7320-4ace-8aff-6149fd7ecd71 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis
Nov 28 05:03:28 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:28.723 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 05:03:28 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:28.724 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[c8923e72-e116-493f-bede-b461113aa643]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 05:03:28 localhost nova_compute[279673]: 2025-11-28 10:03:28.735 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:03:29 localhost podman[313673]:
Nov 28 05:03:29 localhost podman[313673]: 2025-11-28 10:03:29.013728383 +0000 UTC m=+0.087311472 container create 2a094995f99ba4fc2f02a5aea73dae129d59818e0f4d45c7ceb22355b444231a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f9a6f97-9109-45cc-b3d8-12edbd83a346, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 05:03:29 localhost systemd[1]: Started libpod-conmon-2a094995f99ba4fc2f02a5aea73dae129d59818e0f4d45c7ceb22355b444231a.scope.
Nov 28 05:03:29 localhost podman[313673]: 2025-11-28 10:03:28.974401517 +0000 UTC m=+0.047984606 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 05:03:29 localhost systemd[1]: Started libcrun container.
Nov 28 05:03:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c6f48a9173b9ce2ea6c707fe91dbf14c53a35e5091245aaa244be183a6c7890a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 05:03:29 localhost podman[313673]: 2025-11-28 10:03:29.098668257 +0000 UTC m=+0.172251376 container init 2a094995f99ba4fc2f02a5aea73dae129d59818e0f4d45c7ceb22355b444231a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f9a6f97-9109-45cc-b3d8-12edbd83a346, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 05:03:29 localhost podman[313673]: 2025-11-28 10:03:29.108053896 +0000 UTC m=+0.181636995 container start 2a094995f99ba4fc2f02a5aea73dae129d59818e0f4d45c7ceb22355b444231a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f9a6f97-9109-45cc-b3d8-12edbd83a346, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 05:03:29 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:29.109 261084 INFO neutron.agent.dhcp.agent [None req-9416bd7d-ed78-4bf5-87f1-14425dcd69fc - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 05:03:29 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:29.111 261084 INFO neutron.agent.dhcp.agent [None req-9416bd7d-ed78-4bf5-87f1-14425dcd69fc - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 05:03:29 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:29.112 261084 INFO neutron.agent.dhcp.agent [None req-9416bd7d-ed78-4bf5-87f1-14425dcd69fc - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 05:03:29 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:29.112 261084 INFO neutron.agent.dhcp.agent [None req-9416bd7d-ed78-4bf5-87f1-14425dcd69fc - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 05:03:29 localhost dnsmasq[313691]: started, version 2.85 cachesize 150
Nov 28 05:03:29 localhost dnsmasq[313691]: DNS service limited to local subnets
Nov 28 05:03:29 localhost dnsmasq[313691]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 05:03:29 localhost dnsmasq[313691]: warning: no upstream servers configured
Nov 28 05:03:29 localhost dnsmasq-dhcp[313691]: DHCP, static leases only on 10.101.0.0, lease time 1d
Nov 28 05:03:29 localhost dnsmasq[313691]: read /var/lib/neutron/dhcp/3f9a6f97-9109-45cc-b3d8-12edbd83a346/addn_hosts - 0 addresses
Nov 28 05:03:29 localhost dnsmasq-dhcp[313691]: read /var/lib/neutron/dhcp/3f9a6f97-9109-45cc-b3d8-12edbd83a346/host
Nov 28 05:03:29 localhost dnsmasq-dhcp[313691]: read /var/lib/neutron/dhcp/3f9a6f97-9109-45cc-b3d8-12edbd83a346/opts
Nov 28 05:03:29 localhost systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully.
Nov 28 05:03:29 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:29.257 261084 INFO neutron.agent.dhcp.agent [None req-d7a37b2c-9460-4ffb-85ac-c14e00cc5088 - - - - - -] DHCP configuration for ports {'5d63853b-9b6d-4df6-a058-ea8bdb95fd89'} is completed
Nov 28 05:03:29 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:29.861 261084 INFO neutron.agent.linux.ip_lib [None req-ed4c1b1d-7cca-4932-a6ac-61674616df16 - - - - - -] Device tapb9229365-e6 cannot be used as it has no MAC address
Nov 28 05:03:29 localhost nova_compute[279673]: 2025-11-28 10:03:29.885 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:03:29 localhost kernel: device tapb9229365-e6 entered promiscuous mode
Nov 28 05:03:29 localhost NetworkManager[5967]: [1764324209.8924] manager: (tapb9229365-e6): new Generic device (/org/freedesktop/NetworkManager/Devices/36)
Nov 28 05:03:29 localhost ovn_controller[152322]: 2025-11-28T10:03:29Z|00198|binding|INFO|Claiming lport b9229365-e63f-47af-83a5-e34c4eab9b13 for this chassis.
Nov 28 05:03:29 localhost ovn_controller[152322]: 2025-11-28T10:03:29Z|00199|binding|INFO|b9229365-e63f-47af-83a5-e34c4eab9b13: Claiming unknown
Nov 28 05:03:29 localhost nova_compute[279673]: 2025-11-28 10:03:29.894 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:03:29 localhost ovn_controller[152322]: 2025-11-28T10:03:29Z|00200|binding|INFO|Setting lport b9229365-e63f-47af-83a5-e34c4eab9b13 up in Southbound
Nov 28 05:03:29 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:29.905 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b9229365-e63f-47af-83a5-e34c4eab9b13) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 05:03:29 localhost nova_compute[279673]: 2025-11-28 10:03:29.905 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:03:29 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:29.907 158130 INFO neutron.agent.ovn.metadata.agent [-] Port b9229365-e63f-47af-83a5-e34c4eab9b13 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis
Nov 28 05:03:29 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:29.909 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 05:03:29 localhost ovn_controller[152322]: 2025-11-28T10:03:29Z|00201|binding|INFO|Setting lport b9229365-e63f-47af-83a5-e34c4eab9b13 ovn-installed in OVS
Nov 28 05:03:29 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:29.910 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[2c9195d5-62a9-4e74-b1d9-07cf8f4b8fb6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 05:03:29 localhost nova_compute[279673]: 2025-11-28 10:03:29.913 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:03:29 localhost nova_compute[279673]: 2025-11-28 10:03:29.942 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:03:29 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 05:03:29 localhost nova_compute[279673]: 2025-11-28 10:03:29.981 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:03:30 localhost nova_compute[279673]: 2025-11-28 10:03:30.007 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:03:30 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e128 do_prune osdmap full prune enabled
Nov 28 05:03:30 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e129 e129: 6 total, 6 up, 6 in
Nov 28 05:03:30 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e129: 6 total, 6 up, 6 in
Nov 28 05:03:30 localhost neutron_sriov_agent[254147]: 2025-11-28 10:03:30.073 2 INFO neutron.agent.securitygroups_rpc [None req-a538bd0d-c0aa-4d14-8c4b-26de5d170843 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 05:03:30 localhost nova_compute[279673]: 2025-11-28 10:03:30.296 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:03:30 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:30.468 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:29Z, description=, device_id=95dd87c4-3547-4d63-b1b3-f7fac634736c, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=6f30b16c-3d15-4d00-b3e9-56746d8a041a, ip_allocation=immediate, mac_address=fa:16:3e:7e:00:68, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:21Z, description=, dns_domain=, id=54d19915-3dc0-4577-b573-72119a0c141d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-437254213, port_security_enabled=True, project_id=8c66e098e4fb4a349dc2bb4293454135, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=2845, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1311, status=ACTIVE, subnets=['7300da77-eea5-408a-91a4-c84afe3031ce'], tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:03:23Z, vlan_transparent=None, network_id=54d19915-3dc0-4577-b573-72119a0c141d, port_security_enabled=False, project_id=8c66e098e4fb4a349dc2bb4293454135, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1361, status=DOWN, tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:03:30Z on network 54d19915-3dc0-4577-b573-72119a0c141d
Nov 28 05:03:30 localhost podman[313750]: 2025-11-28 10:03:30.695709668 +0000 UTC m=+0.065508738 container kill 7781e9a0f25490b6569a0afa5b659e21b181524239b45537a5ee198c7bd75ad5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-54d19915-3dc0-4577-b573-72119a0c141d, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 05:03:30 localhost dnsmasq[313457]: read /var/lib/neutron/dhcp/54d19915-3dc0-4577-b573-72119a0c141d/addn_hosts - 1 addresses
Nov 28 05:03:30 localhost dnsmasq-dhcp[313457]: read /var/lib/neutron/dhcp/54d19915-3dc0-4577-b573-72119a0c141d/host
Nov 28 05:03:30 localhost dnsmasq-dhcp[313457]: read /var/lib/neutron/dhcp/54d19915-3dc0-4577-b573-72119a0c141d/opts
Nov 28 05:03:30 localhost podman[313793]:
Nov 28 05:03:31 localhost podman[313793]: 2025-11-28 10:03:31.00396308 +0000 UTC m=+0.105735180 container create 23ef835cb965ba09f5d90b64683eae3417f2eeaf3df24eac9516848d711d21ae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 05:03:31 localhost podman[313793]: 2025-11-28 10:03:30.956161481 +0000 UTC m=+0.057933621 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 05:03:31 localhost systemd[1]: Started libpod-conmon-23ef835cb965ba09f5d90b64683eae3417f2eeaf3df24eac9516848d711d21ae.scope.
Nov 28 05:03:31 localhost systemd[1]: Started libcrun container.
Nov 28 05:03:31 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b1a668778bcfd1edf34acdc7e1173cee88f0564f8beedbbd0a0b8935a85fd2a6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 05:03:31 localhost podman[313793]: 2025-11-28 10:03:31.18757677 +0000 UTC m=+0.289348880 container init 23ef835cb965ba09f5d90b64683eae3417f2eeaf3df24eac9516848d711d21ae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 05:03:31 localhost podman[313793]: 2025-11-28 10:03:31.193796999 +0000 UTC m=+0.295569109 container start 23ef835cb965ba09f5d90b64683eae3417f2eeaf3df24eac9516848d711d21ae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 05:03:31 localhost dnsmasq[313811]: started, version 2.85 cachesize 150
Nov 28 05:03:31 localhost dnsmasq[313811]: DNS service limited to local subnets
Nov 28 05:03:31 localhost dnsmasq[313811]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 05:03:31 localhost dnsmasq[313811]: warning: no upstream servers configured
Nov 28 05:03:31 localhost dnsmasq-dhcp[313811]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 05:03:31 localhost dnsmasq[313811]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 05:03:31 localhost dnsmasq-dhcp[313811]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 05:03:31 localhost dnsmasq-dhcp[313811]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 05:03:31 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:31.234 261084 INFO neutron.agent.dhcp.agent [None req-ed4c1b1d-7cca-4932-a6ac-61674616df16 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:29Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=eca23945-2aa5-4e91-a9eb-84685313cbf1, ip_allocation=immediate, mac_address=fa:16:3e:8d:8a:22, name=tempest-NetworksTestDHCPv6-1625906628, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=8, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['c6e666cb-138e-4fa7-b852-373dd89d6438'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:28Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], standard_attr_id=1360, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:29Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1
Nov 28 05:03:31 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:31.371 261084 INFO neutron.agent.dhcp.agent [None req-4519bccb-0d8b-4f79-94fa-1768aeafc3f9 - - - - - -] DHCP configuration for ports {'6f30b16c-3d15-4d00-b3e9-56746d8a041a'} is completed
Nov 28 05:03:31 localhost dnsmasq[313811]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 1 addresses
Nov 28 05:03:31 localhost dnsmasq-dhcp[313811]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 05:03:31 localhost dnsmasq-dhcp[313811]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 05:03:31 localhost podman[313828]: 2025-11-28 10:03:31.456440696 +0000 UTC m=+0.069018219 container kill 23ef835cb965ba09f5d90b64683eae3417f2eeaf3df24eac9516848d711d21ae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 05:03:31 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:31.526 261084 INFO neutron.agent.dhcp.agent [None req-03f01c79-972b-4a85-9494-52a9f9b8edd6 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f'} is completed
Nov 28 05:03:31 localhost neutron_dhcp_agent[261080]: 2025-11-28
10:03:31.762 261084 INFO neutron.agent.dhcp.agent [None req-b059a3c1-4b71-4dcc-8bdd-38a79446d36f - - - - - -] DHCP configuration for ports {'eca23945-2aa5-4e91-a9eb-84685313cbf1'} is completed#033[00m Nov 28 05:03:31 localhost ovn_controller[152322]: 2025-11-28T10:03:31Z|00202|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:03:31 localhost nova_compute[279673]: 2025-11-28 10:03:31.847 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:32 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e129 do_prune osdmap full prune enabled Nov 28 05:03:32 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e130 e130: 6 total, 6 up, 6 in Nov 28 05:03:32 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e130: 6 total, 6 up, 6 in Nov 28 05:03:32 localhost neutron_sriov_agent[254147]: 2025-11-28 10:03:32.349 2 INFO neutron.agent.securitygroups_rpc [None req-23b7a4db-87e9-4c7d-8b9d-380815f2adcd 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:03:32 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:32.364 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:29Z, description=, device_id=95dd87c4-3547-4d63-b1b3-f7fac634736c, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=6f30b16c-3d15-4d00-b3e9-56746d8a041a, ip_allocation=immediate, mac_address=fa:16:3e:7e:00:68, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:21Z, 
description=, dns_domain=, id=54d19915-3dc0-4577-b573-72119a0c141d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-437254213, port_security_enabled=True, project_id=8c66e098e4fb4a349dc2bb4293454135, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=2845, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1311, status=ACTIVE, subnets=['7300da77-eea5-408a-91a4-c84afe3031ce'], tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:03:23Z, vlan_transparent=None, network_id=54d19915-3dc0-4577-b573-72119a0c141d, port_security_enabled=False, project_id=8c66e098e4fb4a349dc2bb4293454135, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1361, status=DOWN, tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:03:30Z on network 54d19915-3dc0-4577-b573-72119a0c141d#033[00m Nov 28 05:03:32 localhost systemd[1]: tmp-crun.JhQVSQ.mount: Deactivated successfully. 
Nov 28 05:03:32 localhost dnsmasq[313457]: read /var/lib/neutron/dhcp/54d19915-3dc0-4577-b573-72119a0c141d/addn_hosts - 1 addresses Nov 28 05:03:32 localhost dnsmasq-dhcp[313457]: read /var/lib/neutron/dhcp/54d19915-3dc0-4577-b573-72119a0c141d/host Nov 28 05:03:32 localhost dnsmasq-dhcp[313457]: read /var/lib/neutron/dhcp/54d19915-3dc0-4577-b573-72119a0c141d/opts Nov 28 05:03:32 localhost podman[313881]: 2025-11-28 10:03:32.61696357 +0000 UTC m=+0.078970494 container kill 7781e9a0f25490b6569a0afa5b659e21b181524239b45537a5ee198c7bd75ad5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-54d19915-3dc0-4577-b573-72119a0c141d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:03:32 localhost dnsmasq[313811]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:03:32 localhost dnsmasq-dhcp[313811]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:03:32 localhost dnsmasq-dhcp[313811]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:03:32 localhost podman[313888]: 2025-11-28 10:03:32.677806913 +0000 UTC m=+0.122397568 container kill 23ef835cb965ba09f5d90b64683eae3417f2eeaf3df24eac9516848d711d21ae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:03:32 localhost systemd[1]: tmp-crun.OUcszA.mount: Deactivated successfully. Nov 28 05:03:32 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:32.933 261084 INFO neutron.agent.dhcp.agent [None req-ff8a999b-bb21-441c-af5b-99c50650f3ee - - - - - -] DHCP configuration for ports {'6f30b16c-3d15-4d00-b3e9-56746d8a041a'} is completed#033[00m Nov 28 05:03:33 localhost dnsmasq[313811]: exiting on receipt of SIGTERM Nov 28 05:03:33 localhost podman[313939]: 2025-11-28 10:03:33.336349133 +0000 UTC m=+0.064718566 container kill 23ef835cb965ba09f5d90b64683eae3417f2eeaf3df24eac9516848d711d21ae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:03:33 localhost systemd[1]: libpod-23ef835cb965ba09f5d90b64683eae3417f2eeaf3df24eac9516848d711d21ae.scope: Deactivated successfully. 
Nov 28 05:03:33 localhost podman[313951]: 2025-11-28 10:03:33.406362969 +0000 UTC m=+0.057498769 container died 23ef835cb965ba09f5d90b64683eae3417f2eeaf3df24eac9516848d711d21ae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 28 05:03:33 localhost podman[313951]: 2025-11-28 10:03:33.449918927 +0000 UTC m=+0.101054717 container cleanup 23ef835cb965ba09f5d90b64683eae3417f2eeaf3df24eac9516848d711d21ae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:03:33 localhost systemd[1]: libpod-conmon-23ef835cb965ba09f5d90b64683eae3417f2eeaf3df24eac9516848d711d21ae.scope: Deactivated successfully. 
Nov 28 05:03:33 localhost podman[313958]: 2025-11-28 10:03:33.500913168 +0000 UTC m=+0.137511071 container remove 23ef835cb965ba09f5d90b64683eae3417f2eeaf3df24eac9516848d711d21ae (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:03:33 localhost ovn_controller[152322]: 2025-11-28T10:03:33Z|00203|binding|INFO|Releasing lport b9229365-e63f-47af-83a5-e34c4eab9b13 from this chassis (sb_readonly=0) Nov 28 05:03:33 localhost ovn_controller[152322]: 2025-11-28T10:03:33Z|00204|binding|INFO|Setting lport b9229365-e63f-47af-83a5-e34c4eab9b13 down in Southbound Nov 28 05:03:33 localhost nova_compute[279673]: 2025-11-28 10:03:33.547 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:33 localhost kernel: device tapb9229365-e6 left promiscuous mode Nov 28 05:03:33 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:33.555 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 
'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b9229365-e63f-47af-83a5-e34c4eab9b13) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:03:33 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:33.557 158130 INFO neutron.agent.ovn.metadata.agent [-] Port b9229365-e63f-47af-83a5-e34c4eab9b13 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis#033[00m Nov 28 05:03:33 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:33.558 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:03:33 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:33.559 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[7dc8c26c-16ab-41eb-91a3-44fc82d3752e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:03:33 localhost nova_compute[279673]: 2025-11-28 10:03:33.575 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:33 localhost systemd[1]: 
var-lib-containers-storage-overlay-b1a668778bcfd1edf34acdc7e1173cee88f0564f8beedbbd0a0b8935a85fd2a6-merged.mount: Deactivated successfully. Nov 28 05:03:33 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-23ef835cb965ba09f5d90b64683eae3417f2eeaf3df24eac9516848d711d21ae-userdata-shm.mount: Deactivated successfully. Nov 28 05:03:33 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:33.769 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:33Z, description=, device_id=95dd87c4-3547-4d63-b1b3-f7fac634736c, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=037df326-df58-4d28-8dbc-25f9c11ef071, ip_allocation=immediate, mac_address=fa:16:3e:44:9d:d6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:22Z, description=, dns_domain=, id=3f9a6f97-9109-45cc-b3d8-12edbd83a346, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1083733682, port_security_enabled=True, project_id=8c66e098e4fb4a349dc2bb4293454135, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=61995, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1315, status=ACTIVE, subnets=['fc4d69a4-0dba-4d3f-a03f-18d4101010b3'], tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:03:26Z, vlan_transparent=None, network_id=3f9a6f97-9109-45cc-b3d8-12edbd83a346, port_security_enabled=False, project_id=8c66e098e4fb4a349dc2bb4293454135, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1375, status=DOWN, tags=[], 
tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:03:33Z on network 3f9a6f97-9109-45cc-b3d8-12edbd83a346#033[00m Nov 28 05:03:33 localhost systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully. Nov 28 05:03:33 localhost podman[313999]: 2025-11-28 10:03:33.988767827 +0000 UTC m=+0.064397096 container kill 2a094995f99ba4fc2f02a5aea73dae129d59818e0f4d45c7ceb22355b444231a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f9a6f97-9109-45cc-b3d8-12edbd83a346, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 28 05:03:33 localhost dnsmasq[313691]: read /var/lib/neutron/dhcp/3f9a6f97-9109-45cc-b3d8-12edbd83a346/addn_hosts - 1 addresses Nov 28 05:03:33 localhost dnsmasq-dhcp[313691]: read /var/lib/neutron/dhcp/3f9a6f97-9109-45cc-b3d8-12edbd83a346/host Nov 28 05:03:33 localhost dnsmasq-dhcp[313691]: read /var/lib/neutron/dhcp/3f9a6f97-9109-45cc-b3d8-12edbd83a346/opts Nov 28 05:03:34 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:34.504 261084 INFO neutron.agent.dhcp.agent [None req-243ada98-2bf5-439a-a2f3-e76a20daaaa2 - - - - - -] DHCP configuration for ports {'037df326-df58-4d28-8dbc-25f9c11ef071'} is completed#033[00m Nov 28 05:03:34 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:34.815 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:33Z, description=, device_id=95dd87c4-3547-4d63-b1b3-f7fac634736c, 
device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=037df326-df58-4d28-8dbc-25f9c11ef071, ip_allocation=immediate, mac_address=fa:16:3e:44:9d:d6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:22Z, description=, dns_domain=, id=3f9a6f97-9109-45cc-b3d8-12edbd83a346, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1083733682, port_security_enabled=True, project_id=8c66e098e4fb4a349dc2bb4293454135, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=61995, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1315, status=ACTIVE, subnets=['fc4d69a4-0dba-4d3f-a03f-18d4101010b3'], tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:03:26Z, vlan_transparent=None, network_id=3f9a6f97-9109-45cc-b3d8-12edbd83a346, port_security_enabled=False, project_id=8c66e098e4fb4a349dc2bb4293454135, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1375, status=DOWN, tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:03:33Z on network 3f9a6f97-9109-45cc-b3d8-12edbd83a346#033[00m Nov 28 05:03:34 localhost neutron_sriov_agent[254147]: 2025-11-28 10:03:34.880 2 INFO neutron.agent.securitygroups_rpc [None req-a1e00e91-b063-4693-9d8b-b7a005d16694 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:03:34 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:03:34 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e130 do_prune osdmap full prune enabled 
Nov 28 05:03:34 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e131 e131: 6 total, 6 up, 6 in Nov 28 05:03:34 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e131: 6 total, 6 up, 6 in Nov 28 05:03:35 localhost dnsmasq[313691]: read /var/lib/neutron/dhcp/3f9a6f97-9109-45cc-b3d8-12edbd83a346/addn_hosts - 1 addresses Nov 28 05:03:35 localhost dnsmasq-dhcp[313691]: read /var/lib/neutron/dhcp/3f9a6f97-9109-45cc-b3d8-12edbd83a346/host Nov 28 05:03:35 localhost podman[314040]: 2025-11-28 10:03:35.038522176 +0000 UTC m=+0.058236580 container kill 2a094995f99ba4fc2f02a5aea73dae129d59818e0f4d45c7ceb22355b444231a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f9a6f97-9109-45cc-b3d8-12edbd83a346, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Nov 28 05:03:35 localhost dnsmasq-dhcp[313691]: read /var/lib/neutron/dhcp/3f9a6f97-9109-45cc-b3d8-12edbd83a346/opts Nov 28 05:03:35 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:35.062 261084 INFO neutron.agent.linux.ip_lib [None req-ebecf48a-9039-4715-a56f-c608e77dba56 - - - - - -] Device tapd925f0b8-f1 cannot be used as it has no MAC address#033[00m Nov 28 05:03:35 localhost nova_compute[279673]: 2025-11-28 10:03:35.144 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:35 localhost kernel: device tapd925f0b8-f1 entered promiscuous mode Nov 28 05:03:35 localhost nova_compute[279673]: 2025-11-28 10:03:35.149 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 
05:03:35 localhost NetworkManager[5967]: [1764324215.1535] manager: (tapd925f0b8-f1): new Generic device (/org/freedesktop/NetworkManager/Devices/37) Nov 28 05:03:35 localhost ovn_controller[152322]: 2025-11-28T10:03:35Z|00205|binding|INFO|Claiming lport d925f0b8-f14d-42ee-9e29-a163823c50b3 for this chassis. Nov 28 05:03:35 localhost ovn_controller[152322]: 2025-11-28T10:03:35Z|00206|binding|INFO|d925f0b8-f14d-42ee-9e29-a163823c50b3: Claiming unknown Nov 28 05:03:35 localhost systemd-udevd[314063]: Network interface NamePolicy= disabled on kernel command line. Nov 28 05:03:35 localhost nova_compute[279673]: 2025-11-28 10:03:35.158 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:35 localhost ovn_controller[152322]: 2025-11-28T10:03:35Z|00207|binding|INFO|Setting lport d925f0b8-f14d-42ee-9e29-a163823c50b3 ovn-installed in OVS Nov 28 05:03:35 localhost ovn_controller[152322]: 2025-11-28T10:03:35Z|00208|binding|INFO|Setting lport d925f0b8-f14d-42ee-9e29-a163823c50b3 up in Southbound Nov 28 05:03:35 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:35.168 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 
'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d925f0b8-f14d-42ee-9e29-a163823c50b3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:03:35 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:35.172 158130 INFO neutron.agent.ovn.metadata.agent [-] Port d925f0b8-f14d-42ee-9e29-a163823c50b3 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis#033[00m Nov 28 05:03:35 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:35.174 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:03:35 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:35.175 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[0e22c034-3058-4fd9-a75c-9b7d4ccb602f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:03:35 localhost journal[227875]: ethtool ioctl error on tapd925f0b8-f1: No such device Nov 28 05:03:35 localhost nova_compute[279673]: 2025-11-28 10:03:35.188 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:35 localhost journal[227875]: ethtool ioctl error on tapd925f0b8-f1: No such device Nov 28 05:03:35 localhost journal[227875]: ethtool ioctl error on tapd925f0b8-f1: No such device Nov 28 05:03:35 localhost journal[227875]: ethtool ioctl error on 
tapd925f0b8-f1: No such device Nov 28 05:03:35 localhost journal[227875]: ethtool ioctl error on tapd925f0b8-f1: No such device Nov 28 05:03:35 localhost journal[227875]: ethtool ioctl error on tapd925f0b8-f1: No such device Nov 28 05:03:35 localhost journal[227875]: ethtool ioctl error on tapd925f0b8-f1: No such device Nov 28 05:03:35 localhost journal[227875]: ethtool ioctl error on tapd925f0b8-f1: No such device Nov 28 05:03:35 localhost nova_compute[279673]: 2025-11-28 10:03:35.247 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:35 localhost nova_compute[279673]: 2025-11-28 10:03:35.283 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:35 localhost nova_compute[279673]: 2025-11-28 10:03:35.298 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:35 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:35.356 261084 INFO neutron.agent.dhcp.agent [None req-768664a4-c7a9-45d3-a856-6fd2cac9670f - - - - - -] DHCP configuration for ports {'037df326-df58-4d28-8dbc-25f9c11ef071'} is completed#033[00m Nov 28 05:03:35 localhost nova_compute[279673]: 2025-11-28 10:03:35.693 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:35 localhost neutron_sriov_agent[254147]: 2025-11-28 10:03:35.850 2 INFO neutron.agent.securitygroups_rpc [None req-57c1bb59-2e0b-4157-baad-e850337ecf12 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:03:36 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e131 do_prune osdmap full prune enabled Nov 28 05:03:36 
localhost podman[314139]: Nov 28 05:03:36 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e132 e132: 6 total, 6 up, 6 in Nov 28 05:03:36 localhost podman[314139]: 2025-11-28 10:03:36.217656603 +0000 UTC m=+0.092817231 container create 905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS) Nov 28 05:03:36 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e132: 6 total, 6 up, 6 in Nov 28 05:03:36 localhost systemd[1]: Started libpod-conmon-905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1.scope. Nov 28 05:03:36 localhost podman[314139]: 2025-11-28 10:03:36.174742493 +0000 UTC m=+0.049903191 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:03:36 localhost systemd[1]: Started libcrun container. 
Nov 28 05:03:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f97123adbd9c48a0f6f44f905a033a1731be1c1882de0ddc59c8b81ddaf0fc55/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:03:36 localhost podman[314139]: 2025-11-28 10:03:36.320213791 +0000 UTC m=+0.195374389 container init 905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 28 05:03:36 localhost podman[314139]: 2025-11-28 10:03:36.328248752 +0000 UTC m=+0.203409350 container start 905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 28 05:03:36 localhost dnsmasq[314158]: started, version 2.85 cachesize 150 Nov 28 05:03:36 localhost dnsmasq[314158]: DNS service limited to local subnets Nov 28 05:03:36 localhost dnsmasq[314158]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:03:36 localhost dnsmasq[314158]: warning: no upstream servers 
configured Nov 28 05:03:36 localhost dnsmasq[314158]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:03:36 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:36.373 261084 INFO neutron.agent.dhcp.agent [None req-ebecf48a-9039-4715-a56f-c608e77dba56 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:33Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=4ca17edf-539e-4eb8-8272-2ef525cc9f90, ip_allocation=immediate, mac_address=fa:16:3e:21:93:1a, name=tempest-NetworksTestDHCPv6-642344273, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=10, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['04c13047-e932-4b16-a523-563db6998f3b'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:33Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], standard_attr_id=1376, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:34Z on network 
8642adde-54ae-4fc2-b997-bf1962c6c7f1#033[00m Nov 28 05:03:36 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:36.526 261084 INFO neutron.agent.dhcp.agent [None req-ff48d7c0-99db-42ce-81db-4ede61c064b7 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f'} is completed#033[00m Nov 28 05:03:36 localhost dnsmasq[314158]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 1 addresses Nov 28 05:03:36 localhost podman[314190]: 2025-11-28 10:03:36.557781989 +0000 UTC m=+0.058645971 container kill 905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Nov 28 05:03:36 localhost dnsmasq[313691]: read /var/lib/neutron/dhcp/3f9a6f97-9109-45cc-b3d8-12edbd83a346/addn_hosts - 0 addresses Nov 28 05:03:36 localhost dnsmasq-dhcp[313691]: read /var/lib/neutron/dhcp/3f9a6f97-9109-45cc-b3d8-12edbd83a346/host Nov 28 05:03:36 localhost podman[314204]: 2025-11-28 10:03:36.657207757 +0000 UTC m=+0.090300578 container kill 2a094995f99ba4fc2f02a5aea73dae129d59818e0f4d45c7ceb22355b444231a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f9a6f97-9109-45cc-b3d8-12edbd83a346, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251125) Nov 28 05:03:36 localhost dnsmasq-dhcp[313691]: read /var/lib/neutron/dhcp/3f9a6f97-9109-45cc-b3d8-12edbd83a346/opts Nov 28 05:03:36 localhost ovn_controller[152322]: 2025-11-28T10:03:36Z|00209|binding|INFO|Releasing lport 3e54b4c6-8462-4da0-9951-b922d57575cf from this chassis (sb_readonly=0) Nov 28 05:03:36 localhost kernel: device tap3e54b4c6-84 left promiscuous mode Nov 28 05:03:36 localhost nova_compute[279673]: 2025-11-28 10:03:36.882 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:36 localhost ovn_controller[152322]: 2025-11-28T10:03:36Z|00210|binding|INFO|Setting lport 3e54b4c6-8462-4da0-9951-b922d57575cf down in Southbound Nov 28 05:03:36 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:36.897 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-3f9a6f97-9109-45cc-b3d8-12edbd83a346', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f9a6f97-9109-45cc-b3d8-12edbd83a346', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c66e098e4fb4a349dc2bb4293454135', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], 
datapath=34d43fdc-65ac-42a6-8e26-177d541f3791, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3e54b4c6-8462-4da0-9951-b922d57575cf) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:03:36 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:36.898 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 3e54b4c6-8462-4da0-9951-b922d57575cf in datapath 3f9a6f97-9109-45cc-b3d8-12edbd83a346 unbound from our chassis#033[00m Nov 28 05:03:36 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:36.899 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3f9a6f97-9109-45cc-b3d8-12edbd83a346, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:03:36 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:36.900 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[1e7503d9-b7a5-417e-a811-868b4ff95391]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:03:36 localhost nova_compute[279673]: 2025-11-28 10:03:36.912 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:36 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:36.928 261084 INFO neutron.agent.dhcp.agent [None req-f77bac8d-9df6-465e-ab8a-7da88b9a6892 - - - - - -] DHCP configuration for ports {'4ca17edf-539e-4eb8-8272-2ef525cc9f90'} is completed#033[00m Nov 28 05:03:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 05:03:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. 
Nov 28 05:03:37 localhost podman[314235]: 2025-11-28 10:03:37.082804682 +0000 UTC m=+0.067419122 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 05:03:37 localhost podman[314235]: 2025-11-28 10:03:37.163187236 +0000 UTC m=+0.147801706 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 05:03:37 localhost podman[314236]: 2025-11-28 10:03:37.11444729 +0000 UTC m=+0.094121629 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 28 05:03:37 localhost podman[314236]: 2025-11-28 10:03:37.199640521 +0000 UTC m=+0.179314830 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:03:37 localhost podman[314290]: 2025-11-28 10:03:37.202891433 +0000 UTC m=+0.057105836 container kill 905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:03:37 localhost dnsmasq[314158]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:03:37 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. Nov 28 05:03:37 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e132 do_prune osdmap full prune enabled Nov 28 05:03:37 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 05:03:37 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e133 e133: 6 total, 6 up, 6 in Nov 28 05:03:37 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e133: 6 total, 6 up, 6 in Nov 28 05:03:37 localhost neutron_sriov_agent[254147]: 2025-11-28 10:03:37.424 2 INFO neutron.agent.securitygroups_rpc [None req-69a76c47-354e-40d8-9c7a-4acd924cbac4 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:03:37 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:37.753 261084 INFO neutron.agent.dhcp.agent [None req-66785779-8803-4e42-9b78-bde749875261 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', 'd925f0b8-f14d-42ee-9e29-a163823c50b3'} is completed#033[00m Nov 28 05:03:37 localhost nova_compute[279673]: 2025-11-28 10:03:37.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:03:37 localhost systemd[1]: tmp-crun.ckqDjF.mount: Deactivated successfully. 
Nov 28 05:03:37 localhost dnsmasq[314158]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 1 addresses Nov 28 05:03:37 localhost podman[314333]: 2025-11-28 10:03:37.844065136 +0000 UTC m=+0.070976164 container kill 905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Nov 28 05:03:37 localhost dnsmasq[313457]: read /var/lib/neutron/dhcp/54d19915-3dc0-4577-b573-72119a0c141d/addn_hosts - 0 addresses Nov 28 05:03:37 localhost dnsmasq-dhcp[313457]: read /var/lib/neutron/dhcp/54d19915-3dc0-4577-b573-72119a0c141d/host Nov 28 05:03:37 localhost dnsmasq-dhcp[313457]: read /var/lib/neutron/dhcp/54d19915-3dc0-4577-b573-72119a0c141d/opts Nov 28 05:03:37 localhost podman[314362]: 2025-11-28 10:03:37.93805734 +0000 UTC m=+0.067218857 container kill 7781e9a0f25490b6569a0afa5b659e21b181524239b45537a5ee198c7bd75ad5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-54d19915-3dc0-4577-b573-72119a0c141d, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3) Nov 28 05:03:38 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:38.064 261084 INFO neutron.agent.dhcp.agent [None req-b537f614-6618-40e2-b0aa-deb70b449541 - - - - - -] 
Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:37Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=f0e40f52-9209-4283-a23f-088678cd6c9e, ip_allocation=immediate, mac_address=fa:16:3e:3f:c6:b2, name=tempest-NetworksTestDHCPv6-1521665284, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=12, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['157c2b3e-45c5-4baa-a8ab-f40581745311'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:36Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], standard_attr_id=1380, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:37Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1#033[00m Nov 28 05:03:38 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:38.184 261084 INFO neutron.agent.dhcp.agent [None req-12020dff-2454-40f9-9ca8-d08b3e6c59cb - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', 'd925f0b8-f14d-42ee-9e29-a163823c50b3', 
'f0e40f52-9209-4283-a23f-088678cd6c9e'} is completed#033[00m Nov 28 05:03:38 localhost neutron_sriov_agent[254147]: 2025-11-28 10:03:38.187 2 INFO neutron.agent.securitygroups_rpc [None req-eef6fd1f-c62c-4fce-a6ed-f73dc25767c9 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:03:38 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e133 do_prune osdmap full prune enabled Nov 28 05:03:38 localhost ovn_controller[152322]: 2025-11-28T10:03:38Z|00211|binding|INFO|Releasing lport 79491b70-fe82-4673-a612-1252578cdd84 from this chassis (sb_readonly=0) Nov 28 05:03:38 localhost ovn_controller[152322]: 2025-11-28T10:03:38Z|00212|binding|INFO|Setting lport 79491b70-fe82-4673-a612-1252578cdd84 down in Southbound Nov 28 05:03:38 localhost nova_compute[279673]: 2025-11-28 10:03:38.252 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:38 localhost kernel: device tap79491b70-fe left promiscuous mode Nov 28 05:03:38 localhost nova_compute[279673]: 2025-11-28 10:03:38.275 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:38 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e134 e134: 6 total, 6 up, 6 in Nov 28 05:03:38 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e134: 6 total, 6 up, 6 in Nov 28 05:03:38 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:38.286 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, 
parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-54d19915-3dc0-4577-b573-72119a0c141d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-54d19915-3dc0-4577-b573-72119a0c141d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c66e098e4fb4a349dc2bb4293454135', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=db21ade0-fc80-4871-bcd6-f4301708978d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=79491b70-fe82-4673-a612-1252578cdd84) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:03:38 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:38.288 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 79491b70-fe82-4673-a612-1252578cdd84 in datapath 54d19915-3dc0-4577-b573-72119a0c141d unbound from our chassis#033[00m Nov 28 05:03:38 localhost ovn_controller[152322]: 2025-11-28T10:03:38Z|00213|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:03:38 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:38.292 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 54d19915-3dc0-4577-b573-72119a0c141d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:03:38 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:38.293 158233 DEBUG oslo.privsep.daemon [-] privsep: 
reply[097599d1-84a4-4bb0-b670-0ab8deb0b42f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:03:38 localhost nova_compute[279673]: 2025-11-28 10:03:38.334 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:38 localhost podman[314412]: 2025-11-28 10:03:38.345798362 +0000 UTC m=+0.075534425 container kill 905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:03:38 localhost dnsmasq[314158]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 1 addresses Nov 28 05:03:38 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:38.611 261084 INFO neutron.agent.dhcp.agent [None req-20f3e84f-edd5-476d-87eb-dc91a7444dcc - - - - - -] DHCP configuration for ports {'f0e40f52-9209-4283-a23f-088678cd6c9e'} is completed#033[00m Nov 28 05:03:38 localhost nova_compute[279673]: 2025-11-28 10:03:38.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:03:38 localhost nova_compute[279673]: 2025-11-28 10:03:38.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 28 05:03:38 localhost dnsmasq[314158]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:03:38 localhost podman[314450]: 2025-11-28 10:03:38.774693282 +0000 UTC m=+0.073169978 container kill 905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Nov 28 05:03:39 localhost dnsmasq[313691]: exiting on receipt of SIGTERM Nov 28 05:03:39 localhost podman[314489]: 2025-11-28 10:03:39.21814729 +0000 UTC m=+0.069296947 container kill 2a094995f99ba4fc2f02a5aea73dae129d59818e0f4d45c7ceb22355b444231a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f9a6f97-9109-45cc-b3d8-12edbd83a346, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Nov 28 05:03:39 localhost systemd[1]: libpod-2a094995f99ba4fc2f02a5aea73dae129d59818e0f4d45c7ceb22355b444231a.scope: Deactivated successfully. 
Nov 28 05:03:39 localhost podman[314518]: 2025-11-28 10:03:39.29495695 +0000 UTC m=+0.054581124 container died 2a094995f99ba4fc2f02a5aea73dae129d59818e0f4d45c7ceb22355b444231a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f9a6f97-9109-45cc-b3d8-12edbd83a346, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:03:39 localhost systemd[1]: tmp-crun.LQuSrS.mount: Deactivated successfully. Nov 28 05:03:39 localhost systemd[1]: var-lib-containers-storage-overlay-c6f48a9173b9ce2ea6c707fe91dbf14c53a35e5091245aaa244be183a6c7890a-merged.mount: Deactivated successfully. Nov 28 05:03:39 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2a094995f99ba4fc2f02a5aea73dae129d59818e0f4d45c7ceb22355b444231a-userdata-shm.mount: Deactivated successfully. Nov 28 05:03:39 localhost podman[314518]: 2025-11-28 10:03:39.393990047 +0000 UTC m=+0.153614171 container remove 2a094995f99ba4fc2f02a5aea73dae129d59818e0f4d45c7ceb22355b444231a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f9a6f97-9109-45cc-b3d8-12edbd83a346, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 28 05:03:39 localhost systemd[1]: libpod-conmon-2a094995f99ba4fc2f02a5aea73dae129d59818e0f4d45c7ceb22355b444231a.scope: Deactivated successfully. 
Nov 28 05:03:39 localhost dnsmasq[314158]: exiting on receipt of SIGTERM Nov 28 05:03:39 localhost systemd[1]: libpod-905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1.scope: Deactivated successfully. Nov 28 05:03:39 localhost podman[314538]: 2025-11-28 10:03:39.440684716 +0000 UTC m=+0.160261023 container kill 905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 28 05:03:39 localhost systemd[1]: tmp-crun.PvFtZt.mount: Deactivated successfully. Nov 28 05:03:39 localhost dnsmasq[313457]: exiting on receipt of SIGTERM Nov 28 05:03:39 localhost podman[314573]: 2025-11-28 10:03:39.515558831 +0000 UTC m=+0.112121723 container kill 7781e9a0f25490b6569a0afa5b659e21b181524239b45537a5ee198c7bd75ad5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-54d19915-3dc0-4577-b573-72119a0c141d, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Nov 28 05:03:39 localhost systemd[1]: libpod-7781e9a0f25490b6569a0afa5b659e21b181524239b45537a5ee198c7bd75ad5.scope: Deactivated successfully. 
Nov 28 05:03:39 localhost podman[314587]: 2025-11-28 10:03:39.536390098 +0000 UTC m=+0.083232135 container died 905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:03:39 localhost podman[314587]: 2025-11-28 10:03:39.571606817 +0000 UTC m=+0.118448804 container cleanup 905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Nov 28 05:03:39 localhost systemd[1]: libpod-conmon-905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1.scope: Deactivated successfully. 
Nov 28 05:03:39 localhost podman[314594]: 2025-11-28 10:03:39.614884437 +0000 UTC m=+0.144584873 container remove 905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) Nov 28 05:03:39 localhost ovn_controller[152322]: 2025-11-28T10:03:39Z|00214|binding|INFO|Releasing lport d925f0b8-f14d-42ee-9e29-a163823c50b3 from this chassis (sb_readonly=0) Nov 28 05:03:39 localhost ovn_controller[152322]: 2025-11-28T10:03:39Z|00215|binding|INFO|Setting lport d925f0b8-f14d-42ee-9e29-a163823c50b3 down in Southbound Nov 28 05:03:39 localhost kernel: device tapd925f0b8-f1 left promiscuous mode Nov 28 05:03:39 localhost nova_compute[279673]: 2025-11-28 10:03:39.666 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:39 localhost podman[314612]: 2025-11-28 10:03:39.675569756 +0000 UTC m=+0.140021333 container died 7781e9a0f25490b6569a0afa5b659e21b181524239b45537a5ee198c7bd75ad5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-54d19915-3dc0-4577-b573-72119a0c141d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:03:39 localhost 
ovn_metadata_agent[158125]: 2025-11-28 10:03:39.677 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d925f0b8-f14d-42ee-9e29-a163823c50b3) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:03:39 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:39.681 261084 INFO neutron.agent.dhcp.agent [None req-90146684-8217-426a-92a0-b7458f59b967 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:03:39 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:39.683 261084 INFO neutron.agent.dhcp.agent [None req-90146684-8217-426a-92a0-b7458f59b967 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:03:39 localhost 
nova_compute[279673]: 2025-11-28 10:03:39.686 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:39 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:39.685 158130 INFO neutron.agent.ovn.metadata.agent [-] Port d925f0b8-f14d-42ee-9e29-a163823c50b3 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis#033[00m Nov 28 05:03:39 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:39.692 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:03:39 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:39.694 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[2dea9726-06d2-45db-be79-507dd932009c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:03:39 localhost nova_compute[279673]: 2025-11-28 10:03:39.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:03:39 localhost podman[314612]: 2025-11-28 10:03:39.773830562 +0000 UTC m=+0.238282159 container remove 7781e9a0f25490b6569a0afa5b659e21b181524239b45537a5ee198c7bd75ad5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-54d19915-3dc0-4577-b573-72119a0c141d, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, 
org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Nov 28 05:03:39 localhost systemd[1]: libpod-conmon-7781e9a0f25490b6569a0afa5b659e21b181524239b45537a5ee198c7bd75ad5.scope: Deactivated successfully. Nov 28 05:03:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:03:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e134 do_prune osdmap full prune enabled Nov 28 05:03:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e135 e135: 6 total, 6 up, 6 in Nov 28 05:03:39 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e135: 6 total, 6 up, 6 in Nov 28 05:03:40 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:40.037 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:03:40 localhost podman[238687]: time="2025-11-28T10:03:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:03:40 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:40.089 261084 INFO neutron.agent.dhcp.agent [None req-17c930e4-ed87-497a-a8db-dbf6503df94e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:03:40 localhost podman[238687]: @ - - [28/Nov/2025:10:03:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1" Nov 28 05:03:40 localhost podman[238687]: @ - - [28/Nov/2025:10:03:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19254 "" "Go-http-client/1.1" Nov 28 05:03:40 localhost nova_compute[279673]: 2025-11-28 10:03:40.302 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 
05:03:40 localhost systemd[1]: var-lib-containers-storage-overlay-f97123adbd9c48a0f6f44f905a033a1731be1c1882de0ddc59c8b81ddaf0fc55-merged.mount: Deactivated successfully. Nov 28 05:03:40 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-905fba54955fde208144e2d2130c38bed8883e2aca828b78f6d235b5b1a02bf1-userdata-shm.mount: Deactivated successfully. Nov 28 05:03:40 localhost systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully. Nov 28 05:03:40 localhost systemd[1]: run-netns-qdhcp\x2d3f9a6f97\x2d9109\x2d45cc\x2db3d8\x2d12edbd83a346.mount: Deactivated successfully. Nov 28 05:03:40 localhost systemd[1]: var-lib-containers-storage-overlay-7c2133ceac09aa4671885348478e6b1807140f8a826fd1cc3877c75256248e37-merged.mount: Deactivated successfully. Nov 28 05:03:40 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7781e9a0f25490b6569a0afa5b659e21b181524239b45537a5ee198c7bd75ad5-userdata-shm.mount: Deactivated successfully. Nov 28 05:03:40 localhost systemd[1]: run-netns-qdhcp\x2d54d19915\x2d3dc0\x2d4577\x2db573\x2d72119a0c141d.mount: Deactivated successfully. 
Nov 28 05:03:40 localhost neutron_sriov_agent[254147]: 2025-11-28 10:03:40.465 2 INFO neutron.agent.securitygroups_rpc [None req-3119b771-1e00-43fd-8d05-15e8a1d2219b 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:03:40 localhost nova_compute[279673]: 2025-11-28 10:03:40.534 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:40 localhost nova_compute[279673]: 2025-11-28 10:03:40.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:03:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:41.149 261084 INFO neutron.agent.linux.ip_lib [None req-2dcca3c0-71e9-44b5-9cb0-377311810ae2 - - - - - -] Device tapda5b2ad4-01 cannot be used as it has no MAC address#033[00m Nov 28 05:03:41 localhost nova_compute[279673]: 2025-11-28 10:03:41.200 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:41 localhost kernel: device tapda5b2ad4-01 entered promiscuous mode Nov 28 05:03:41 localhost nova_compute[279673]: 2025-11-28 10:03:41.208 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:41 localhost NetworkManager[5967]: [1764324221.2112] manager: (tapda5b2ad4-01): new Generic device (/org/freedesktop/NetworkManager/Devices/38) Nov 28 05:03:41 localhost ovn_controller[152322]: 2025-11-28T10:03:41Z|00216|binding|INFO|Claiming lport da5b2ad4-01e7-4e6c-9145-eaa3075e76fa for this chassis. 
Nov 28 05:03:41 localhost ovn_controller[152322]: 2025-11-28T10:03:41Z|00217|binding|INFO|da5b2ad4-01e7-4e6c-9145-eaa3075e76fa: Claiming unknown Nov 28 05:03:41 localhost systemd-udevd[314649]: Network interface NamePolicy= disabled on kernel command line. Nov 28 05:03:41 localhost ovn_controller[152322]: 2025-11-28T10:03:41Z|00218|binding|INFO|Setting lport da5b2ad4-01e7-4e6c-9145-eaa3075e76fa ovn-installed in OVS Nov 28 05:03:41 localhost nova_compute[279673]: 2025-11-28 10:03:41.218 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:41 localhost ovn_controller[152322]: 2025-11-28T10:03:41Z|00219|binding|INFO|Setting lport da5b2ad4-01e7-4e6c-9145-eaa3075e76fa up in Southbound Nov 28 05:03:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:41.226 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], 
tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=da5b2ad4-01e7-4e6c-9145-eaa3075e76fa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:03:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:41.227 158130 INFO neutron.agent.ovn.metadata.agent [-] Port da5b2ad4-01e7-4e6c-9145-eaa3075e76fa in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis#033[00m Nov 28 05:03:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:41.228 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:03:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:41.229 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[1e763f94-e2e1-4b8d-8b11-c7915ee51443]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:03:41 localhost nova_compute[279673]: 2025-11-28 10:03:41.239 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:41 localhost nova_compute[279673]: 2025-11-28 10:03:41.247 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:41 localhost nova_compute[279673]: 2025-11-28 10:03:41.286 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:41 localhost nova_compute[279673]: 2025-11-28 10:03:41.327 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:41 localhost 
neutron_dhcp_agent[261080]: 2025-11-28 10:03:41.555 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:03:41 localhost neutron_sriov_agent[254147]: 2025-11-28 10:03:41.646 2 INFO neutron.agent.securitygroups_rpc [None req-1b0c80b5-e3aa-421f-ac6e-3cbc3bd6a095 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:03:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:41.722 261084 INFO neutron.agent.linux.ip_lib [None req-fb7d8aeb-8c15-4a31-a128-a92fe0323a66 - - - - - -] Device tap0f64cccb-f5 cannot be used as it has no MAC address#033[00m Nov 28 05:03:41 localhost nova_compute[279673]: 2025-11-28 10:03:41.755 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:41 localhost kernel: device tap0f64cccb-f5 entered promiscuous mode Nov 28 05:03:41 localhost systemd-udevd[314651]: Network interface NamePolicy= disabled on kernel command line. Nov 28 05:03:41 localhost NetworkManager[5967]: [1764324221.7627] manager: (tap0f64cccb-f5): new Generic device (/org/freedesktop/NetworkManager/Devices/39) Nov 28 05:03:41 localhost ovn_controller[152322]: 2025-11-28T10:03:41Z|00220|binding|INFO|Claiming lport 0f64cccb-f58c-4c85-8d1d-c5dbcdc8de50 for this chassis. 
Nov 28 05:03:41 localhost ovn_controller[152322]: 2025-11-28T10:03:41Z|00221|binding|INFO|0f64cccb-f58c-4c85-8d1d-c5dbcdc8de50: Claiming unknown Nov 28 05:03:41 localhost nova_compute[279673]: 2025-11-28 10:03:41.764 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:41 localhost nova_compute[279673]: 2025-11-28 10:03:41.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:03:41 localhost journal[227875]: ethtool ioctl error on tap0f64cccb-f5: No such device Nov 28 05:03:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:41.790 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-eec9ef76-e9ff-47a5-b8c7-9b69e3732166', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eec9ef76-e9ff-47a5-b8c7-9b69e3732166', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '50a1392ce96c4024bcd36a3df403ca29', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76364ea1-fb09-4d6a-aaea-0dba21691fb4, chassis=[], 
tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=0f64cccb-f58c-4c85-8d1d-c5dbcdc8de50) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:03:41 localhost nova_compute[279673]: 2025-11-28 10:03:41.792 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:41.794 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 0f64cccb-f58c-4c85-8d1d-c5dbcdc8de50 in datapath eec9ef76-e9ff-47a5-b8c7-9b69e3732166 bound to our chassis#033[00m Nov 28 05:03:41 localhost ovn_controller[152322]: 2025-11-28T10:03:41Z|00222|binding|INFO|Setting lport 0f64cccb-f58c-4c85-8d1d-c5dbcdc8de50 ovn-installed in OVS Nov 28 05:03:41 localhost ovn_controller[152322]: 2025-11-28T10:03:41Z|00223|binding|INFO|Setting lport 0f64cccb-f58c-4c85-8d1d-c5dbcdc8de50 up in Southbound Nov 28 05:03:41 localhost journal[227875]: ethtool ioctl error on tap0f64cccb-f5: No such device Nov 28 05:03:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:41.796 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network eec9ef76-e9ff-47a5-b8c7-9b69e3732166 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:03:41 localhost nova_compute[279673]: 2025-11-28 10:03:41.796 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:41 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:41.797 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[4371a29c-0d29-40cb-be6f-f78b07a0fef3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:03:41 localhost journal[227875]: 
ethtool ioctl error on tap0f64cccb-f5: No such device Nov 28 05:03:41 localhost journal[227875]: ethtool ioctl error on tap0f64cccb-f5: No such device Nov 28 05:03:41 localhost journal[227875]: ethtool ioctl error on tap0f64cccb-f5: No such device Nov 28 05:03:41 localhost journal[227875]: ethtool ioctl error on tap0f64cccb-f5: No such device Nov 28 05:03:41 localhost journal[227875]: ethtool ioctl error on tap0f64cccb-f5: No such device Nov 28 05:03:41 localhost journal[227875]: ethtool ioctl error on tap0f64cccb-f5: No such device Nov 28 05:03:41 localhost nova_compute[279673]: 2025-11-28 10:03:41.856 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:41 localhost nova_compute[279673]: 2025-11-28 10:03:41.895 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:42 localhost neutron_sriov_agent[254147]: 2025-11-28 10:03:42.090 2 INFO neutron.agent.securitygroups_rpc [None req-e8cd93f0-eb23-4135-97c1-cd5750f74f24 b286c38dfd0e4889806c62c7b4b9ee98 50a1392ce96c4024bcd36a3df403ca29 - - default default] Security group member updated ['b372bb98-860c-4571-936b-bf08ecbd647d']#033[00m Nov 28 05:03:42 localhost podman[314747]: Nov 28 05:03:42 localhost podman[314747]: 2025-11-28 10:03:42.326386632 +0000 UTC m=+0.114090840 container create a4028ba968286b01611c690ab3feb5a8c69335b6561e7e724f885feb57711aa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3) 
Nov 28 05:03:42 localhost podman[314747]: 2025-11-28 10:03:42.281011252 +0000 UTC m=+0.068715480 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:03:42 localhost systemd[1]: Started libpod-conmon-a4028ba968286b01611c690ab3feb5a8c69335b6561e7e724f885feb57711aa6.scope. Nov 28 05:03:42 localhost systemd[1]: tmp-crun.II8WIY.mount: Deactivated successfully. Nov 28 05:03:42 localhost systemd[1]: Started libcrun container. Nov 28 05:03:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6fd4128ec09b58571ce595b339c06d55fb46f8875cc1866ebdca77487d42fde8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:03:42 localhost podman[314747]: 2025-11-28 10:03:42.435365245 +0000 UTC m=+0.223069433 container init a4028ba968286b01611c690ab3feb5a8c69335b6561e7e724f885feb57711aa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125) Nov 28 05:03:42 localhost podman[314747]: 2025-11-28 10:03:42.445687431 +0000 UTC m=+0.233391619 container start a4028ba968286b01611c690ab3feb5a8c69335b6561e7e724f885feb57711aa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:03:42 localhost dnsmasq[314776]: started, version 2.85 cachesize 150 Nov 28 05:03:42 localhost dnsmasq[314776]: DNS service limited to local subnets Nov 28 05:03:42 localhost dnsmasq[314776]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:03:42 localhost dnsmasq[314776]: warning: no upstream servers configured Nov 28 05:03:42 localhost dnsmasq-dhcp[314776]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 28 05:03:42 localhost dnsmasq[314776]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:03:42 localhost dnsmasq-dhcp[314776]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:03:42 localhost dnsmasq-dhcp[314776]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:03:42 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:42.608 261084 INFO neutron.agent.dhcp.agent [None req-ccbb5d18-c057-4a29-bd70-cab972c8980e - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f'} is completed#033[00m Nov 28 05:03:42 localhost systemd[1]: tmp-crun.T2eI5r.mount: Deactivated successfully. 
Nov 28 05:03:42 localhost dnsmasq[314776]: exiting on receipt of SIGTERM Nov 28 05:03:42 localhost podman[314803]: 2025-11-28 10:03:42.876869256 +0000 UTC m=+0.130778829 container kill a4028ba968286b01611c690ab3feb5a8c69335b6561e7e724f885feb57711aa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:03:42 localhost systemd[1]: libpod-a4028ba968286b01611c690ab3feb5a8c69335b6561e7e724f885feb57711aa6.scope: Deactivated successfully. Nov 28 05:03:42 localhost podman[314826]: Nov 28 05:03:42 localhost podman[314845]: 2025-11-28 10:03:42.948508078 +0000 UTC m=+0.048250893 container died a4028ba968286b01611c690ab3feb5a8c69335b6561e7e724f885feb57711aa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Nov 28 05:03:42 localhost podman[314826]: 2025-11-28 10:03:42.890181277 +0000 UTC m=+0.053768742 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:03:42 localhost podman[314845]: 2025-11-28 10:03:42.989972997 +0000 UTC m=+0.089715802 container remove a4028ba968286b01611c690ab3feb5a8c69335b6561e7e724f885feb57711aa6 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 05:03:43 localhost systemd[1]: libpod-conmon-a4028ba968286b01611c690ab3feb5a8c69335b6561e7e724f885feb57711aa6.scope: Deactivated successfully.
Nov 28 05:03:43 localhost podman[314826]: 2025-11-28 10:03:43.003236717 +0000 UTC m=+0.166824142 container create 62fbe091ee5e7dbf8f31f1146b57d941dc810a4092db9e381f05912d5ba30a2c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eec9ef76-e9ff-47a5-b8c7-9b69e3732166, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 05:03:43 localhost ovn_controller[152322]: 2025-11-28T10:03:43Z|00224|binding|INFO|Releasing lport da5b2ad4-01e7-4e6c-9145-eaa3075e76fa from this chassis (sb_readonly=0)
Nov 28 05:03:43 localhost kernel: device tapda5b2ad4-01 left promiscuous mode
Nov 28 05:03:43 localhost ovn_controller[152322]: 2025-11-28T10:03:43Z|00225|binding|INFO|Setting lport da5b2ad4-01e7-4e6c-9145-eaa3075e76fa down in Southbound
Nov 28 05:03:43 localhost nova_compute[279673]: 2025-11-28 10:03:43.014 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:03:43 localhost nova_compute[279673]: 2025-11-28 10:03:43.029 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:03:43 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:43.038 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=da5b2ad4-01e7-4e6c-9145-eaa3075e76fa) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 05:03:43 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:43.040 158130 INFO neutron.agent.ovn.metadata.agent [-] Port da5b2ad4-01e7-4e6c-9145-eaa3075e76fa in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis#033[00m
Nov 28 05:03:43 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:43.042 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 28 05:03:43 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:43.043 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[52f61220-15b7-455d-a8c1-32c9382ab11e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 05:03:43 localhost systemd[1]: Started libpod-conmon-62fbe091ee5e7dbf8f31f1146b57d941dc810a4092db9e381f05912d5ba30a2c.scope.
Nov 28 05:03:43 localhost systemd[1]: Started libcrun container.
Nov 28 05:03:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/721a99abb32a0f90fdd1a1d8fc012d3e9fa0457f65acb87fb03055f5486bc057/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 05:03:43 localhost podman[314826]: 2025-11-28 10:03:43.097211549 +0000 UTC m=+0.260798944 container init 62fbe091ee5e7dbf8f31f1146b57d941dc810a4092db9e381f05912d5ba30a2c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eec9ef76-e9ff-47a5-b8c7-9b69e3732166, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 28 05:03:43 localhost podman[314826]: 2025-11-28 10:03:43.104803577 +0000 UTC m=+0.268390982 container start 62fbe091ee5e7dbf8f31f1146b57d941dc810a4092db9e381f05912d5ba30a2c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eec9ef76-e9ff-47a5-b8c7-9b69e3732166, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 05:03:43 localhost dnsmasq[314875]: started, version 2.85 cachesize 150
Nov 28 05:03:43 localhost dnsmasq[314875]: DNS service limited to local subnets
Nov 28 05:03:43 localhost dnsmasq[314875]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 05:03:43 localhost dnsmasq[314875]: warning: no upstream servers configured
Nov 28 05:03:43 localhost dnsmasq-dhcp[314875]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 05:03:43 localhost dnsmasq[314875]: read /var/lib/neutron/dhcp/eec9ef76-e9ff-47a5-b8c7-9b69e3732166/addn_hosts - 0 addresses
Nov 28 05:03:43 localhost dnsmasq-dhcp[314875]: read /var/lib/neutron/dhcp/eec9ef76-e9ff-47a5-b8c7-9b69e3732166/host
Nov 28 05:03:43 localhost dnsmasq-dhcp[314875]: read /var/lib/neutron/dhcp/eec9ef76-e9ff-47a5-b8c7-9b69e3732166/opts
Nov 28 05:03:43 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:43.175 261084 INFO neutron.agent.dhcp.agent [None req-fb7d8aeb-8c15-4a31-a128-a92fe0323a66 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:41Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=12365673-4af9-4ad5-b481-8ba59b4a12f4, ip_allocation=immediate, mac_address=fa:16:3e:59:37:09, name=tempest-RoutersIpV6Test-279049389, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:38Z, description=, dns_domain=, id=eec9ef76-e9ff-47a5-b8c7-9b69e3732166, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1460277724, port_security_enabled=True, project_id=50a1392ce96c4024bcd36a3df403ca29, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=54776, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1381, status=ACTIVE, subnets=['3795ae5d-6bce-4edc-bccb-dff09b0cd314'], tags=[], tenant_id=50a1392ce96c4024bcd36a3df403ca29, updated_at=2025-11-28T10:03:40Z, vlan_transparent=None, network_id=eec9ef76-e9ff-47a5-b8c7-9b69e3732166, port_security_enabled=True, project_id=50a1392ce96c4024bcd36a3df403ca29, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['b372bb98-860c-4571-936b-bf08ecbd647d'], standard_attr_id=1406, status=DOWN, tags=[], tenant_id=50a1392ce96c4024bcd36a3df403ca29, updated_at=2025-11-28T10:03:41Z on network eec9ef76-e9ff-47a5-b8c7-9b69e3732166#033[00m
Nov 28 05:03:43 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:43.267 261084 INFO neutron.agent.dhcp.agent [None req-f580b92e-ead0-4dcb-b599-a9d76b513677 - - - - - -] DHCP configuration for ports {'6881ddac-156c-4793-be9e-6e7674c0d668'} is completed#033[00m
Nov 28 05:03:43 localhost dnsmasq[314875]: read /var/lib/neutron/dhcp/eec9ef76-e9ff-47a5-b8c7-9b69e3732166/addn_hosts - 1 addresses
Nov 28 05:03:43 localhost dnsmasq-dhcp[314875]: read /var/lib/neutron/dhcp/eec9ef76-e9ff-47a5-b8c7-9b69e3732166/host
Nov 28 05:03:43 localhost dnsmasq-dhcp[314875]: read /var/lib/neutron/dhcp/eec9ef76-e9ff-47a5-b8c7-9b69e3732166/opts
Nov 28 05:03:43 localhost podman[314894]: 2025-11-28 10:03:43.380751064 +0000 UTC m=+0.062984486 container kill 62fbe091ee5e7dbf8f31f1146b57d941dc810a4092db9e381f05912d5ba30a2c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eec9ef76-e9ff-47a5-b8c7-9b69e3732166, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Nov 28 05:03:43 localhost systemd[1]: var-lib-containers-storage-overlay-6fd4128ec09b58571ce595b339c06d55fb46f8875cc1866ebdca77487d42fde8-merged.mount: Deactivated successfully.
Nov 28 05:03:43 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a4028ba968286b01611c690ab3feb5a8c69335b6561e7e724f885feb57711aa6-userdata-shm.mount: Deactivated successfully.
Nov 28 05:03:43 localhost systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully.
Nov 28 05:03:43 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:43.682 261084 INFO neutron.agent.dhcp.agent [None req-e4affca1-79b6-48a4-91a1-2507aefbbc6e - - - - - -] DHCP configuration for ports {'12365673-4af9-4ad5-b481-8ba59b4a12f4'} is completed#033[00m
Nov 28 05:03:43 localhost nova_compute[279673]: 2025-11-28 10:03:43.767 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 05:03:43 localhost nova_compute[279673]: 2025-11-28 10:03:43.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 05:03:44 localhost neutron_sriov_agent[254147]: 2025-11-28 10:03:44.193 2 INFO neutron.agent.securitygroups_rpc [None req-363c9598-0bac-406f-990f-c24334dc748e 595b5cbed3764c7a95b0ab3634e5becb 8c66e098e4fb4a349dc2bb4293454135 - - default default] Security group member updated ['f75eb612-183d-4664-84c2-1d15534e163f']#033[00m
Nov 28 05:03:44 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:44.452 261084 INFO neutron.agent.dhcp.agent [-] Synchronizing state#033[00m
Nov 28 05:03:44 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:44.671 261084 INFO neutron.agent.dhcp.agent [None req-a7e52cd8-48cc-42a2-89aa-67ae7618ec92 - - - - - -] All active networks have been fetched through RPC.#033[00m
Nov 28 05:03:44 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:44.672 261084 INFO neutron.agent.dhcp.agent [-] Starting network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 dhcp configuration#033[00m
Nov 28 05:03:44 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:44.672 261084 INFO neutron.agent.dhcp.agent [-] Finished network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 dhcp configuration#033[00m
Nov 28 05:03:44 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:44.673 261084 INFO neutron.agent.dhcp.agent [None req-a7e52cd8-48cc-42a2-89aa-67ae7618ec92 - - - - - -] Synchronizing state complete#033[00m
Nov 28 05:03:44 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:44.674 261084 INFO neutron.agent.dhcp.agent [None req-7f1d8073-db00-44f4-924a-155fc8d62a3e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Nov 28 05:03:44 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:44.674 261084 INFO neutron.agent.dhcp.agent [None req-7f1d8073-db00-44f4-924a-155fc8d62a3e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Nov 28 05:03:44 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:44.675 261084 INFO neutron.agent.dhcp.agent [None req-7f1d8073-db00-44f4-924a-155fc8d62a3e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Nov 28 05:03:44 localhost nova_compute[279673]: 2025-11-28 10:03:44.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 05:03:44 localhost nova_compute[279673]: 2025-11-28 10:03:44.791 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 05:03:44 localhost nova_compute[279673]: 2025-11-28 10:03:44.792 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 05:03:44 localhost nova_compute[279673]: 2025-11-28 10:03:44.792 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 05:03:44 localhost nova_compute[279673]: 2025-11-28 10:03:44.793 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 05:03:44 localhost nova_compute[279673]: 2025-11-28 10:03:44.794 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 05:03:44 localhost nova_compute[279673]: 2025-11-28 10:03:44.952 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:03:44 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:44.956 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 05:03:44 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:44.958 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 05:03:44 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e135 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 05:03:44 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e135 do_prune osdmap full prune enabled
Nov 28 05:03:44 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e136 e136: 6 total, 6 up, 6 in
Nov 28 05:03:44 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e136: 6 total, 6 up, 6 in
Nov 28 05:03:45 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 05:03:45 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3685964475' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 05:03:45 localhost nova_compute[279673]: 2025-11-28 10:03:45.267 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 05:03:45 localhost nova_compute[279673]: 2025-11-28 10:03:45.326 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:03:45 localhost nova_compute[279673]: 2025-11-28 10:03:45.343 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Nov 28 05:03:45 localhost nova_compute[279673]: 2025-11-28 10:03:45.344 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Nov 28 05:03:45 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:45.487 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:41Z, description=, device_id=cd81d6b9-90e3-4f09-bc0f-0098790e2353, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=12365673-4af9-4ad5-b481-8ba59b4a12f4, ip_allocation=immediate, mac_address=fa:16:3e:59:37:09, name=tempest-RoutersIpV6Test-279049389, network_id=eec9ef76-e9ff-47a5-b8c7-9b69e3732166, port_security_enabled=True, project_id=50a1392ce96c4024bcd36a3df403ca29, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['b372bb98-860c-4571-936b-bf08ecbd647d'], standard_attr_id=1406, status=DOWN, tags=[], tenant_id=50a1392ce96c4024bcd36a3df403ca29, updated_at=2025-11-28T10:03:42Z on network eec9ef76-e9ff-47a5-b8c7-9b69e3732166#033[00m
Nov 28 05:03:45 localhost nova_compute[279673]: 2025-11-28 10:03:45.596 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 05:03:45 localhost nova_compute[279673]: 2025-11-28 10:03:45.599 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11198MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 05:03:45 localhost nova_compute[279673]: 2025-11-28 10:03:45.600 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 05:03:45 localhost nova_compute[279673]: 2025-11-28 10:03:45.600 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 05:03:45 localhost nova_compute[279673]: 2025-11-28 10:03:45.674 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 28 05:03:45 localhost nova_compute[279673]: 2025-11-28 10:03:45.675 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 05:03:45 localhost nova_compute[279673]: 2025-11-28 10:03:45.675 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 05:03:45 localhost systemd[1]: tmp-crun.4K5DLx.mount: Deactivated successfully.
Nov 28 05:03:45 localhost podman[314955]: 2025-11-28 10:03:45.721190367 +0000 UTC m=+0.083923956 container kill 62fbe091ee5e7dbf8f31f1146b57d941dc810a4092db9e381f05912d5ba30a2c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eec9ef76-e9ff-47a5-b8c7-9b69e3732166, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 05:03:45 localhost dnsmasq[314875]: read /var/lib/neutron/dhcp/eec9ef76-e9ff-47a5-b8c7-9b69e3732166/addn_hosts - 1 addresses
Nov 28 05:03:45 localhost dnsmasq-dhcp[314875]: read /var/lib/neutron/dhcp/eec9ef76-e9ff-47a5-b8c7-9b69e3732166/host
Nov 28 05:03:45 localhost dnsmasq-dhcp[314875]: read /var/lib/neutron/dhcp/eec9ef76-e9ff-47a5-b8c7-9b69e3732166/opts
Nov 28 05:03:45 localhost nova_compute[279673]: 2025-11-28 10:03:45.724 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 05:03:45 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:45.834 261084 INFO neutron.agent.linux.ip_lib [None req-e89f33e6-aa1d-4760-b4e1-22b9f72304ed - - - - - -] Device tap46d5cf00-99 cannot be used as it has no MAC address#033[00m
Nov 28 05:03:45 localhost nova_compute[279673]: 2025-11-28 10:03:45.866 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:03:45 localhost kernel: device tap46d5cf00-99 entered promiscuous mode
Nov 28 05:03:45 localhost NetworkManager[5967]: [1764324225.8739] manager: (tap46d5cf00-99): new Generic device (/org/freedesktop/NetworkManager/Devices/40)
Nov 28 05:03:45 localhost ovn_controller[152322]: 2025-11-28T10:03:45Z|00226|binding|INFO|Claiming lport 46d5cf00-9955-46d8-9cab-1f8f84e925f3 for this chassis.
Nov 28 05:03:45 localhost ovn_controller[152322]: 2025-11-28T10:03:45Z|00227|binding|INFO|46d5cf00-9955-46d8-9cab-1f8f84e925f3: Claiming unknown
Nov 28 05:03:45 localhost systemd-udevd[314995]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 05:03:45 localhost ovn_controller[152322]: 2025-11-28T10:03:45Z|00228|binding|INFO|Setting lport 46d5cf00-9955-46d8-9cab-1f8f84e925f3 ovn-installed in OVS
Nov 28 05:03:45 localhost nova_compute[279673]: 2025-11-28 10:03:45.889 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:03:45 localhost ovn_controller[152322]: 2025-11-28T10:03:45Z|00229|binding|INFO|Setting lport 46d5cf00-9955-46d8-9cab-1f8f84e925f3 up in Southbound
Nov 28 05:03:45 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:45.894 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=46d5cf00-9955-46d8-9cab-1f8f84e925f3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 05:03:45 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:45.896 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 46d5cf00-9955-46d8-9cab-1f8f84e925f3 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis#033[00m
Nov 28 05:03:45 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:45.897 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 28 05:03:45 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:45.898 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[7333c351-4d81-4435-a6e1-c9b5a4a0bec3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 05:03:45 localhost journal[227875]: ethtool ioctl error on tap46d5cf00-99: No such device
Nov 28 05:03:45 localhost nova_compute[279673]: 2025-11-28 10:03:45.909 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:03:45 localhost journal[227875]: ethtool ioctl error on tap46d5cf00-99: No such device
Nov 28 05:03:45 localhost nova_compute[279673]: 2025-11-28 10:03:45.916 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:03:45 localhost journal[227875]: ethtool ioctl error on tap46d5cf00-99: No such device
Nov 28 05:03:45 localhost journal[227875]: ethtool ioctl error on tap46d5cf00-99: No such device
Nov 28 05:03:45 localhost journal[227875]: ethtool ioctl error on tap46d5cf00-99: No such device
Nov 28 05:03:45 localhost journal[227875]: ethtool ioctl error on tap46d5cf00-99: No such device
Nov 28 05:03:45 localhost journal[227875]: ethtool ioctl error on tap46d5cf00-99: No such device
Nov 28 05:03:45 localhost journal[227875]: ethtool ioctl error on tap46d5cf00-99: No such device
Nov 28 05:03:45 localhost nova_compute[279673]: 2025-11-28 10:03:45.973 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:03:46 localhost nova_compute[279673]: 2025-11-28 10:03:46.017 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:03:46 localhost nova_compute[279673]: 2025-11-28 10:03:46.245 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.522s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 05:03:46 localhost nova_compute[279673]: 2025-11-28 10:03:46.254 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 05:03:46 localhost nova_compute[279673]: 2025-11-28 10:03:46.334 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 05:03:46 localhost nova_compute[279673]: 2025-11-28 10:03:46.338 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 05:03:46 localhost nova_compute[279673]: 2025-11-28 10:03:46.338 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.738s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 05:03:46 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:46.434 261084 INFO neutron.agent.dhcp.agent [None req-27c1ffa9-10aa-4d14-9216-c2e9382ec5f8 - - - - - -] DHCP configuration for ports {'12365673-4af9-4ad5-b481-8ba59b4a12f4'} is completed#033[00m
Nov 28 05:03:46 localhost podman[315080]:
Nov 28 05:03:46 localhost podman[315080]: 2025-11-28 10:03:46.990210089 +0000 UTC m=+0.103462676 container create 329c0dce623a80c20116ddbfa4d4e42ef3e52c0d343ad9bbf3ec6f34f057f3b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Nov 28 05:03:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 05:03:47 localhost systemd[1]: Started libpod-conmon-329c0dce623a80c20116ddbfa4d4e42ef3e52c0d343ad9bbf3ec6f34f057f3b4.scope.
Nov 28 05:03:47 localhost podman[315080]: 2025-11-28 10:03:46.944654584 +0000 UTC m=+0.057907231 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 05:03:47 localhost systemd[1]: tmp-crun.McNrgT.mount: Deactivated successfully.
Nov 28 05:03:47 localhost ovn_controller[152322]: 2025-11-28T10:03:47Z|00230|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 05:03:47 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e136 do_prune osdmap full prune enabled
Nov 28 05:03:47 localhost systemd[1]: Started libcrun container.
Nov 28 05:03:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/578f4be3cc19b21fef1ae33791a1b3511e0f8f189090f691ad48efc34f6226aa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 05:03:47 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e137 e137: 6 total, 6 up, 6 in
Nov 28 05:03:47 localhost podman[315080]: 2025-11-28 10:03:47.089878485 +0000 UTC m=+0.203131072 container init 329c0dce623a80c20116ddbfa4d4e42ef3e52c0d343ad9bbf3ec6f34f057f3b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Nov 28 05:03:47 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e137: 6 total, 6 up, 6 in
Nov 28 05:03:47 localhost podman[315080]: 2025-11-28 10:03:47.100824328 +0000 UTC m=+0.214076905 container start 329c0dce623a80c20116ddbfa4d4e42ef3e52c0d343ad9bbf3ec6f34f057f3b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 05:03:47 localhost nova_compute[279673]: 2025-11-28 10:03:47.101 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:03:47 localhost dnsmasq[315108]: started, version 2.85 cachesize 150
Nov 28 05:03:47 localhost dnsmasq[315108]: DNS service limited to local subnets
Nov 28 05:03:47 localhost dnsmasq[315108]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 05:03:47 localhost dnsmasq[315108]: warning: no upstream servers configured
Nov 28 05:03:47 localhost dnsmasq-dhcp[315108]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 05:03:47 localhost dnsmasq[315108]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 05:03:47 localhost dnsmasq-dhcp[315108]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 05:03:47 localhost dnsmasq-dhcp[315108]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 05:03:47 localhost podman[315095]: 2025-11-28 10:03:47.185855035 +0000 UTC m=+0.151199223 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products.
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, architecture=x86_64, config_id=edpm) Nov 28 05:03:47 localhost podman[315095]: 2025-11-28 10:03:47.198830226 +0000 UTC m=+0.164174414 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, 
summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, version=9.6, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, config_id=edpm, release=1755695350, vendor=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Nov 28 05:03:47 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. Nov 28 05:03:47 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:47.361 261084 INFO neutron.agent.dhcp.agent [None req-35bfdd1a-a27b-4b8e-9b52-2da75bab8f1c - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '46d5cf00-9955-46d8-9cab-1f8f84e925f3'} is completed#033[00m Nov 28 05:03:47 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:47.546 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:46Z, description=, device_id=fd23721d-5971-4a55-93b6-1f314d7bfba5, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=9814bbf1-0df0-4d12-a042-026ac18c60f8, ip_allocation=immediate, mac_address=fa:16:3e:2c:af:f0, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, 
provider:segmentation_id=62651, qos_policy_id=None, revision_number=18, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['78639234-2b5b-4fdc-97e8-c3d3fd0c61b3'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:44Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=False, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1420, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:46Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1#033[00m Nov 28 05:03:47 localhost dnsmasq[315108]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 1 addresses Nov 28 05:03:47 localhost dnsmasq-dhcp[315108]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:03:47 localhost podman[315138]: 2025-11-28 10:03:47.762010403 +0000 UTC m=+0.063358766 container kill 329c0dce623a80c20116ddbfa4d4e42ef3e52c0d343ad9bbf3ec6f34f057f3b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 28 05:03:47 localhost dnsmasq-dhcp[315108]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:03:48 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:48.038 261084 INFO neutron.agent.dhcp.agent [None req-6c58773d-3169-48ed-9bea-3b7ee28846fe - - - - - -] DHCP configuration for ports {'9814bbf1-0df0-4d12-a042-026ac18c60f8'} is 
completed#033[00m Nov 28 05:03:48 localhost neutron_sriov_agent[254147]: 2025-11-28 10:03:48.074 2 INFO neutron.agent.securitygroups_rpc [None req-d103e3e5-6a2c-4d52-97a2-9ed0e9f72fa6 595b5cbed3764c7a95b0ab3634e5becb 8c66e098e4fb4a349dc2bb4293454135 - - default default] Security group member updated ['f75eb612-183d-4664-84c2-1d15534e163f']#033[00m Nov 28 05:03:48 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e137 do_prune osdmap full prune enabled Nov 28 05:03:48 localhost openstack_network_exporter[240658]: ERROR 10:03:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:03:48 localhost openstack_network_exporter[240658]: ERROR 10:03:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 05:03:48 localhost openstack_network_exporter[240658]: ERROR 10:03:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:03:48 localhost openstack_network_exporter[240658]: ERROR 10:03:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 05:03:48 localhost openstack_network_exporter[240658]: Nov 28 05:03:48 localhost openstack_network_exporter[240658]: ERROR 10:03:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 05:03:48 localhost openstack_network_exporter[240658]: Nov 28 05:03:48 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e138 e138: 6 total, 6 up, 6 in Nov 28 05:03:48 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e138: 6 total, 6 up, 6 in Nov 28 05:03:48 localhost neutron_sriov_agent[254147]: 2025-11-28 10:03:48.287 2 INFO neutron.agent.securitygroups_rpc [None req-8df05b65-915d-4be4-a7dc-9f54beb052e9 b286c38dfd0e4889806c62c7b4b9ee98 50a1392ce96c4024bcd36a3df403ca29 - - default default] Security group member updated ['b372bb98-860c-4571-936b-bf08ecbd647d']#033[00m Nov 28 05:03:48 
localhost nova_compute[279673]: 2025-11-28 10:03:48.341 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:03:48 localhost nova_compute[279673]: 2025-11-28 10:03:48.342 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 28 05:03:48 localhost nova_compute[279673]: 2025-11-28 10:03:48.343 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 28 05:03:48 localhost nova_compute[279673]: 2025-11-28 10:03:48.478 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 05:03:48 localhost nova_compute[279673]: 2025-11-28 10:03:48.479 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 05:03:48 localhost nova_compute[279673]: 2025-11-28 10:03:48.480 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 28 05:03:48 localhost nova_compute[279673]: 2025-11-28 10:03:48.480 279685 DEBUG 
nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 05:03:48 localhost podman[315177]: 2025-11-28 10:03:48.525780329 +0000 UTC m=+0.065453246 container kill 62fbe091ee5e7dbf8f31f1146b57d941dc810a4092db9e381f05912d5ba30a2c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eec9ef76-e9ff-47a5-b8c7-9b69e3732166, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:03:48 localhost dnsmasq[314875]: read /var/lib/neutron/dhcp/eec9ef76-e9ff-47a5-b8c7-9b69e3732166/addn_hosts - 0 addresses Nov 28 05:03:48 localhost dnsmasq-dhcp[314875]: read /var/lib/neutron/dhcp/eec9ef76-e9ff-47a5-b8c7-9b69e3732166/host Nov 28 05:03:48 localhost dnsmasq-dhcp[314875]: read /var/lib/neutron/dhcp/eec9ef76-e9ff-47a5-b8c7-9b69e3732166/opts Nov 28 05:03:48 localhost ovn_controller[152322]: 2025-11-28T10:03:48Z|00231|binding|INFO|Releasing lport 0f64cccb-f58c-4c85-8d1d-c5dbcdc8de50 from this chassis (sb_readonly=0) Nov 28 05:03:48 localhost kernel: device tap0f64cccb-f5 left promiscuous mode Nov 28 05:03:48 localhost nova_compute[279673]: 2025-11-28 10:03:48.723 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:48 localhost ovn_controller[152322]: 2025-11-28T10:03:48Z|00232|binding|INFO|Setting lport 0f64cccb-f58c-4c85-8d1d-c5dbcdc8de50 down in Southbound Nov 28 05:03:48 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:48.736 158130 DEBUG 
ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-eec9ef76-e9ff-47a5-b8c7-9b69e3732166', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eec9ef76-e9ff-47a5-b8c7-9b69e3732166', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '50a1392ce96c4024bcd36a3df403ca29', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=76364ea1-fb09-4d6a-aaea-0dba21691fb4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=0f64cccb-f58c-4c85-8d1d-c5dbcdc8de50) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:03:48 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:48.738 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 0f64cccb-f58c-4c85-8d1d-c5dbcdc8de50 in datapath eec9ef76-e9ff-47a5-b8c7-9b69e3732166 unbound from our chassis#033[00m Nov 28 05:03:48 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:48.739 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network eec9ef76-e9ff-47a5-b8c7-9b69e3732166 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:03:48 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:48.740 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[7db80c86-35dc-4700-bbf4-8e5142bc1ccb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:03:48 localhost nova_compute[279673]: 2025-11-28 10:03:48.752 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:49 localhost nova_compute[279673]: 2025-11-28 10:03:49.713 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 05:03:49 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:49.726 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:46Z, description=, device_id=fd23721d-5971-4a55-93b6-1f314d7bfba5, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=9814bbf1-0df0-4d12-a042-026ac18c60f8, ip_allocation=immediate, mac_address=fa:16:3e:2c:af:f0, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=18, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['78639234-2b5b-4fdc-97e8-c3d3fd0c61b3'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:44Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=False, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1420, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:46Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1#033[00m Nov 28 05:03:49 localhost nova_compute[279673]: 2025-11-28 10:03:49.737 279685 DEBUG oslo_concurrency.lockutils [None 
req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 05:03:49 localhost nova_compute[279673]: 2025-11-28 10:03:49.738 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 28 05:03:49 localhost dnsmasq[315108]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 1 addresses Nov 28 05:03:49 localhost dnsmasq-dhcp[315108]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:03:49 localhost podman[315218]: 2025-11-28 10:03:49.936257255 +0000 UTC m=+0.071335865 container kill 329c0dce623a80c20116ddbfa4d4e42ef3e52c0d343ad9bbf3ec6f34f057f3b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:03:49 localhost dnsmasq-dhcp[315108]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:03:49 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:03:49 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e138 do_prune osdmap full prune enabled Nov 28 05:03:49 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e139 e139: 6 total, 6 up, 6 in Nov 28 
05:03:49 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e139: 6 total, 6 up, 6 in Nov 28 05:03:50 localhost nova_compute[279673]: 2025-11-28 10:03:50.012 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:50 localhost nova_compute[279673]: 2025-11-28 10:03:50.365 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:50 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:50.394 261084 INFO neutron.agent.dhcp.agent [None req-a6054db3-96e0-4b25-b3de-2530fc3c0543 - - - - - -] DHCP configuration for ports {'9814bbf1-0df0-4d12-a042-026ac18c60f8'} is completed#033[00m Nov 28 05:03:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:50.842 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:03:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:50.843 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:03:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:50.843 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:03:51 localhost systemd[1]: tmp-crun.TZhsTq.mount: Deactivated successfully. 
Nov 28 05:03:51 localhost dnsmasq[314875]: exiting on receipt of SIGTERM Nov 28 05:03:51 localhost podman[315255]: 2025-11-28 10:03:51.01865139 +0000 UTC m=+0.055998606 container kill 62fbe091ee5e7dbf8f31f1146b57d941dc810a4092db9e381f05912d5ba30a2c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eec9ef76-e9ff-47a5-b8c7-9b69e3732166, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 28 05:03:51 localhost systemd[1]: libpod-62fbe091ee5e7dbf8f31f1146b57d941dc810a4092db9e381f05912d5ba30a2c.scope: Deactivated successfully. Nov 28 05:03:51 localhost podman[315269]: 2025-11-28 10:03:51.104113279 +0000 UTC m=+0.067982300 container died 62fbe091ee5e7dbf8f31f1146b57d941dc810a4092db9e381f05912d5ba30a2c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eec9ef76-e9ff-47a5-b8c7-9b69e3732166, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 28 05:03:51 localhost podman[315269]: 2025-11-28 10:03:51.138331199 +0000 UTC m=+0.102200170 container cleanup 62fbe091ee5e7dbf8f31f1146b57d941dc810a4092db9e381f05912d5ba30a2c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eec9ef76-e9ff-47a5-b8c7-9b69e3732166, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 28 05:03:51 localhost systemd[1]: libpod-conmon-62fbe091ee5e7dbf8f31f1146b57d941dc810a4092db9e381f05912d5ba30a2c.scope: Deactivated successfully. Nov 28 05:03:51 localhost podman[315271]: 2025-11-28 10:03:51.183431152 +0000 UTC m=+0.138113199 container remove 62fbe091ee5e7dbf8f31f1146b57d941dc810a4092db9e381f05912d5ba30a2c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eec9ef76-e9ff-47a5-b8c7-9b69e3732166, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Nov 28 05:03:51 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:51.513 261084 INFO neutron.agent.dhcp.agent [None req-7ca3987f-3444-4869-9417-f626955b6198 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:03:51 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:51.514 261084 INFO neutron.agent.dhcp.agent [None req-7ca3987f-3444-4869-9417-f626955b6198 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:03:51 localhost dnsmasq[315108]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:03:51 localhost dnsmasq-dhcp[315108]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:03:51 localhost dnsmasq-dhcp[315108]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:03:51 localhost podman[315316]: 2025-11-28 10:03:51.560104344 +0000 UTC m=+0.074232867 container kill 
329c0dce623a80c20116ddbfa4d4e42ef3e52c0d343ad9bbf3ec6f34f057f3b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125) Nov 28 05:03:51 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:51.599 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:03:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 05:03:51 localhost ovn_controller[152322]: 2025-11-28T10:03:51Z|00233|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:03:51 localhost podman[315338]: 2025-11-28 10:03:51.869549412 +0000 UTC m=+0.091994268 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 05:03:51 localhost podman[315338]: 2025-11-28 10:03:51.883229134 +0000 UTC m=+0.105674040 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 28 05:03:51 localhost nova_compute[279673]: 2025-11-28 10:03:51.902 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:51 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 05:03:52 localhost systemd[1]: var-lib-containers-storage-overlay-721a99abb32a0f90fdd1a1d8fc012d3e9fa0457f65acb87fb03055f5486bc057-merged.mount: Deactivated successfully. Nov 28 05:03:52 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-62fbe091ee5e7dbf8f31f1146b57d941dc810a4092db9e381f05912d5ba30a2c-userdata-shm.mount: Deactivated successfully. 
Nov 28 05:03:52 localhost systemd[1]: run-netns-qdhcp\x2deec9ef76\x2de9ff\x2d47a5\x2db8c7\x2d9b69e3732166.mount: Deactivated successfully. Nov 28 05:03:52 localhost nova_compute[279673]: 2025-11-28 10:03:52.164 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:03:52 localhost dnsmasq[315108]: exiting on receipt of SIGTERM Nov 28 05:03:52 localhost podman[315378]: 2025-11-28 10:03:52.639288188 +0000 UTC m=+0.062230795 container kill 329c0dce623a80c20116ddbfa4d4e42ef3e52c0d343ad9bbf3ec6f34f057f3b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2) Nov 28 05:03:52 localhost systemd[1]: libpod-329c0dce623a80c20116ddbfa4d4e42ef3e52c0d343ad9bbf3ec6f34f057f3b4.scope: Deactivated successfully. 
Nov 28 05:03:52 localhost podman[315390]: 2025-11-28 10:03:52.712566268 +0000 UTC m=+0.057405537 container died 329c0dce623a80c20116ddbfa4d4e42ef3e52c0d343ad9bbf3ec6f34f057f3b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:03:52 localhost systemd[1]: tmp-crun.m1U55p.mount: Deactivated successfully. Nov 28 05:03:52 localhost podman[315390]: 2025-11-28 10:03:52.753365276 +0000 UTC m=+0.098204485 container cleanup 329c0dce623a80c20116ddbfa4d4e42ef3e52c0d343ad9bbf3ec6f34f057f3b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:03:52 localhost systemd[1]: libpod-conmon-329c0dce623a80c20116ddbfa4d4e42ef3e52c0d343ad9bbf3ec6f34f057f3b4.scope: Deactivated successfully. 
Nov 28 05:03:52 localhost podman[315392]: 2025-11-28 10:03:52.800842517 +0000 UTC m=+0.134623139 container remove 329c0dce623a80c20116ddbfa4d4e42ef3e52c0d343ad9bbf3ec6f34f057f3b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Nov 28 05:03:52 localhost ovn_controller[152322]: 2025-11-28T10:03:52Z|00234|binding|INFO|Releasing lport 46d5cf00-9955-46d8-9cab-1f8f84e925f3 from this chassis (sb_readonly=0) Nov 28 05:03:52 localhost nova_compute[279673]: 2025-11-28 10:03:52.847 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:52 localhost ovn_controller[152322]: 2025-11-28T10:03:52Z|00235|binding|INFO|Setting lport 46d5cf00-9955-46d8-9cab-1f8f84e925f3 down in Southbound Nov 28 05:03:52 localhost kernel: device tap46d5cf00-99 left promiscuous mode Nov 28 05:03:52 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:52.861 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 
'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=46d5cf00-9955-46d8-9cab-1f8f84e925f3) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:03:52 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:52.863 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 46d5cf00-9955-46d8-9cab-1f8f84e925f3 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis#033[00m Nov 28 05:03:52 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:52.865 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:03:52 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:52.866 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[c5f5e74a-121c-45a8-abee-e3e6a4648b61]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:03:52 localhost nova_compute[279673]: 2025-11-28 10:03:52.874 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:52 localhost nova_compute[279673]: 2025-11-28 10:03:52.875 279685 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:53 localhost systemd[1]: var-lib-containers-storage-overlay-578f4be3cc19b21fef1ae33791a1b3511e0f8f189090f691ad48efc34f6226aa-merged.mount: Deactivated successfully. Nov 28 05:03:53 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-329c0dce623a80c20116ddbfa4d4e42ef3e52c0d343ad9bbf3ec6f34f057f3b4-userdata-shm.mount: Deactivated successfully. Nov 28 05:03:53 localhost systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully. Nov 28 05:03:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 05:03:54 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:54.349 261084 INFO neutron.agent.linux.ip_lib [None req-96df33e2-8cea-42a0-ae91-fe4e1878310c - - - - - -] Device tapd896fe01-47 cannot be used as it has no MAC address#033[00m Nov 28 05:03:54 localhost podman[315422]: 2025-11-28 10:03:54.37422586 +0000 UTC m=+0.130654705 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent) Nov 28 05:03:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. 
Nov 28 05:03:54 localhost nova_compute[279673]: 2025-11-28 10:03:54.384 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:54 localhost podman[315422]: 2025-11-28 10:03:54.390507157 +0000 UTC m=+0.146936052 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, 
container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:03:54 localhost kernel: device tapd896fe01-47 entered promiscuous mode Nov 28 05:03:54 localhost ovn_controller[152322]: 2025-11-28T10:03:54Z|00236|binding|INFO|Claiming lport d896fe01-471a-407a-a0b3-6d40883262a8 for this chassis. Nov 28 05:03:54 localhost ovn_controller[152322]: 2025-11-28T10:03:54Z|00237|binding|INFO|d896fe01-471a-407a-a0b3-6d40883262a8: Claiming unknown Nov 28 05:03:54 localhost nova_compute[279673]: 2025-11-28 10:03:54.395 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:54 localhost NetworkManager[5967]: [1764324234.3971] manager: (tapd896fe01-47): new Generic device (/org/freedesktop/NetworkManager/Devices/41) Nov 28 05:03:54 localhost systemd-udevd[315454]: Network interface NamePolicy= disabled on kernel command line. Nov 28 05:03:54 localhost ovn_controller[152322]: 2025-11-28T10:03:54Z|00238|binding|INFO|Setting lport d896fe01-471a-407a-a0b3-6d40883262a8 ovn-installed in OVS Nov 28 05:03:54 localhost nova_compute[279673]: 2025-11-28 10:03:54.407 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:54 localhost ovn_controller[152322]: 2025-11-28T10:03:54Z|00239|binding|INFO|Setting lport d896fe01-471a-407a-a0b3-6d40883262a8 up in Southbound Nov 28 05:03:54 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:54.409 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], 
requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d896fe01-471a-407a-a0b3-6d40883262a8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:03:54 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:54.411 158130 INFO neutron.agent.ovn.metadata.agent [-] Port d896fe01-471a-407a-a0b3-6d40883262a8 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis#033[00m Nov 28 05:03:54 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:54.413 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:03:54 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:54.414 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[334140aa-b9ba-46e6-986d-861e1ad2cd13]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:03:54 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: 
Deactivated successfully. Nov 28 05:03:54 localhost nova_compute[279673]: 2025-11-28 10:03:54.426 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:54 localhost nova_compute[279673]: 2025-11-28 10:03:54.474 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:54 localhost nova_compute[279673]: 2025-11-28 10:03:54.505 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:54 localhost podman[315447]: 2025-11-28 10:03:54.507628132 +0000 UTC m=+0.110366453 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 28 05:03:54 localhost podman[315447]: 2025-11-28 10:03:54.552387865 +0000 UTC m=+0.155126226 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Nov 28 05:03:54 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 05:03:54 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:54.960 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:03:54 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:03:54 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e139 do_prune osdmap full prune enabled Nov 28 05:03:54 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e140 e140: 6 total, 6 up, 6 in Nov 28 05:03:54 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e140: 6 total, 6 up, 6 in Nov 28 05:03:55 localhost nova_compute[279673]: 2025-11-28 10:03:55.391 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:55 localhost podman[315526]: Nov 28 05:03:55 localhost podman[315526]: 2025-11-28 10:03:55.476860105 +0000 UTC m=+0.116885270 container create d9ddaa0a45813927e74b018780caae1ea49f79d4cd65767e110fa6166261c423 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125) Nov 28 05:03:55 localhost systemd[1]: Started 
libpod-conmon-d9ddaa0a45813927e74b018780caae1ea49f79d4cd65767e110fa6166261c423.scope. Nov 28 05:03:55 localhost podman[315526]: 2025-11-28 10:03:55.429115267 +0000 UTC m=+0.069140482 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:03:55 localhost systemd[1]: Started libcrun container. Nov 28 05:03:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c9bf28eca558c2a4e7a8c7f2646cc9ee0c26530bb499ee82f943e5f60e7841a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:03:55 localhost podman[315526]: 2025-11-28 10:03:55.571056604 +0000 UTC m=+0.211081769 container init d9ddaa0a45813927e74b018780caae1ea49f79d4cd65767e110fa6166261c423 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 28 05:03:55 localhost podman[315526]: 2025-11-28 10:03:55.584921961 +0000 UTC m=+0.224947126 container start d9ddaa0a45813927e74b018780caae1ea49f79d4cd65767e110fa6166261c423 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 28 05:03:55 localhost dnsmasq[315544]: started, version 2.85 cachesize 150 Nov 28 05:03:55 
localhost dnsmasq[315544]: DNS service limited to local subnets Nov 28 05:03:55 localhost dnsmasq[315544]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:03:55 localhost dnsmasq[315544]: warning: no upstream servers configured Nov 28 05:03:55 localhost dnsmasq[315544]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:03:55 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:55.646 261084 INFO neutron.agent.dhcp.agent [None req-96df33e2-8cea-42a0-ae91-fe4e1878310c - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:53Z, description=, device_id=83d505e9-2952-4cc1-b961-ffbd511bb38a, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=40f1021c-6ae3-41b5-b7d7-1bd4943993da, ip_allocation=immediate, mac_address=fa:16:3e:0a:a8:0e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=20, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['7a3f4fc8-2554-42b9-b3cf-c49e67a0bb09'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:52Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=False, 
project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1461, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:54Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1#033[00m Nov 28 05:03:55 localhost dnsmasq[315544]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 1 addresses Nov 28 05:03:55 localhost podman[315562]: 2025-11-28 10:03:55.842233174 +0000 UTC m=+0.055630305 container kill d9ddaa0a45813927e74b018780caae1ea49f79d4cd65767e110fa6166261c423 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 28 05:03:55 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:55.932 261084 INFO neutron.agent.dhcp.agent [None req-8f336738-3886-4916-8fc2-4fc9156113b8 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f'} is completed#033[00m Nov 28 05:03:56 localhost ovn_controller[152322]: 2025-11-28T10:03:56Z|00240|binding|INFO|Releasing lport d896fe01-471a-407a-a0b3-6d40883262a8 from this chassis (sb_readonly=0) Nov 28 05:03:56 localhost nova_compute[279673]: 2025-11-28 10:03:56.038 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:56 localhost ovn_controller[152322]: 2025-11-28T10:03:56Z|00241|binding|INFO|Setting lport d896fe01-471a-407a-a0b3-6d40883262a8 down in Southbound Nov 28 05:03:56 localhost kernel: device tapd896fe01-47 left 
promiscuous mode Nov 28 05:03:56 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:56.052 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d896fe01-471a-407a-a0b3-6d40883262a8) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:03:56 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:56.055 158130 INFO neutron.agent.ovn.metadata.agent [-] Port d896fe01-471a-407a-a0b3-6d40883262a8 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis#033[00m Nov 28 05:03:56 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:56.058 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses 
configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:03:56 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:56.059 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[adba56c0-8a10-470d-b7c2-8ce4f68718a6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:03:56 localhost nova_compute[279673]: 2025-11-28 10:03:56.061 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.119 261084 INFO neutron.agent.dhcp.agent [None req-10ec3fab-cc80-4fc1-85ca-fff69e63bb87 - - - - - -] DHCP configuration for ports {'40f1021c-6ae3-41b5-b7d7-1bd4943993da'} is completed#033[00m Nov 28 05:03:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. 
Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.554 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:03:53Z, description=, device_id=83d505e9-2952-4cc1-b961-ffbd511bb38a, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=40f1021c-6ae3-41b5-b7d7-1bd4943993da, ip_allocation=immediate, mac_address=fa:16:3e:0a:a8:0e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=20, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['7a3f4fc8-2554-42b9-b3cf-c49e67a0bb09'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:52Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=False, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1461, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:03:54Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1#033[00m Nov 28 05:03:56 localhost systemd[1]: tmp-crun.j64yLf.mount: Deactivated successfully. 
Nov 28 05:03:56 localhost podman[315584]: 2025-11-28 10:03:56.612908468 +0000 UTC m=+0.095721404 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125) Nov 28 05:03:56 localhost podman[315584]: 2025-11-28 10:03:56.625677014 +0000 UTC m=+0.108489950 container exec_died 
1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible) Nov 28 05:03:56 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 05:03:56 localhost dnsmasq[315544]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 1 addresses Nov 28 05:03:56 localhost podman[315619]: 2025-11-28 10:03:56.776489905 +0000 UTC m=+0.073407534 container kill d9ddaa0a45813927e74b018780caae1ea49f79d4cd65767e110fa6166261c423 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent [-] Unable to reload_allocations dhcp for 8642adde-54ae-4fc2-b997-bf1962c6c7f1.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapd896fe01-47 not found in namespace qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1. 
Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR 
neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw) Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Nov 28 05:03:56 
localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent return fut.result() Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent return self.__get_result() Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent raise self._exception Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__ Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs, Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 
ERROR neutron.agent.dhcp.agent raise exc_type(*result[2]) Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapd896fe01-47 not found in namespace qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1. Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.801 261084 ERROR neutron.agent.dhcp.agent #033[00m Nov 28 05:03:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:56.867 261084 INFO neutron.agent.dhcp.agent [None req-6a56c746-7130-4efb-8c36-7af29a638d45 - - - - - -] DHCP configuration for ports {'40f1021c-6ae3-41b5-b7d7-1bd4943993da'} is completed#033[00m Nov 28 05:03:57 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e140 do_prune osdmap full prune enabled Nov 28 05:03:57 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e141 e141: 6 total, 6 up, 6 in Nov 28 05:03:57 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e141: 6 total, 6 up, 6 in Nov 28 05:03:57 localhost nova_compute[279673]: 2025-11-28 10:03:57.081 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:57 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:57.204 261084 INFO neutron.agent.linux.ip_lib [None req-3f3aeafc-19b3-4e93-b7b2-cb95dd628857 - - - - - -] Device tapbe14f038-71 cannot be used as it has no MAC address#033[00m Nov 28 05:03:57 localhost nova_compute[279673]: 2025-11-28 10:03:57.234 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:57 localhost kernel: device tapbe14f038-71 entered promiscuous mode Nov 28 05:03:57 localhost NetworkManager[5967]: [1764324237.2440] manager: (tapbe14f038-71): new Generic device (/org/freedesktop/NetworkManager/Devices/42) Nov 28 05:03:57 localhost 
ovn_controller[152322]: 2025-11-28T10:03:57Z|00242|binding|INFO|Claiming lport be14f038-71a5-4e0d-89f8-2103c3afd8ad for this chassis. Nov 28 05:03:57 localhost ovn_controller[152322]: 2025-11-28T10:03:57Z|00243|binding|INFO|be14f038-71a5-4e0d-89f8-2103c3afd8ad: Claiming unknown Nov 28 05:03:57 localhost nova_compute[279673]: 2025-11-28 10:03:57.245 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:57 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:57.256 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-a8ba88fa-1415-4e6a-82ae-4f41cdd912f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a8ba88fa-1415-4e6a-82ae-4f41cdd912f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b11f04c88dd4db3aa7f405d125f76a4', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=785a40f8-f8b6-4ed9-ab6b-593396055237, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=be14f038-71a5-4e0d-89f8-2103c3afd8ad) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:03:57 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:57.258 
158130 INFO neutron.agent.ovn.metadata.agent [-] Port be14f038-71a5-4e0d-89f8-2103c3afd8ad in datapath a8ba88fa-1415-4e6a-82ae-4f41cdd912f1 bound to our chassis#033[00m Nov 28 05:03:57 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:57.260 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network a8ba88fa-1415-4e6a-82ae-4f41cdd912f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:03:57 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:57.261 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[0f69eb90-0c21-4644-9686-d9cff1cafb8e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:03:57 localhost ovn_controller[152322]: 2025-11-28T10:03:57Z|00244|binding|INFO|Setting lport be14f038-71a5-4e0d-89f8-2103c3afd8ad ovn-installed in OVS Nov 28 05:03:57 localhost ovn_controller[152322]: 2025-11-28T10:03:57Z|00245|binding|INFO|Setting lport be14f038-71a5-4e0d-89f8-2103c3afd8ad up in Southbound Nov 28 05:03:57 localhost nova_compute[279673]: 2025-11-28 10:03:57.282 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:57 localhost nova_compute[279673]: 2025-11-28 10:03:57.336 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:57 localhost nova_compute[279673]: 2025-11-28 10:03:57.376 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:57 localhost systemd[1]: tmp-crun.6snNtl.mount: Deactivated successfully. 
Nov 28 05:03:58 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e141 do_prune osdmap full prune enabled Nov 28 05:03:58 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e142 e142: 6 total, 6 up, 6 in Nov 28 05:03:58 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e142: 6 total, 6 up, 6 in Nov 28 05:03:58 localhost podman[315697]: Nov 28 05:03:58 localhost podman[315697]: 2025-11-28 10:03:58.359861545 +0000 UTC m=+0.101003165 container create 3b72d8de6aa8c764c17fd7a82828490581c31b022f355ca771ab13d3fa6d8e0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a8ba88fa-1415-4e6a-82ae-4f41cdd912f1, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3) Nov 28 05:03:58 localhost systemd[1]: Started libpod-conmon-3b72d8de6aa8c764c17fd7a82828490581c31b022f355ca771ab13d3fa6d8e0b.scope. Nov 28 05:03:58 localhost podman[315697]: 2025-11-28 10:03:58.314052022 +0000 UTC m=+0.055193672 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:03:58 localhost systemd[1]: Started libcrun container. 
Nov 28 05:03:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3b7a3bb605cc17c39a0a6c043199db98db1eab94e3efb6892ac928f5feb4c4b2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:03:58 localhost podman[315697]: 2025-11-28 10:03:58.431316412 +0000 UTC m=+0.172458032 container init 3b72d8de6aa8c764c17fd7a82828490581c31b022f355ca771ab13d3fa6d8e0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a8ba88fa-1415-4e6a-82ae-4f41cdd912f1, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:03:58 localhost podman[315697]: 2025-11-28 10:03:58.440474344 +0000 UTC m=+0.181615964 container start 3b72d8de6aa8c764c17fd7a82828490581c31b022f355ca771ab13d3fa6d8e0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a8ba88fa-1415-4e6a-82ae-4f41cdd912f1, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Nov 28 05:03:58 localhost dnsmasq[315715]: started, version 2.85 cachesize 150 Nov 28 05:03:58 localhost dnsmasq[315715]: DNS service limited to local subnets Nov 28 05:03:58 localhost dnsmasq[315715]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:03:58 localhost dnsmasq[315715]: warning: no upstream servers 
configured Nov 28 05:03:58 localhost dnsmasq-dhcp[315715]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 28 05:03:58 localhost dnsmasq[315715]: read /var/lib/neutron/dhcp/a8ba88fa-1415-4e6a-82ae-4f41cdd912f1/addn_hosts - 0 addresses Nov 28 05:03:58 localhost dnsmasq-dhcp[315715]: read /var/lib/neutron/dhcp/a8ba88fa-1415-4e6a-82ae-4f41cdd912f1/host Nov 28 05:03:58 localhost dnsmasq-dhcp[315715]: read /var/lib/neutron/dhcp/a8ba88fa-1415-4e6a-82ae-4f41cdd912f1/opts Nov 28 05:03:58 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:58.518 261084 INFO neutron.agent.dhcp.agent [None req-a7e52cd8-48cc-42a2-89aa-67ae7618ec92 - - - - - -] Synchronizing state#033[00m Nov 28 05:03:58 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:58.629 261084 INFO neutron.agent.dhcp.agent [None req-958a9933-f9d9-4c72-9e8d-08b93ca9792c - - - - - -] DHCP configuration for ports {'912a3c72-960a-49a4-a4ad-da9ef63c5cb1'} is completed#033[00m Nov 28 05:03:58 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:58.697 261084 INFO neutron.agent.dhcp.agent [None req-3a4c8826-6eeb-4084-b6e5-81cd4ed1d4d8 - - - - - -] All active networks have been fetched through RPC.#033[00m Nov 28 05:03:58 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:58.698 261084 INFO neutron.agent.dhcp.agent [-] Starting network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 dhcp configuration#033[00m Nov 28 05:03:58 localhost dnsmasq[315544]: exiting on receipt of SIGTERM Nov 28 05:03:58 localhost podman[315733]: 2025-11-28 10:03:58.894749081 +0000 UTC m=+0.064619193 container kill d9ddaa0a45813927e74b018780caae1ea49f79d4cd65767e110fa6166261c423 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true) Nov 28 05:03:58 localhost systemd[1]: libpod-d9ddaa0a45813927e74b018780caae1ea49f79d4cd65767e110fa6166261c423.scope: Deactivated successfully. Nov 28 05:03:58 localhost podman[315753]: 2025-11-28 10:03:58.980172069 +0000 UTC m=+0.055444620 container died d9ddaa0a45813927e74b018780caae1ea49f79d4cd65767e110fa6166261c423 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Nov 28 05:03:59 localhost podman[315753]: 2025-11-28 10:03:59.03500142 +0000 UTC m=+0.110273931 container remove d9ddaa0a45813927e74b018780caae1ea49f79d4cd65767e110fa6166261c423 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 28 05:03:59 localhost systemd[1]: libpod-conmon-d9ddaa0a45813927e74b018780caae1ea49f79d4cd65767e110fa6166261c423.scope: Deactivated successfully. 
Nov 28 05:03:59 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:03:59.084 261084 INFO neutron.agent.linux.ip_lib [-] Device tapd896fe01-47 cannot be used as it has no MAC address#033[00m Nov 28 05:03:59 localhost nova_compute[279673]: 2025-11-28 10:03:59.113 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:59 localhost kernel: device tapd896fe01-47 entered promiscuous mode Nov 28 05:03:59 localhost NetworkManager[5967]: [1764324239.1223] manager: (tapd896fe01-47): new Generic device (/org/freedesktop/NetworkManager/Devices/43) Nov 28 05:03:59 localhost ovn_controller[152322]: 2025-11-28T10:03:59Z|00246|binding|INFO|Claiming lport d896fe01-471a-407a-a0b3-6d40883262a8 for this chassis. Nov 28 05:03:59 localhost ovn_controller[152322]: 2025-11-28T10:03:59Z|00247|binding|INFO|d896fe01-471a-407a-a0b3-6d40883262a8: Claiming unknown Nov 28 05:03:59 localhost nova_compute[279673]: 2025-11-28 10:03:59.124 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:59 localhost ovn_controller[152322]: 2025-11-28T10:03:59Z|00248|binding|INFO|Setting lport d896fe01-471a-407a-a0b3-6d40883262a8 ovn-installed in OVS Nov 28 05:03:59 localhost nova_compute[279673]: 2025-11-28 10:03:59.135 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:59 localhost nova_compute[279673]: 2025-11-28 10:03:59.137 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:59 localhost ovn_controller[152322]: 2025-11-28T10:03:59Z|00249|binding|INFO|Setting lport d896fe01-471a-407a-a0b3-6d40883262a8 up in Southbound Nov 28 05:03:59 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:59.143 158130 DEBUG 
ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d896fe01-471a-407a-a0b3-6d40883262a8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:03:59 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:59.145 158130 INFO neutron.agent.ovn.metadata.agent [-] Port d896fe01-471a-407a-a0b3-6d40883262a8 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis#033[00m Nov 28 05:03:59 localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:59.146 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:03:59 
localhost ovn_metadata_agent[158125]: 2025-11-28 10:03:59.148 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[b9bce8c2-bb80-434b-bf2d-6d67eac7f330]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:03:59 localhost journal[227875]: ethtool ioctl error on tapd896fe01-47: No such device Nov 28 05:03:59 localhost nova_compute[279673]: 2025-11-28 10:03:59.158 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:59 localhost journal[227875]: ethtool ioctl error on tapd896fe01-47: No such device Nov 28 05:03:59 localhost journal[227875]: ethtool ioctl error on tapd896fe01-47: No such device Nov 28 05:03:59 localhost journal[227875]: ethtool ioctl error on tapd896fe01-47: No such device Nov 28 05:03:59 localhost journal[227875]: ethtool ioctl error on tapd896fe01-47: No such device Nov 28 05:03:59 localhost journal[227875]: ethtool ioctl error on tapd896fe01-47: No such device Nov 28 05:03:59 localhost journal[227875]: ethtool ioctl error on tapd896fe01-47: No such device Nov 28 05:03:59 localhost journal[227875]: ethtool ioctl error on tapd896fe01-47: No such device Nov 28 05:03:59 localhost nova_compute[279673]: 2025-11-28 10:03:59.211 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:59 localhost nova_compute[279673]: 2025-11-28 10:03:59.245 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:59 localhost systemd[1]: var-lib-containers-storage-overlay-1c9bf28eca558c2a4e7a8c7f2646cc9ee0c26530bb499ee82f943e5f60e7841a-merged.mount: Deactivated successfully. 
Nov 28 05:03:59 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d9ddaa0a45813927e74b018780caae1ea49f79d4cd65767e110fa6166261c423-userdata-shm.mount: Deactivated successfully. Nov 28 05:03:59 localhost ovn_controller[152322]: 2025-11-28T10:03:59Z|00250|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:03:59 localhost nova_compute[279673]: 2025-11-28 10:03:59.607 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:03:59 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:04:00 localhost podman[315848]: Nov 28 05:04:00 localhost podman[315848]: 2025-11-28 10:04:00.057347735 +0000 UTC m=+0.095760266 container create d36fc74848dd5752738fa562ae7718338f734f35475b5ceed56e46fe22e5e32e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125) Nov 28 05:04:00 localhost systemd[1]: Started libpod-conmon-d36fc74848dd5752738fa562ae7718338f734f35475b5ceed56e46fe22e5e32e.scope. Nov 28 05:04:00 localhost podman[315848]: 2025-11-28 10:04:00.012581682 +0000 UTC m=+0.050994303 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:04:00 localhost systemd[1]: Started libcrun container. 
Nov 28 05:04:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55257e7db66b5e9d4238db462fc123dc4198a1b88b4403abed55e4b023628632/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 05:04:00 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e142 do_prune osdmap full prune enabled
Nov 28 05:04:00 localhost podman[315848]: 2025-11-28 10:04:00.134219357 +0000 UTC m=+0.172631878 container init d36fc74848dd5752738fa562ae7718338f734f35475b5ceed56e46fe22e5e32e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 05:04:00 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e143 e143: 6 total, 6 up, 6 in
Nov 28 05:04:00 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e143: 6 total, 6 up, 6 in
Nov 28 05:04:00 localhost podman[315848]: 2025-11-28 10:04:00.148063464 +0000 UTC m=+0.186475995 container start d36fc74848dd5752738fa562ae7718338f734f35475b5ceed56e46fe22e5e32e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 28 05:04:00 localhost dnsmasq[315866]: started, version 2.85 cachesize 150
Nov 28 05:04:00 localhost dnsmasq[315866]: DNS service limited to local subnets
Nov 28 05:04:00 localhost dnsmasq[315866]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 05:04:00 localhost dnsmasq[315866]: warning: no upstream servers configured
Nov 28 05:04:00 localhost dnsmasq[315866]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 05:04:00 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:00.220 261084 INFO neutron.agent.dhcp.agent [-] Finished network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 dhcp configuration#033[00m
Nov 28 05:04:00 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:00.220 261084 INFO neutron.agent.dhcp.agent [None req-3a4c8826-6eeb-4084-b6e5-81cd4ed1d4d8 - - - - - -] Synchronizing state complete#033[00m
Nov 28 05:04:00 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:00.337 261084 INFO neutron.agent.dhcp.agent [None req-d5e0c524-6fbe-4131-9826-21f5df2a33b2 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', 'd896fe01-471a-407a-a0b3-6d40883262a8'} is completed#033[00m
Nov 28 05:04:00 localhost nova_compute[279673]: 2025-11-28 10:04:00.431 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:04:00 localhost podman[315884]: 2025-11-28 10:04:00.539595053 +0000 UTC m=+0.065324264 container kill d36fc74848dd5752738fa562ae7718338f734f35475b5ceed56e46fe22e5e32e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 05:04:00 localhost dnsmasq[315866]: exiting on receipt of SIGTERM
Nov 28 05:04:00 localhost systemd[1]: libpod-d36fc74848dd5752738fa562ae7718338f734f35475b5ceed56e46fe22e5e32e.scope: Deactivated successfully.
Nov 28 05:04:00 localhost podman[315899]: 2025-11-28 10:04:00.602868655 +0000 UTC m=+0.047495732 container died d36fc74848dd5752738fa562ae7718338f734f35475b5ceed56e46fe22e5e32e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 05:04:00 localhost systemd[1]: tmp-crun.6yvhyr.mount: Deactivated successfully.
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.675 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.676 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.680 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a9a7fb02-edf9-4d21-8122-a7ee5a7b6d25', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:04:00.677004', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '8fd9661c-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.848925512, 'message_signature': '1877d22c9c5110be366247cc412b88083af02b2e85742d24b0296ca4a4c59229'}]}, 'timestamp': '2025-11-28 10:04:00.681826', '_unique_id': '3097198d14c84247b55de72e90e5d796'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.683 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.684 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 05:04:00 localhost podman[315899]: 2025-11-28 10:04:00.695768097 +0000 UTC m=+0.140395114 container cleanup d36fc74848dd5752738fa562ae7718338f734f35475b5ceed56e46fe22e5e32e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 28 05:04:00 localhost systemd[1]: libpod-conmon-d36fc74848dd5752738fa562ae7718338f734f35475b5ceed56e46fe22e5e32e.scope: Deactivated successfully.
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.717 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.718 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '363a2971-e35a-4845-9836-48f56dd4fd03', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:04:00.684962', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8fdf02d4-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.85688868, 'message_signature': '6b0f7f773c3376caa1ff08a85f907c5f79c4f259a7c3e43ca125b8bb32c1c3c1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:04:00.684962', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8fdf19ea-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.85688868, 'message_signature': 'dd4b1ed66d82d7140c85fdd6b5a996a0f66ffb93e8ca7880509fb944817e7940'}]}, 'timestamp': '2025-11-28 10:04:00.719223', '_unique_id': '6298c8f2775a4e75938624a7eae42d40'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:04:00 localhost systemd-journald[47227]: Data hash table of /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal has a fill level at 75.0 (53724 of 71630 items, 25165824 file size, 468 bytes per hash table item), suggesting rotation.
Nov 28 05:04:00 localhost systemd-journald[47227]: /run/log/journal/5cd59ba25ae47acac865224fa46a5f9e/system.journal: Journal header limits reached or header out-of-date, rotating.
Nov 28 05:04:00 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.721 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.722 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.722 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1439344424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.723 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 63315481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70676303-5427-4d85-b912-b58f8a549b98', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1439344424, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:04:00.722465', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8fdfb288-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.85688868, 'message_signature': '5231a6837f804074d4cd46742a8d82b4e20d6f1ccdaa339be50745cf9e996b9d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63315481, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:04:00.722465', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id':
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8fdfcea8-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.85688868, 'message_signature': 'a43604e8d0c15024d3ffd891e86adf51e2729b4e7f6210327d09a6d4705f8adb'}]}, 'timestamp': '2025-11-28 10:04:00.723821', '_unique_id': '902025e17c9a484cb88f2da95bcf8ad5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:04:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.725 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.726 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.727 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:04:00 localhost podman[315900]: 2025-11-28 10:04:00.729119213 +0000 UTC m=+0.163902417 container remove d36fc74848dd5752738fa562ae7718338f734f35475b5ceed56e46fe22e5e32e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5357b61f-2aba-4c9a-a601-c78fea703e83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:04:00.726997', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 
'tap09612b07-51'}, 'message_id': '8fe06926-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.848925512, 'message_signature': '80b91536985e85aa7ba4f0c0f0e4ec951e64142dadd95aeea0af86bf5e5cc122'}]}, 'timestamp': '2025-11-28 10:04:00.727817', '_unique_id': '37278e6f4e7947b3888d89ebac401af4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:04:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.729 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.731 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.731 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '84f436f4-4b70-4535-b0a9-f469c7858ed6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:04:00.731317', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '8fe10b56-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.848925512, 'message_signature': '1cdb739a9b55d10c9d33194449093bff1d8554c1f2aa2f38fdddfe6609402eb5'}]}, 'timestamp': '2025-11-28 10:04:00.731976', '_unique_id': '22882a474a334cb3b0cd404b65df046e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:04:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.733 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.734 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.735 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4124fd88-1bdf-4891-a4b9-7937249997cc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:04:00.735104', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '8fe1a138-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.848925512, 'message_signature': '80dfba672449ed38730979380c745807ba237b3d8abe7ba912e73a539936527e'}]}, 'timestamp': '2025-11-28 10:04:00.735825', '_unique_id': '6bb436499f86421e91e68c80057f6ede'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:04:00.737 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:04:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:04:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.737 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.738 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 28 05:04:00 localhost kernel: device tapd896fe01-47 left promiscuous mode Nov 28 05:04:00 localhost ovn_controller[152322]: 2025-11-28T10:04:00Z|00251|binding|INFO|Releasing lport d896fe01-471a-407a-a0b3-6d40883262a8 from this chassis (sb_readonly=0) Nov 28 05:04:00 localhost ovn_controller[152322]: 2025-11-28T10:04:00Z|00252|binding|INFO|Setting lport d896fe01-471a-407a-a0b3-6d40883262a8 down in Southbound Nov 28 05:04:00 localhost nova_compute[279673]: 2025-11-28 10:04:00.762 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.767 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 16370000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '79354169-df3a-4cb4-948a-a01999f270d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 16370000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:04:00.739071', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '8fe68afe-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.939135017, 'message_signature': 'c897e1db75163101dca09f0131d8f45e9af7bd2efbda9bf3ee30c88828bfac30'}]}, 'timestamp': '2025-11-28 10:04:00.767918', '_unique_id': '62feecdfb042478292f2ea6e457da144'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 
05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:04:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.769 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.770 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle 
poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.770 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.770 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 51.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8e46437-9cb6-46dc-9737-01ac83740a3d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.671875, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:04:00.770637', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 
0}, 'message_id': '8fe708c6-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.939135017, 'message_signature': '6e8577687f2d022931bfbde3fe5ea703e1b1182027f7b554bb8829d688b97a60'}]}, 'timestamp': '2025-11-28 10:04:00.771163', '_unique_id': '83a07f879d0a4038b81825f51c553fe9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 
28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: 
[Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in 
_reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.772 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.774 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.774 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a5465045-f2fd-425a-835b-0120814fe254', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:04:00.774359', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '8fe79af2-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.848925512, 'message_signature': '54e4d09a84a1721f21976f40a4b05f4fe00e192367c13c552b5bfacb5d68d4d8'}]}, 'timestamp': '2025-11-28 10:04:00.774868', '_unique_id': '8aa8fe84b23c4108a3045ad3cf14d694'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:04:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.776 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.777 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.778 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:04:00 localhost nova_compute[279673]: 2025-11-28 10:04:00.778 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:00.776 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d896fe01-471a-407a-a0b3-6d40883262a8) 
old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:04:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:00.778 158130 INFO neutron.agent.ovn.metadata.agent [-] Port d896fe01-471a-407a-a0b3-6d40883262a8 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis#033[00m Nov 28 05:04:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:00.780 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:04:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:00.781 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[dfae5901-c08f-4a35-a21f-b7d8315dff63]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '10e4046a-ccc1-456b-8981-05f848eeef3c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:04:00.777969', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '8fe82ec2-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.848925512, 'message_signature': '4fa842ae6fc2818dfa5d3600652b74f88d39d4c3cfadfe4525a882b02391e6f7'}]}, 'timestamp': '2025-11-28 10:04:00.779410', '_unique_id': '3387accc91594ad7bf1a7bf9fa3b716a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:04:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.784 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.785 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.785 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aee721c0-f0fb-4b6f-9582-4242c6bbb5bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:04:00.785397', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '8fe948e8-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.848925512, 'message_signature': 'dc887fe0818e1d411378a3f36dde74ac33e6828f4d15abe1fd385ebc1c3ef728'}]}, 'timestamp': '2025-11-28 10:04:00.785912', '_unique_id': '2118d6b9c994488ab1fcdc14041abc40'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:04:00.786 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:04:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:04:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.786 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.788 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.788 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.789 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '19e7d889-92ed-402f-bf12-4973b1731198', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:04:00.788598', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8fe9c5ac-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.85688868, 'message_signature': 'e37d6d61a8e96f7528284c7126c17c07656f82e3371e9802777e18a00db2d56a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:04:00.788598', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 
'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8fe9d8c6-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.85688868, 'message_signature': '1dec9adff7bfb3927cc4c68ada0f2cf7d36574414001a01df5ba90db559e9653'}]}, 'timestamp': '2025-11-28 10:04:00.789532', '_unique_id': 'de37c780bee94a73a02bf0ab621de784'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:04:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:04:00.790 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:04:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:04:00.790 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.790 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.791 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.803 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.804 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 
12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5522ee05-e427-4d83-8050-65041aa55054', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:04:00.791826', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8fec0f88-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.963740072, 'message_signature': '062acaadb13cdceea3a1d3a62eb11375d3ff198ac403b80dc942ccdeef84004a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:04:00.791826', 'resource_metadata': {'display_name': 'test', 
'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8fec23a6-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.963740072, 'message_signature': 'f5b786fecdc738ee62e2a7c3f07f544db7ab9438c4d4434d000ed023b8e75496'}]}, 'timestamp': '2025-11-28 10:04:00.804555', '_unique_id': '47979b40213b41679ea1dc43571ca7bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:04:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 
129, in connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:04:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.805 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.806 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.806 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.807 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:04:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.807 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e3b57731-e209-467f-863d-7d22cfc2deb7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:04:00.806959', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8fec946c-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.963740072, 'message_signature': 'aabcbcc3434eb3affc00fd02d11d72bf27f7e95c3267e85f193d22accc0bf80a'}, {'source': 'openstack', 'counter_name': 
'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:04:00.806959', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8feca4f2-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.963740072, 'message_signature': '9f6ffd287a43201de9822ea467198cdec02387c083a34017e7308812391198b6'}]}, 'timestamp': '2025-11-28 10:04:00.807864', '_unique_id': '15b609f400b549c2ae0c0019905b8d43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:04:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:04:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.808 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.810 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.810 12 DEBUG ceilometer.compute.pollsters [-] 
c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.810 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '86f86b47-eac6-474f-a60c-926a4ac9dedf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:04:00.810206', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 
'8fed11a8-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.963740072, 'message_signature': 'b0693e9a70bdcdd4d2332a735f7c9ff62c7a1ca8e4a8cb3ec749aced433df470'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:04:00.810206', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8fed2224-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.963740072, 'message_signature': '5946666589e7e23baeee209357572477219416bac03665e08fe74bed8c7e8ba9'}]}, 'timestamp': '2025-11-28 10:04:00.811092', '_unique_id': '775e15185f824008bf014eabc5d1d629'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 
05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:04:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.812 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.813 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 28 05:04:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.813 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 1124743927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.813 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 22218411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0760e97e-f176-4572-af3c-4e23df721124', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1124743927, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:04:00.813291', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8fed8a34-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.85688868, 'message_signature': '3f712eb541707a648ce63a9b3b9bc2ff679f975bf62061ed35dc3601ff30f9d5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22218411, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:04:00.813291', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8fed9b82-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.85688868, 'message_signature': '419c2fce2df44c9e01f295afe778f863994ed6cec222ea10fd5bff9d82a5fe9b'}]}, 'timestamp': '2025-11-28 10:04:00.814207', '_unique_id': 'ddf308707e9e407c8f1da8e4af5d6d00'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in 
establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:04:00.815 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.815 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.816 12 INFO 
ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.816 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5497ddf7-892a-475b-acad-655bf92e3193', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:04:00.816390', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 
'vnic_name': 'tap09612b07-51'}, 'message_id': '8fee0360-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.848925512, 'message_signature': '539917ea0916d2bfdc7c1cd38c329e73024f228fc11cfa1adf90bca0e4f6aa41'}]}, 'timestamp': '2025-11-28 10:04:00.816863', '_unique_id': '3fd8a4f64a8d476cb8b9bb59aed5119f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.817 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.818 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.819 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.819 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ebdca55d-1c50-4ec7-8381-96b0694729ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:04:00.819069', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8fee6c74-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.85688868, 'message_signature': '6ea4e1ce75cf15e6f315923863817910dc781c35e7ea6b3e473900f66a6fd9f9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:04:00.819069', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8fee7c8c-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.85688868, 'message_signature': '41da1ac54a8a4886a33fbd7953f8819efa9d55a746425210ca87f311601d10ac'}]}, 'timestamp': '2025-11-28 10:04:00.819931', '_unique_id': 'd4c4a15aaa2c41cc9cd11da130e1d8ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.820 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.822 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.822 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.822 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4fea83b9-992d-4b52-98b3-5f82462e73c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:04:00.822477', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8feef0f4-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.85688868, 'message_signature': 'e8dd903a934b37e879d08440d25527993886e90735523d53844c504eb82456d6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:04:00.822477', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8fef02ce-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.85688868, 'message_signature': 'a69b54a75edfeb5e6491620d4811ed047a695ae57f0242965e8e371abfedd5c2'}]}, 'timestamp': '2025-11-28 10:04:00.823371', '_unique_id': 'bac0904ac8ee4a599af8ca2faa22feaf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.824 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.825 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.825 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.825 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.825 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fab16d33-0576-4330-891a-b680d60664a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:04:00.825884', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '8fef779a-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.848925512, 'message_signature': '1b5661850848126765fd207a37bb1a2b9a2b30c306d90cda8aa3040531a08ae7'}]}, 'timestamp': '2025-11-28 10:04:00.826390', '_unique_id': '37e3d50f88974acf986a113d2ae56734'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111]
Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in 
_reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.827 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a51bf0bd-83ae-4609-979d-4aea874d7513', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:04:00.827818', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '8fefbe3a-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12074.848925512, 'message_signature': '0bfcbeacadc86f14b1e0f3cde4764fc0c063589c31f488272861c565fde55d28'}]}, 'timestamp': '2025-11-28 10:04:00.828151', '_unique_id': '901b04dc751f49f3a43be5583c55a264'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:04:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:04:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:04:00.828 12 ERROR oslo_messaging.notify.messaging Nov 28 05:04:00 localhost rsyslogd[759]: 
imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Nov 28 05:04:00 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:00.953 261084 INFO neutron.agent.dhcp.agent [None req-1bf1e86e-c455-4eb0-8661-53ccfd3c0aa4 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:04:00 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:00.954 261084 INFO neutron.agent.dhcp.agent [None req-1bf1e86e-c455-4eb0-8661-53ccfd3c0aa4 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:04:00 localhost sshd[315928]: main: sshd: ssh-rsa algorithm is disabled Nov 28 05:04:01 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e143 do_prune osdmap full prune enabled Nov 28 05:04:01 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e144 e144: 6 total, 6 up, 6 in Nov 28 05:04:01 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e144: 6 total, 6 up, 6 in Nov 28 05:04:01 localhost systemd[1]: var-lib-containers-storage-overlay-55257e7db66b5e9d4238db462fc123dc4198a1b88b4403abed55e4b023628632-merged.mount: Deactivated successfully. Nov 28 05:04:01 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d36fc74848dd5752738fa562ae7718338f734f35475b5ceed56e46fe22e5e32e-userdata-shm.mount: Deactivated successfully. Nov 28 05:04:01 localhost systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully. 
Nov 28 05:04:03 localhost nova_compute[279673]: 2025-11-28 10:04:03.179 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:03.970 261084 INFO neutron.agent.linux.ip_lib [None req-4fae51f1-3f1d-4d2d-a9ed-5509cf71f4c9 - - - - - -] Device tap5fa5dbd0-88 cannot be used as it has no MAC address#033[00m Nov 28 05:04:03 localhost nova_compute[279673]: 2025-11-28 10:04:03.994 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:04 localhost kernel: device tap5fa5dbd0-88 entered promiscuous mode Nov 28 05:04:04 localhost nova_compute[279673]: 2025-11-28 10:04:04.002 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:04 localhost ovn_controller[152322]: 2025-11-28T10:04:04Z|00253|binding|INFO|Claiming lport 5fa5dbd0-889e-4133-9ba4-6f3810999535 for this chassis. Nov 28 05:04:04 localhost ovn_controller[152322]: 2025-11-28T10:04:04Z|00254|binding|INFO|5fa5dbd0-889e-4133-9ba4-6f3810999535: Claiming unknown Nov 28 05:04:04 localhost NetworkManager[5967]: [1764324244.0038] manager: (tap5fa5dbd0-88): new Generic device (/org/freedesktop/NetworkManager/Devices/44) Nov 28 05:04:04 localhost systemd-udevd[315939]: Network interface NamePolicy= disabled on kernel command line. 
Nov 28 05:04:04 localhost ovn_controller[152322]: 2025-11-28T10:04:04Z|00255|binding|INFO|Setting lport 5fa5dbd0-889e-4133-9ba4-6f3810999535 ovn-installed in OVS Nov 28 05:04:04 localhost nova_compute[279673]: 2025-11-28 10:04:04.015 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:04 localhost journal[227875]: ethtool ioctl error on tap5fa5dbd0-88: No such device Nov 28 05:04:04 localhost nova_compute[279673]: 2025-11-28 10:04:04.035 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:04 localhost journal[227875]: ethtool ioctl error on tap5fa5dbd0-88: No such device Nov 28 05:04:04 localhost nova_compute[279673]: 2025-11-28 10:04:04.041 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:04 localhost journal[227875]: ethtool ioctl error on tap5fa5dbd0-88: No such device Nov 28 05:04:04 localhost journal[227875]: ethtool ioctl error on tap5fa5dbd0-88: No such device Nov 28 05:04:04 localhost journal[227875]: ethtool ioctl error on tap5fa5dbd0-88: No such device Nov 28 05:04:04 localhost journal[227875]: ethtool ioctl error on tap5fa5dbd0-88: No such device Nov 28 05:04:04 localhost journal[227875]: ethtool ioctl error on tap5fa5dbd0-88: No such device Nov 28 05:04:04 localhost journal[227875]: ethtool ioctl error on tap5fa5dbd0-88: No such device Nov 28 05:04:04 localhost nova_compute[279673]: 2025-11-28 10:04:04.079 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:04 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:04.106 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', 
conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe5a:389/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=5fa5dbd0-889e-4133-9ba4-6f3810999535) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:04:04 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:04.108 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 5fa5dbd0-889e-4133-9ba4-6f3810999535 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis#033[00m Nov 28 05:04:04 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:04.111 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 3fb0e974-4428-459d-a213-c32931747eec IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 28 05:04:04 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:04.111 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for 
network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:04:04 localhost nova_compute[279673]: 2025-11-28 10:04:04.113 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:04 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:04.112 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[d682f522-f29b-4622-8e70-70d3419d2988]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:04:04 localhost ovn_controller[152322]: 2025-11-28T10:04:04Z|00256|binding|INFO|Setting lport 5fa5dbd0-889e-4133-9ba4-6f3810999535 up in Southbound Nov 28 05:04:04 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:04.965 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:04Z, description=, device_id=d3e54e1b-bd42-4183-a535-29739c2a7728, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=7a77d608-8675-459d-a174-8748cdb9bb12, ip_allocation=immediate, mac_address=fa:16:3e:39:10:08, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:54Z, description=, dns_domain=, id=a8ba88fa-1415-4e6a-82ae-4f41cdd912f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroupTestJSON-775491249-network, port_security_enabled=True, project_id=6b11f04c88dd4db3aa7f405d125f76a4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=10427, qos_policy_id=None, revision_number=2, router:external=False, shared=False, 
standard_attr_id=1467, status=ACTIVE, subnets=['6b4c8b66-460b-444a-9b80-e4ad139c626d'], tags=[], tenant_id=6b11f04c88dd4db3aa7f405d125f76a4, updated_at=2025-11-28T10:03:56Z, vlan_transparent=None, network_id=a8ba88fa-1415-4e6a-82ae-4f41cdd912f1, port_security_enabled=False, project_id=6b11f04c88dd4db3aa7f405d125f76a4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1509, status=DOWN, tags=[], tenant_id=6b11f04c88dd4db3aa7f405d125f76a4, updated_at=2025-11-28T10:04:04Z on network a8ba88fa-1415-4e6a-82ae-4f41cdd912f1#033[00m Nov 28 05:04:04 localhost podman[316010]: Nov 28 05:04:04 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:04:04 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e144 do_prune osdmap full prune enabled Nov 28 05:04:04 localhost podman[316010]: 2025-11-28 10:04:04.995224774 +0000 UTC m=+0.102979142 container create 23956af9f313c65c4c605801daf5486c487e2e32aa6889c7ee0e355eb6ba6170 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0) Nov 28 05:04:05 localhost podman[316010]: 2025-11-28 10:04:04.941635189 +0000 UTC m=+0.049389637 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:04:05 localhost systemd[1]: Started libpod-conmon-23956af9f313c65c4c605801daf5486c487e2e32aa6889c7ee0e355eb6ba6170.scope. 
Nov 28 05:04:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:05.060 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2 2001:db8::f816:3eff:fef8:ad87'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ef9eb238-2b1e-49f7-8a0f-72efc8854e0f) old=Port_Binding(mac=['fa:16:3e:f8:ad:87 2001:db8::f816:3eff:fef8:ad87'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) 
matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:04:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:05.062 158130 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ef9eb238-2b1e-49f7-8a0f-72efc8854e0f in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 updated#033[00m Nov 28 05:04:05 localhost systemd[1]: Started libcrun container. Nov 28 05:04:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:05.065 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 3fb0e974-4428-459d-a213-c32931747eec IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 28 05:04:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:05.066 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:04:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:05.067 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[6ea454cf-d354-42d8-9977-580570a5d945]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:04:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aac4355a69b01ddb27ef64e3ac236aecb4f69adf3422aae8cccf0b7119eb8998/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:04:05 localhost podman[316010]: 2025-11-28 10:04:05.080725514 +0000 UTC m=+0.188479882 container init 23956af9f313c65c4c605801daf5486c487e2e32aa6889c7ee0e355eb6ba6170 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, 
org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:04:05 localhost podman[316010]: 2025-11-28 10:04:05.090291398 +0000 UTC m=+0.198045756 container start 23956af9f313c65c4c605801daf5486c487e2e32aa6889c7ee0e355eb6ba6170 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Nov 28 05:04:05 localhost dnsmasq[316038]: started, version 2.85 cachesize 150 Nov 28 05:04:05 localhost dnsmasq[316038]: DNS service limited to local subnets Nov 28 05:04:05 localhost dnsmasq[316038]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:04:05 localhost dnsmasq[316038]: warning: no upstream servers configured Nov 28 05:04:05 localhost dnsmasq[316038]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:04:05 localhost podman[316045]: 2025-11-28 10:04:05.226129811 +0000 UTC m=+0.047014378 container kill 3b72d8de6aa8c764c17fd7a82828490581c31b022f355ca771ab13d3fa6d8e0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a8ba88fa-1415-4e6a-82ae-4f41cdd912f1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, 
tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125) Nov 28 05:04:05 localhost dnsmasq[315715]: read /var/lib/neutron/dhcp/a8ba88fa-1415-4e6a-82ae-4f41cdd912f1/addn_hosts - 1 addresses Nov 28 05:04:05 localhost dnsmasq-dhcp[315715]: read /var/lib/neutron/dhcp/a8ba88fa-1415-4e6a-82ae-4f41cdd912f1/host Nov 28 05:04:05 localhost dnsmasq-dhcp[315715]: read /var/lib/neutron/dhcp/a8ba88fa-1415-4e6a-82ae-4f41cdd912f1/opts Nov 28 05:04:05 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:05.253 261084 INFO neutron.agent.dhcp.agent [None req-6e76fc09-aae3-4c05-a37b-9cb66dcf8ec4 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f'} is completed#033[00m Nov 28 05:04:05 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e145 e145: 6 total, 6 up, 6 in Nov 28 05:04:05 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e145: 6 total, 6 up, 6 in Nov 28 05:04:05 localhost nova_compute[279673]: 2025-11-28 10:04:05.472 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:05 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:05.504 261084 INFO neutron.agent.dhcp.agent [None req-f08d02d1-3b77-4ea9-9ab4-c899e7c4845b - - - - - -] DHCP configuration for ports {'7a77d608-8675-459d-a174-8748cdb9bb12'} is completed#033[00m Nov 28 05:04:05 localhost dnsmasq[316038]: exiting on receipt of SIGTERM Nov 28 05:04:05 localhost podman[316083]: 2025-11-28 10:04:05.517903721 +0000 UTC m=+0.097760512 container kill 23956af9f313c65c4c605801daf5486c487e2e32aa6889c7ee0e355eb6ba6170 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3) Nov 28 05:04:05 localhost systemd[1]: libpod-23956af9f313c65c4c605801daf5486c487e2e32aa6889c7ee0e355eb6ba6170.scope: Deactivated successfully. Nov 28 05:04:05 localhost podman[316098]: 2025-11-28 10:04:05.579775434 +0000 UTC m=+0.041053388 container died 23956af9f313c65c4c605801daf5486c487e2e32aa6889c7ee0e355eb6ba6170 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:04:05 localhost podman[316098]: 2025-11-28 10:04:05.633763331 +0000 UTC m=+0.095041265 container remove 23956af9f313c65c4c605801daf5486c487e2e32aa6889c7ee0e355eb6ba6170 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:04:05 localhost systemd[1]: libpod-conmon-23956af9f313c65c4c605801daf5486c487e2e32aa6889c7ee0e355eb6ba6170.scope: Deactivated successfully. 
Nov 28 05:04:06 localhost systemd[1]: var-lib-containers-storage-overlay-aac4355a69b01ddb27ef64e3ac236aecb4f69adf3422aae8cccf0b7119eb8998-merged.mount: Deactivated successfully. Nov 28 05:04:06 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-23956af9f313c65c4c605801daf5486c487e2e32aa6889c7ee0e355eb6ba6170-userdata-shm.mount: Deactivated successfully. Nov 28 05:04:07 localhost podman[316174]: Nov 28 05:04:07 localhost podman[316174]: 2025-11-28 10:04:07.088254087 +0000 UTC m=+0.092321697 container create 7632c1dc86931ee2caa45e50e9b1c1f3d05bfe20f60b4172767395dfd1c01e44 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 28 05:04:07 localhost systemd[1]: Started libpod-conmon-7632c1dc86931ee2caa45e50e9b1c1f3d05bfe20f60b4172767395dfd1c01e44.scope. Nov 28 05:04:07 localhost systemd[1]: Started libcrun container. 
Nov 28 05:04:07 localhost podman[316174]: 2025-11-28 10:04:07.042399193 +0000 UTC m=+0.046466823 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:04:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2e9ba0aa3684c459cfc81f6cc61b5ee805300d4fb3d8ef243c9434b8e324e32f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:04:07 localhost podman[316174]: 2025-11-28 10:04:07.153873438 +0000 UTC m=+0.157941038 container init 7632c1dc86931ee2caa45e50e9b1c1f3d05bfe20f60b4172767395dfd1c01e44 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Nov 28 05:04:07 localhost podman[316174]: 2025-11-28 10:04:07.164705768 +0000 UTC m=+0.168773388 container start 7632c1dc86931ee2caa45e50e9b1c1f3d05bfe20f60b4172767395dfd1c01e44 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:04:07 localhost dnsmasq[316192]: started, version 2.85 cachesize 150 Nov 28 05:04:07 localhost dnsmasq[316192]: DNS service limited to local subnets Nov 28 05:04:07 localhost dnsmasq[316192]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:04:07 localhost dnsmasq[316192]: warning: no upstream servers configured Nov 28 05:04:07 localhost dnsmasq-dhcp[316192]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 28 05:04:07 localhost dnsmasq[316192]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:04:07 localhost dnsmasq-dhcp[316192]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:04:07 localhost dnsmasq-dhcp[316192]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:04:07 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:07.228 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:04Z, description=, device_id=d3e54e1b-bd42-4183-a535-29739c2a7728, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=7a77d608-8675-459d-a174-8748cdb9bb12, ip_allocation=immediate, mac_address=fa:16:3e:39:10:08, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:54Z, description=, dns_domain=, id=a8ba88fa-1415-4e6a-82ae-4f41cdd912f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroupTestJSON-775491249-network, port_security_enabled=True, project_id=6b11f04c88dd4db3aa7f405d125f76a4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=10427, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1467, status=ACTIVE, subnets=['6b4c8b66-460b-444a-9b80-e4ad139c626d'], tags=[], tenant_id=6b11f04c88dd4db3aa7f405d125f76a4, 
updated_at=2025-11-28T10:03:56Z, vlan_transparent=None, network_id=a8ba88fa-1415-4e6a-82ae-4f41cdd912f1, port_security_enabled=False, project_id=6b11f04c88dd4db3aa7f405d125f76a4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1509, status=DOWN, tags=[], tenant_id=6b11f04c88dd4db3aa7f405d125f76a4, updated_at=2025-11-28T10:04:04Z on network a8ba88fa-1415-4e6a-82ae-4f41cdd912f1#033[00m Nov 28 05:04:07 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e145 do_prune osdmap full prune enabled Nov 28 05:04:07 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e146 e146: 6 total, 6 up, 6 in Nov 28 05:04:07 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e146: 6 total, 6 up, 6 in Nov 28 05:04:07 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:07.374 261084 INFO neutron.agent.dhcp.agent [None req-85d6ca77-7fb6-4a0a-93e2-f999f3b259aa - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '5fa5dbd0-889e-4133-9ba4-6f3810999535'} is completed#033[00m Nov 28 05:04:07 localhost neutron_sriov_agent[254147]: 2025-11-28 10:04:07.425 2 INFO neutron.agent.securitygroups_rpc [None req-f8d4b801-af07-4edf-8fd0-12384366c126 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:04:07 localhost podman[316209]: 2025-11-28 10:04:07.458379702 +0000 UTC m=+0.063713676 container kill 3b72d8de6aa8c764c17fd7a82828490581c31b022f355ca771ab13d3fa6d8e0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a8ba88fa-1415-4e6a-82ae-4f41cdd912f1, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true) Nov 28 05:04:07 localhost dnsmasq[315715]: read /var/lib/neutron/dhcp/a8ba88fa-1415-4e6a-82ae-4f41cdd912f1/addn_hosts - 1 addresses Nov 28 05:04:07 localhost dnsmasq-dhcp[315715]: read /var/lib/neutron/dhcp/a8ba88fa-1415-4e6a-82ae-4f41cdd912f1/host Nov 28 05:04:07 localhost dnsmasq-dhcp[315715]: read /var/lib/neutron/dhcp/a8ba88fa-1415-4e6a-82ae-4f41cdd912f1/opts Nov 28 05:04:07 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:07.538 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:06Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=8ea63c52-cb39-4f5e-9330-64cd47bf74a1, ip_allocation=immediate, mac_address=fa:16:3e:06:3f:d1, name=tempest-NetworksTestDHCPv6-151675552, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=23, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['24e3dfe7-e849-45dc-a899-cccb948411be', '82448f9b-7246-4f52-8ff3-d912fa5f48b9'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:04:03Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, 
project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], standard_attr_id=1511, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:04:07Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1#033[00m Nov 28 05:04:07 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:07.722 261084 INFO neutron.agent.dhcp.agent [None req-e69ac829-3f70-44eb-8227-a20f5e2fce20 - - - - - -] DHCP configuration for ports {'7a77d608-8675-459d-a174-8748cdb9bb12'} is completed#033[00m Nov 28 05:04:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 05:04:07 localhost podman[316248]: 2025-11-28 10:04:07.764793233 +0000 UTC m=+0.058324033 container kill 7632c1dc86931ee2caa45e50e9b1c1f3d05bfe20f60b4172767395dfd1c01e44 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 28 05:04:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. 
Nov 28 05:04:07 localhost dnsmasq[316192]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 2 addresses Nov 28 05:04:07 localhost dnsmasq-dhcp[316192]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:04:07 localhost dnsmasq-dhcp[316192]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:04:07 localhost podman[316263]: 2025-11-28 10:04:07.870649256 +0000 UTC m=+0.093551162 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 28 05:04:07 localhost podman[316263]: 2025-11-28 10:04:07.883442112 +0000 UTC m=+0.106343988 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.build-date=20251125) Nov 28 05:04:07 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. Nov 28 05:04:07 localhost podman[316262]: 2025-11-28 10:04:07.969146418 +0000 UTC m=+0.195083861 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 05:04:07 localhost podman[316262]: 2025-11-28 10:04:07.976819708 +0000 UTC m=+0.202757221 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 05:04:07 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 05:04:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:08.076 261084 INFO neutron.agent.dhcp.agent [None req-4bf49508-5b67-4b74-9817-81b475360309 - - - - - -] DHCP configuration for ports {'8ea63c52-cb39-4f5e-9330-64cd47bf74a1'} is completed#033[00m
Nov 28 05:04:09 localhost neutron_sriov_agent[254147]: 2025-11-28 10:04:09.207 2 INFO neutron.agent.securitygroups_rpc [None req-f248108c-11ce-43fd-804f-455d486d1048 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m
Nov 28 05:04:09 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e146 do_prune osdmap full prune enabled
Nov 28 05:04:09 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e147 e147: 6 total, 6 up, 6 in
Nov 28 05:04:09 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e147: 6 total, 6 up, 6 in
Nov 28 05:04:09 localhost dnsmasq[316192]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 05:04:09 localhost podman[316330]: 2025-11-28 10:04:09.526929105 +0000 UTC m=+0.061403091 container kill 7632c1dc86931ee2caa45e50e9b1c1f3d05bfe20f60b4172767395dfd1c01e44 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 05:04:09 localhost dnsmasq-dhcp[316192]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 05:04:09 localhost dnsmasq-dhcp[316192]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 05:04:09 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 05:04:10 localhost podman[238687]: time="2025-11-28T10:04:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 05:04:10 localhost podman[238687]: @ - - [28/Nov/2025:10:04:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159335 "" "Go-http-client/1.1"
Nov 28 05:04:10 localhost podman[238687]: @ - - [28/Nov/2025:10:04:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20201 "" "Go-http-client/1.1"
Nov 28 05:04:10 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 05:04:10 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3798834837' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 05:04:10 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 05:04:10 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3798834837' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 05:04:10 localhost nova_compute[279673]: 2025-11-28 10:04:10.474 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 05:04:10 localhost dnsmasq[316192]: exiting on receipt of SIGTERM
Nov 28 05:04:10 localhost podman[316368]: 2025-11-28 10:04:10.940944532 +0000 UTC m=+0.068155835 container kill 7632c1dc86931ee2caa45e50e9b1c1f3d05bfe20f60b4172767395dfd1c01e44 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 05:04:10 localhost systemd[1]: libpod-7632c1dc86931ee2caa45e50e9b1c1f3d05bfe20f60b4172767395dfd1c01e44.scope: Deactivated successfully.
Nov 28 05:04:11 localhost podman[316382]: 2025-11-28 10:04:11.025880035 +0000 UTC m=+0.065576810 container died 7632c1dc86931ee2caa45e50e9b1c1f3d05bfe20f60b4172767395dfd1c01e44 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 05:04:11 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7632c1dc86931ee2caa45e50e9b1c1f3d05bfe20f60b4172767395dfd1c01e44-userdata-shm.mount: Deactivated successfully.
Nov 28 05:04:11 localhost podman[316382]: 2025-11-28 10:04:11.063277406 +0000 UTC m=+0.102974091 container cleanup 7632c1dc86931ee2caa45e50e9b1c1f3d05bfe20f60b4172767395dfd1c01e44 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 05:04:11 localhost systemd[1]: libpod-conmon-7632c1dc86931ee2caa45e50e9b1c1f3d05bfe20f60b4172767395dfd1c01e44.scope: Deactivated successfully.
Nov 28 05:04:11 localhost podman[316384]: 2025-11-28 10:04:11.102948153 +0000 UTC m=+0.135971136 container remove 7632c1dc86931ee2caa45e50e9b1c1f3d05bfe20f60b4172767395dfd1c01e44 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 05:04:11 localhost systemd[1]: var-lib-containers-storage-overlay-2e9ba0aa3684c459cfc81f6cc61b5ee805300d4fb3d8ef243c9434b8e324e32f-merged.mount: Deactivated successfully.
Nov 28 05:04:12 localhost podman[316457]:
Nov 28 05:04:12 localhost podman[316457]: 2025-11-28 10:04:12.026226551 +0000 UTC m=+0.106100661 container create df94be99b838ddb7f178da6fbb4e082ee64a3e1c2e542ccf53a9b992a1a5455a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 05:04:12 localhost systemd[1]: Started libpod-conmon-df94be99b838ddb7f178da6fbb4e082ee64a3e1c2e542ccf53a9b992a1a5455a.scope.
Nov 28 05:04:12 localhost podman[316457]: 2025-11-28 10:04:11.980512232 +0000 UTC m=+0.060386382 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 05:04:12 localhost systemd[1]: Started libcrun container.
Nov 28 05:04:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a2ec9c6c2154e417c331c77cab1b1300ce0ef76468d7987ca196d76462339c1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 05:04:12 localhost podman[316457]: 2025-11-28 10:04:12.109461657 +0000 UTC m=+0.189335767 container init df94be99b838ddb7f178da6fbb4e082ee64a3e1c2e542ccf53a9b992a1a5455a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 28 05:04:12 localhost podman[316457]: 2025-11-28 10:04:12.118638689 +0000 UTC m=+0.198512799 container start df94be99b838ddb7f178da6fbb4e082ee64a3e1c2e542ccf53a9b992a1a5455a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 05:04:12 localhost dnsmasq[316476]: started, version 2.85 cachesize 150
Nov 28 05:04:12 localhost dnsmasq[316476]: DNS service limited to local subnets
Nov 28 05:04:12 localhost dnsmasq[316476]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 05:04:12 localhost dnsmasq[316476]: warning: no upstream servers configured
Nov 28 05:04:12 localhost dnsmasq-dhcp[316476]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 05:04:12 localhost dnsmasq[316476]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 05:04:12 localhost dnsmasq-dhcp[316476]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 05:04:12 localhost dnsmasq-dhcp[316476]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 05:04:12 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:12.439 261084 INFO neutron.agent.dhcp.agent [None req-fb529359-f6e2-46f3-abe3-e10e5e012b18 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '5fa5dbd0-889e-4133-9ba4-6f3810999535'} is completed#033[00m
Nov 28 05:04:12 localhost dnsmasq[315715]: read /var/lib/neutron/dhcp/a8ba88fa-1415-4e6a-82ae-4f41cdd912f1/addn_hosts - 0 addresses
Nov 28 05:04:12 localhost podman[316503]: 2025-11-28 10:04:12.533207218 +0000 UTC m=+0.064933341 container kill 3b72d8de6aa8c764c17fd7a82828490581c31b022f355ca771ab13d3fa6d8e0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a8ba88fa-1415-4e6a-82ae-4f41cdd912f1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 05:04:12 localhost dnsmasq-dhcp[315715]: read /var/lib/neutron/dhcp/a8ba88fa-1415-4e6a-82ae-4f41cdd912f1/host
Nov 28 05:04:12 localhost dnsmasq-dhcp[315715]: read /var/lib/neutron/dhcp/a8ba88fa-1415-4e6a-82ae-4f41cdd912f1/opts
Nov 28 05:04:12 localhost dnsmasq[316476]: exiting on receipt of SIGTERM
Nov 28 05:04:12 localhost podman[316522]: 2025-11-28 10:04:12.591946861 +0000 UTC m=+0.055751698 container kill df94be99b838ddb7f178da6fbb4e082ee64a3e1c2e542ccf53a9b992a1a5455a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 05:04:12 localhost systemd[1]: libpod-df94be99b838ddb7f178da6fbb4e082ee64a3e1c2e542ccf53a9b992a1a5455a.scope: Deactivated successfully.
Nov 28 05:04:12 localhost podman[316541]: 2025-11-28 10:04:12.6672927 +0000 UTC m=+0.060708470 container died df94be99b838ddb7f178da6fbb4e082ee64a3e1c2e542ccf53a9b992a1a5455a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 28 05:04:12 localhost podman[316541]: 2025-11-28 10:04:12.703972341 +0000 UTC m=+0.097388061 container cleanup df94be99b838ddb7f178da6fbb4e082ee64a3e1c2e542ccf53a9b992a1a5455a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 28 05:04:12 localhost systemd[1]: libpod-conmon-df94be99b838ddb7f178da6fbb4e082ee64a3e1c2e542ccf53a9b992a1a5455a.scope: Deactivated successfully.
Nov 28 05:04:12 localhost podman[316543]: 2025-11-28 10:04:12.757160565 +0000 UTC m=+0.143098871 container remove df94be99b838ddb7f178da6fbb4e082ee64a3e1c2e542ccf53a9b992a1a5455a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 05:04:12 localhost ovn_controller[152322]: 2025-11-28T10:04:12Z|00257|binding|INFO|Releasing lport be14f038-71a5-4e0d-89f8-2103c3afd8ad from this chassis (sb_readonly=0)
Nov 28 05:04:12 localhost nova_compute[279673]: 2025-11-28 10:04:12.776 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:04:12 localhost kernel: device tapbe14f038-71 left promiscuous mode
Nov 28 05:04:12 localhost ovn_controller[152322]: 2025-11-28T10:04:12Z|00258|binding|INFO|Setting lport be14f038-71a5-4e0d-89f8-2103c3afd8ad down in Southbound
Nov 28 05:04:12 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:12.789 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-a8ba88fa-1415-4e6a-82ae-4f41cdd912f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a8ba88fa-1415-4e6a-82ae-4f41cdd912f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6b11f04c88dd4db3aa7f405d125f76a4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=785a40f8-f8b6-4ed9-ab6b-593396055237, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=be14f038-71a5-4e0d-89f8-2103c3afd8ad) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 05:04:12 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:12.790 158130 INFO neutron.agent.ovn.metadata.agent [-] Port be14f038-71a5-4e0d-89f8-2103c3afd8ad in datapath a8ba88fa-1415-4e6a-82ae-4f41cdd912f1 unbound from our chassis#033[00m
Nov 28 05:04:12 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:12.793 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a8ba88fa-1415-4e6a-82ae-4f41cdd912f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 05:04:12 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:12.794 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[deb68c85-54f9-4365-a241-48960eb76783]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 05:04:12 localhost nova_compute[279673]: 2025-11-28 10:04:12.794 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:04:12 localhost systemd[1]: var-lib-containers-storage-overlay-1a2ec9c6c2154e417c331c77cab1b1300ce0ef76468d7987ca196d76462339c1-merged.mount: Deactivated successfully.
Nov 28 05:04:12 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-df94be99b838ddb7f178da6fbb4e082ee64a3e1c2e542ccf53a9b992a1a5455a-userdata-shm.mount: Deactivated successfully.
Nov 28 05:04:13 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:13.941 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:ad:87 2001:db8::f816:3eff:fef8:ad87'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ef9eb238-2b1e-49f7-8a0f-72efc8854e0f) old=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2 2001:db8::f816:3eff:fef8:ad87'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 05:04:13 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:13.942 158130 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ef9eb238-2b1e-49f7-8a0f-72efc8854e0f in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 updated#033[00m
Nov 28 05:04:13 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:13.945 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 3fb0e974-4428-459d-a213-c32931747eec IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m
Nov 28 05:04:13 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:13.945 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 05:04:13 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:13.946 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[c8f393b2-272e-4e85-adbe-32d593e0a9b6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 05:04:14 localhost podman[316629]:
Nov 28 05:04:14 localhost podman[316629]: 2025-11-28 10:04:14.400972747 +0000 UTC m=+0.069766360 container create a79eaa4fe1bf0a89976149cbfcff470125d78840781bcc91d75c0f886caf95aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 28 05:04:14 localhost systemd[1]: Started libpod-conmon-a79eaa4fe1bf0a89976149cbfcff470125d78840781bcc91d75c0f886caf95aa.scope.
Nov 28 05:04:14 localhost systemd[1]: Started libcrun container.
Nov 28 05:04:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/61899381419cb9e9ab4e8e7d65a63a600b6a344482cd29fb8ea61037f050100c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 05:04:14 localhost podman[316629]: 2025-11-28 10:04:14.374762085 +0000 UTC m=+0.043555748 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 05:04:14 localhost podman[316629]: 2025-11-28 10:04:14.48100599 +0000 UTC m=+0.149799593 container init a79eaa4fe1bf0a89976149cbfcff470125d78840781bcc91d75c0f886caf95aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 05:04:14 localhost podman[316629]: 2025-11-28 10:04:14.490522773 +0000 UTC m=+0.159316386 container start a79eaa4fe1bf0a89976149cbfcff470125d78840781bcc91d75c0f886caf95aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 05:04:14 localhost dnsmasq[316647]: started, version 2.85 cachesize 150
Nov 28 05:04:14 localhost dnsmasq[316647]: DNS service limited to local subnets
Nov 28 05:04:14 localhost dnsmasq[316647]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 05:04:14 localhost dnsmasq[316647]: warning: no upstream servers configured
Nov 28 05:04:14 localhost dnsmasq-dhcp[316647]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 05:04:14 localhost dnsmasq[316647]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 05:04:14 localhost dnsmasq-dhcp[316647]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 05:04:14 localhost dnsmasq-dhcp[316647]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 05:04:14 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:14.746 261084 INFO neutron.agent.dhcp.agent [None req-88f5cbd4-a475-472e-829e-0920b4855d51 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '5fa5dbd0-889e-4133-9ba4-6f3810999535'} is completed#033[00m
Nov 28 05:04:14 localhost ovn_controller[152322]: 2025-11-28T10:04:14Z|00259|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 05:04:14 localhost nova_compute[279673]: 2025-11-28 10:04:14.853 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:04:14 localhost dnsmasq[316647]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 05:04:14 localhost dnsmasq-dhcp[316647]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 05:04:14 localhost dnsmasq-dhcp[316647]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 05:04:14 localhost podman[316663]: 2025-11-28 10:04:14.87396668 +0000 UTC m=+0.067892457 container kill a79eaa4fe1bf0a89976149cbfcff470125d78840781bcc91d75c0f886caf95aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 05:04:14 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 05:04:14 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e147 do_prune osdmap full prune enabled
Nov 28 05:04:15 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e148 e148: 6 total, 6 up, 6 in
Nov 28 05:04:15 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e148: 6 total, 6 up, 6 in
Nov 28 05:04:15 localhost dnsmasq[315715]: exiting on receipt of SIGTERM
Nov 28 05:04:15 localhost podman[316701]: 2025-11-28 10:04:15.242066617 +0000 UTC m=+0.060097843 container kill 3b72d8de6aa8c764c17fd7a82828490581c31b022f355ca771ab13d3fa6d8e0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a8ba88fa-1415-4e6a-82ae-4f41cdd912f1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 05:04:15 localhost systemd[1]: libpod-3b72d8de6aa8c764c17fd7a82828490581c31b022f355ca771ab13d3fa6d8e0b.scope: Deactivated successfully.
Nov 28 05:04:15 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:15.312 261084 INFO neutron.agent.dhcp.agent [None req-dc5e7cc5-9842-4dcf-9c74-6d14e6032755 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '5fa5dbd0-889e-4133-9ba4-6f3810999535'} is completed#033[00m
Nov 28 05:04:15 localhost podman[316715]: 2025-11-28 10:04:15.315780709 +0000 UTC m=+0.056524340 container died 3b72d8de6aa8c764c17fd7a82828490581c31b022f355ca771ab13d3fa6d8e0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a8ba88fa-1415-4e6a-82ae-4f41cdd912f1, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 05:04:15 localhost podman[316715]: 2025-11-28 10:04:15.349856645 +0000 UTC m=+0.090600206 container cleanup 3b72d8de6aa8c764c17fd7a82828490581c31b022f355ca771ab13d3fa6d8e0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a8ba88fa-1415-4e6a-82ae-4f41cdd912f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 05:04:15 localhost systemd[1]: libpod-conmon-3b72d8de6aa8c764c17fd7a82828490581c31b022f355ca771ab13d3fa6d8e0b.scope: Deactivated successfully.
Nov 28 05:04:15 localhost podman[316716]: 2025-11-28 10:04:15.397435619 +0000 UTC m=+0.133016152 container remove 3b72d8de6aa8c764c17fd7a82828490581c31b022f355ca771ab13d3fa6d8e0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a8ba88fa-1415-4e6a-82ae-4f41cdd912f1, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 05:04:15 localhost systemd[1]: var-lib-containers-storage-overlay-3b7a3bb605cc17c39a0a6c043199db98db1eab94e3efb6892ac928f5feb4c4b2-merged.mount: Deactivated successfully.
Nov 28 05:04:15 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3b72d8de6aa8c764c17fd7a82828490581c31b022f355ca771ab13d3fa6d8e0b-userdata-shm.mount: Deactivated successfully.
Nov 28 05:04:15 localhost nova_compute[279673]: 2025-11-28 10:04:15.504 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:04:15 localhost systemd[1]: tmp-crun.OZ1zaR.mount: Deactivated successfully.
Nov 28 05:04:15 localhost dnsmasq[316647]: exiting on receipt of SIGTERM
Nov 28 05:04:15 localhost podman[316761]: 2025-11-28 10:04:15.656961696 +0000 UTC m=+0.079747916 container kill a79eaa4fe1bf0a89976149cbfcff470125d78840781bcc91d75c0f886caf95aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 05:04:15 localhost systemd[1]: libpod-a79eaa4fe1bf0a89976149cbfcff470125d78840781bcc91d75c0f886caf95aa.scope: Deactivated successfully.
Nov 28 05:04:15 localhost podman[316775]: 2025-11-28 10:04:15.733391605 +0000 UTC m=+0.060825923 container died a79eaa4fe1bf0a89976149cbfcff470125d78840781bcc91d75c0f886caf95aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 05:04:15 localhost podman[316775]: 2025-11-28 10:04:15.764995082 +0000 UTC m=+0.092429360 container cleanup a79eaa4fe1bf0a89976149cbfcff470125d78840781bcc91d75c0f886caf95aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 05:04:15 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:15.766 261084 INFO neutron.agent.dhcp.agent [None req-0c0eeedf-cde3-419e-bb30-183a07fa525f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Nov 28 05:04:15 localhost systemd[1]: libpod-conmon-a79eaa4fe1bf0a89976149cbfcff470125d78840781bcc91d75c0f886caf95aa.scope: Deactivated successfully.
Nov 28 05:04:15 localhost podman[316777]: 2025-11-28 10:04:15.823161348 +0000 UTC m=+0.142807453 container remove a79eaa4fe1bf0a89976149cbfcff470125d78840781bcc91d75c0f886caf95aa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 05:04:15 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:15.894 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Nov 28 05:04:16 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:16.316 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Nov 28 05:04:16 localhost systemd[1]: var-lib-containers-storage-overlay-61899381419cb9e9ab4e8e7d65a63a600b6a344482cd29fb8ea61037f050100c-merged.mount: Deactivated successfully.
Nov 28 05:04:16 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a79eaa4fe1bf0a89976149cbfcff470125d78840781bcc91d75c0f886caf95aa-userdata-shm.mount: Deactivated successfully.
Nov 28 05:04:16 localhost systemd[1]: run-netns-qdhcp\x2da8ba88fa\x2d1415\x2d4e6a\x2d82ae\x2d4f41cdd912f1.mount: Deactivated successfully.
Nov 28 05:04:17 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:17.263 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2 2001:db8::f816:3eff:fef8:ad87'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ef9eb238-2b1e-49f7-8a0f-72efc8854e0f) old=Port_Binding(mac=['fa:16:3e:f8:ad:87 2001:db8::f816:3eff:fef8:ad87'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 05:04:17 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:17.266 158130 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ef9eb238-2b1e-49f7-8a0f-72efc8854e0f in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 updated#033[00m
Nov 28 05:04:17 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:17.268 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 3fb0e974-4428-459d-a213-c32931747eec IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m
Nov 28 05:04:17 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:17.268 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 05:04:17 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:17.269 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[a32cc698-200b-40d9-9a66-875cd064f557]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 05:04:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 05:04:17 localhost systemd[1]: tmp-crun.1XsHMz.mount: Deactivated successfully.
Nov 28 05:04:17 localhost podman[316829]: 2025-11-28 10:04:17.857750527 +0000 UTC m=+0.093087799 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-type=git, version=9.6, 
com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc.) Nov 28 05:04:17 localhost podman[316829]: 2025-11-28 10:04:17.900620515 +0000 UTC m=+0.135957787 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., config_id=edpm, name=ubi9-minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers) Nov 28 05:04:17 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 05:04:18 localhost openstack_network_exporter[240658]: ERROR 10:04:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:04:18 localhost openstack_network_exporter[240658]: ERROR 10:04:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:04:18 localhost openstack_network_exporter[240658]: ERROR 10:04:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 05:04:18 localhost openstack_network_exporter[240658]: ERROR 10:04:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 05:04:18 localhost openstack_network_exporter[240658]: Nov 28 05:04:18 localhost openstack_network_exporter[240658]: ERROR 10:04:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 05:04:18 localhost openstack_network_exporter[240658]: Nov 28 05:04:18 localhost neutron_sriov_agent[254147]: 2025-11-28 10:04:18.312 2 INFO neutron.agent.securitygroups_rpc [None req-5939fc78-1573-4689-b1d7-9426dbeeb10b 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:04:18 localhost podman[316876]: Nov 28 05:04:18 localhost podman[316876]: 2025-11-28 10:04:18.374201775 +0000 UTC m=+0.097734902 container create ab464d5a065c66b18e0752124e2cf46cce105d1bd1129c15c15dbcfc9c312de2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:04:18 localhost systemd[1]: Started libpod-conmon-ab464d5a065c66b18e0752124e2cf46cce105d1bd1129c15c15dbcfc9c312de2.scope. Nov 28 05:04:18 localhost podman[316876]: 2025-11-28 10:04:18.327140156 +0000 UTC m=+0.050673323 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:04:18 localhost systemd[1]: Started libcrun container. Nov 28 05:04:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1b82bff0f99f41361d4047b7d9eb46cf6ab0cee59ade8c6d5a2773ccc718f27d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:04:18 localhost podman[316876]: 2025-11-28 10:04:18.454153606 +0000 UTC m=+0.177686733 container init ab464d5a065c66b18e0752124e2cf46cce105d1bd1129c15c15dbcfc9c312de2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0) Nov 28 05:04:18 localhost podman[316876]: 2025-11-28 10:04:18.464141342 +0000 UTC m=+0.187674469 container start ab464d5a065c66b18e0752124e2cf46cce105d1bd1129c15c15dbcfc9c312de2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 
05:04:18 localhost dnsmasq[316894]: started, version 2.85 cachesize 150 Nov 28 05:04:18 localhost dnsmasq[316894]: DNS service limited to local subnets Nov 28 05:04:18 localhost dnsmasq[316894]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:04:18 localhost dnsmasq[316894]: warning: no upstream servers configured Nov 28 05:04:18 localhost dnsmasq-dhcp[316894]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 28 05:04:18 localhost dnsmasq-dhcp[316894]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 28 05:04:18 localhost dnsmasq[316894]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:04:18 localhost dnsmasq-dhcp[316894]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:04:18 localhost dnsmasq-dhcp[316894]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:04:18 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:18.553 261084 INFO neutron.agent.dhcp.agent [None req-a87e3802-90fd-4535-aae0-7e569047fd7f - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:17Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=b40a20d3-aa39-40cf-a7d0-da0fcde9a098, ip_allocation=immediate, mac_address=fa:16:3e:7e:0c:8c, name=tempest-NetworksTestDHCPv6-869146364, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, 
port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=27, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['00f69316-162e-439e-a6c1-9cd70f0d853b', 'e92409df-693c-4ecf-a085-6f53220a2361'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:04:14Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], standard_attr_id=1580, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:04:18Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1#033[00m Nov 28 05:04:18 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:18.735 261084 INFO neutron.agent.dhcp.agent [None req-1e745301-6ff5-4cd7-942b-a11ad0bca8bd - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '5fa5dbd0-889e-4133-9ba4-6f3810999535'} is completed#033[00m Nov 28 05:04:18 localhost dnsmasq[316894]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 2 addresses Nov 28 05:04:18 localhost podman[316913]: 2025-11-28 10:04:18.824261381 +0000 UTC m=+0.066119816 container kill ab464d5a065c66b18e0752124e2cf46cce105d1bd1129c15c15dbcfc9c312de2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
maintainer=OpenStack Kubernetes Operator team) Nov 28 05:04:18 localhost dnsmasq-dhcp[316894]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:04:18 localhost dnsmasq-dhcp[316894]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:04:19 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:19.182 261084 INFO neutron.agent.dhcp.agent [None req-0f8ba0e6-6c52-45c0-8e2d-67bbdce3e3dd - - - - - -] DHCP configuration for ports {'b40a20d3-aa39-40cf-a7d0-da0fcde9a098'} is completed#033[00m Nov 28 05:04:19 localhost neutron_sriov_agent[254147]: 2025-11-28 10:04:19.189 2 INFO neutron.agent.securitygroups_rpc [None req-582a65ec-d5d3-451f-981d-4b6bb2c1b94e 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:04:19 localhost dnsmasq[316894]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:04:19 localhost dnsmasq-dhcp[316894]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:04:19 localhost podman[316952]: 2025-11-28 10:04:19.429237057 +0000 UTC m=+0.065815258 container kill ab464d5a065c66b18e0752124e2cf46cce105d1bd1129c15c15dbcfc9c312de2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Nov 28 05:04:19 localhost dnsmasq-dhcp[316894]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:04:19 localhost systemd[1]: virtsecretd.service: Deactivated successfully. 
Nov 28 05:04:19 localhost dnsmasq[316894]: exiting on receipt of SIGTERM Nov 28 05:04:19 localhost podman[316993]: 2025-11-28 10:04:19.964262497 +0000 UTC m=+0.072444337 container kill ab464d5a065c66b18e0752124e2cf46cce105d1bd1129c15c15dbcfc9c312de2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125) Nov 28 05:04:19 localhost systemd[1]: libpod-ab464d5a065c66b18e0752124e2cf46cce105d1bd1129c15c15dbcfc9c312de2.scope: Deactivated successfully. Nov 28 05:04:20 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:04:20 localhost podman[317019]: 2025-11-28 10:04:20.051335622 +0000 UTC m=+0.064865710 container died ab464d5a065c66b18e0752124e2cf46cce105d1bd1129c15c15dbcfc9c312de2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0) Nov 28 05:04:20 localhost systemd[1]: tmp-crun.H1o6si.mount: Deactivated successfully. 
Nov 28 05:04:20 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ab464d5a065c66b18e0752124e2cf46cce105d1bd1129c15c15dbcfc9c312de2-userdata-shm.mount: Deactivated successfully. Nov 28 05:04:20 localhost systemd[1]: var-lib-containers-storage-overlay-1b82bff0f99f41361d4047b7d9eb46cf6ab0cee59ade8c6d5a2773ccc718f27d-merged.mount: Deactivated successfully. Nov 28 05:04:20 localhost podman[317019]: 2025-11-28 10:04:20.110398024 +0000 UTC m=+0.123928072 container remove ab464d5a065c66b18e0752124e2cf46cce105d1bd1129c15c15dbcfc9c312de2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:04:20 localhost systemd[1]: libpod-conmon-ab464d5a065c66b18e0752124e2cf46cce105d1bd1129c15c15dbcfc9c312de2.scope: Deactivated successfully. 
Nov 28 05:04:20 localhost nova_compute[279673]: 2025-11-28 10:04:20.508 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:20 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:20.886 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ef9eb238-2b1e-49f7-8a0f-72efc8854e0f) old=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2 2001:db8::f816:3eff:fef8:ad87'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 
'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:04:20 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:20.888 158130 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ef9eb238-2b1e-49f7-8a0f-72efc8854e0f in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 updated#033[00m Nov 28 05:04:20 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:20.891 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 3fb0e974-4428-459d-a213-c32931747eec IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 28 05:04:20 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:20.891 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:04:20 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:20.892 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[86fc169a-647c-46e9-a60b-735d11464c26]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:04:20 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Nov 28 05:04:20 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:04:21 localhost podman[317148]: Nov 28 05:04:21 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 05:04:21 localhost 
ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:04:21 localhost podman[317148]: 2025-11-28 10:04:21.025712551 +0000 UTC m=+0.095889648 container create 941638d49a9614b2dc93143a8abbafd739a5b82ed995976d94d0c2ac9d377d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 28 05:04:21 localhost systemd[1]: Started libpod-conmon-941638d49a9614b2dc93143a8abbafd739a5b82ed995976d94d0c2ac9d377d85.scope. Nov 28 05:04:21 localhost podman[317148]: 2025-11-28 10:04:20.979536058 +0000 UTC m=+0.049713195 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:04:21 localhost systemd[1]: Started libcrun container. 
Nov 28 05:04:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ee3114bc9472005d8cdf6f03db49c7ae6c8046cdba7b397d3300bce3c362549/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:04:21 localhost podman[317148]: 2025-11-28 10:04:21.131350418 +0000 UTC m=+0.201527525 container init 941638d49a9614b2dc93143a8abbafd739a5b82ed995976d94d0c2ac9d377d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3) Nov 28 05:04:21 localhost podman[317148]: 2025-11-28 10:04:21.140794749 +0000 UTC m=+0.210971856 container start 941638d49a9614b2dc93143a8abbafd739a5b82ed995976d94d0c2ac9d377d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS) Nov 28 05:04:21 localhost dnsmasq[317185]: started, version 2.85 cachesize 150 Nov 28 05:04:21 localhost dnsmasq[317185]: DNS service limited to local subnets Nov 28 05:04:21 localhost dnsmasq[317185]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:04:21 localhost dnsmasq[317185]: warning: no upstream servers 
configured Nov 28 05:04:21 localhost dnsmasq-dhcp[317185]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 28 05:04:21 localhost dnsmasq[317185]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:04:21 localhost dnsmasq-dhcp[317185]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:04:21 localhost dnsmasq-dhcp[317185]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:04:21 localhost neutron_sriov_agent[254147]: 2025-11-28 10:04:21.269 2 INFO neutron.agent.securitygroups_rpc [None req-2bb5c88d-1a1c-4245-b22f-59b37a9a0aaf 4d2a0ee370da4e688f1d8f5c639f278d ac0f848cde7c47c998a8b80087a3b59d - - default default] Security group member updated ['6784caad-903b-4e5a-b2eb-6bb1d9f8710a']#033[00m Nov 28 05:04:21 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:21.333 261084 INFO neutron.agent.dhcp.agent [None req-5a2acf10-c36b-4973-ac0a-755c4942cdd0 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '5fa5dbd0-889e-4133-9ba4-6f3810999535'} is completed#033[00m Nov 28 05:04:21 localhost podman[317204]: 2025-11-28 10:04:21.500562948 +0000 UTC m=+0.069462042 container kill 941638d49a9614b2dc93143a8abbafd739a5b82ed995976d94d0c2ac9d377d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 28 05:04:21 localhost dnsmasq[317185]: exiting on receipt of SIGTERM Nov 28 05:04:21 localhost systemd[1]: libpod-941638d49a9614b2dc93143a8abbafd739a5b82ed995976d94d0c2ac9d377d85.scope: Deactivated successfully. 
Nov 28 05:04:21 localhost podman[317222]: 2025-11-28 10:04:21.575400072 +0000 UTC m=+0.049946762 container died 941638d49a9614b2dc93143a8abbafd739a5b82ed995976d94d0c2ac9d377d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 28 05:04:21 localhost podman[317222]: 2025-11-28 10:04:21.621620366 +0000 UTC m=+0.096167046 container remove 941638d49a9614b2dc93143a8abbafd739a5b82ed995976d94d0c2ac9d377d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:04:21 localhost systemd[1]: libpod-conmon-941638d49a9614b2dc93143a8abbafd739a5b82ed995976d94d0c2ac9d377d85.scope: Deactivated successfully. Nov 28 05:04:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 05:04:22 localhost systemd[1]: var-lib-containers-storage-overlay-7ee3114bc9472005d8cdf6f03db49c7ae6c8046cdba7b397d3300bce3c362549-merged.mount: Deactivated successfully. Nov 28 05:04:22 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-941638d49a9614b2dc93143a8abbafd739a5b82ed995976d94d0c2ac9d377d85-userdata-shm.mount: Deactivated successfully. 
Nov 28 05:04:22 localhost podman[317245]: 2025-11-28 10:04:22.099918751 +0000 UTC m=+0.084721488 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 05:04:22 localhost podman[317245]: 2025-11-28 10:04:22.114549471 +0000 UTC m=+0.099352208 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 28 05:04:22 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 05:04:22 localhost neutron_sriov_agent[254147]: 2025-11-28 10:04:22.613 2 INFO neutron.agent.securitygroups_rpc [None req-038c15bb-00b9-42a6-bcc5-a72acb379335 4d2a0ee370da4e688f1d8f5c639f278d ac0f848cde7c47c998a8b80087a3b59d - - default default] Security group member updated ['6784caad-903b-4e5a-b2eb-6bb1d9f8710a']#033[00m Nov 28 05:04:23 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:23.081 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2 2001:db8::f816:3eff:fef8:ad87'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ef9eb238-2b1e-49f7-8a0f-72efc8854e0f) old=Port_Binding(mac=['fa:16:3e:f8:ad:87 
10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:04:23 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:23.085 158130 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ef9eb238-2b1e-49f7-8a0f-72efc8854e0f in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 updated#033[00m Nov 28 05:04:23 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:23.088 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 3fb0e974-4428-459d-a213-c32931747eec IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 28 05:04:23 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:23.088 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:04:23 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:23.089 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[71bd525c-a759-4974-83f1-eccbcfd83c66]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:04:23 localhost podman[317319]: Nov 28 05:04:23 localhost podman[317319]: 2025-11-28 10:04:23.116814399 +0000 UTC m=+0.111829185 container create 
5921a5dec60b5690f2ebdc26cd46bbcdc4c837261c9b15a4e6ffe8b1da0c7a62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:04:23 localhost systemd[1]: Started libpod-conmon-5921a5dec60b5690f2ebdc26cd46bbcdc4c837261c9b15a4e6ffe8b1da0c7a62.scope. Nov 28 05:04:23 localhost podman[317319]: 2025-11-28 10:04:23.070319017 +0000 UTC m=+0.065333823 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:04:23 localhost systemd[1]: Started libcrun container. Nov 28 05:04:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f8494eb7c4c1eacaab847e2fba5c6ba8e96cf0679e520f61c5dec5de3bf11885/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:04:23 localhost podman[317319]: 2025-11-28 10:04:23.197112821 +0000 UTC m=+0.192127637 container init 5921a5dec60b5690f2ebdc26cd46bbcdc4c837261c9b15a4e6ffe8b1da0c7a62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:04:23 localhost podman[317319]: 2025-11-28 10:04:23.206080347 +0000 UTC m=+0.201095123 container start 
5921a5dec60b5690f2ebdc26cd46bbcdc4c837261c9b15a4e6ffe8b1da0c7a62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 28 05:04:23 localhost dnsmasq[317337]: started, version 2.85 cachesize 150 Nov 28 05:04:23 localhost dnsmasq[317337]: DNS service limited to local subnets Nov 28 05:04:23 localhost dnsmasq[317337]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:04:23 localhost dnsmasq[317337]: warning: no upstream servers configured Nov 28 05:04:23 localhost dnsmasq-dhcp[317337]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 28 05:04:23 localhost dnsmasq[317337]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:04:23 localhost dnsmasq-dhcp[317337]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:04:23 localhost dnsmasq-dhcp[317337]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:04:23 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:23.577 261084 INFO neutron.agent.dhcp.agent [None req-b9b7d2dd-3c6e-4de4-a296-d0f17396a2d3 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '5fa5dbd0-889e-4133-9ba4-6f3810999535'} is completed#033[00m Nov 28 05:04:23 localhost dnsmasq[317337]: exiting on receipt of SIGTERM Nov 28 05:04:23 localhost podman[317355]: 2025-11-28 10:04:23.813207544 +0000 UTC m=+0.065080206 container kill 
5921a5dec60b5690f2ebdc26cd46bbcdc4c837261c9b15a4e6ffe8b1da0c7a62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:04:23 localhost systemd[1]: libpod-5921a5dec60b5690f2ebdc26cd46bbcdc4c837261c9b15a4e6ffe8b1da0c7a62.scope: Deactivated successfully. Nov 28 05:04:23 localhost podman[317367]: 2025-11-28 10:04:23.893804443 +0000 UTC m=+0.064967212 container died 5921a5dec60b5690f2ebdc26cd46bbcdc4c837261c9b15a4e6ffe8b1da0c7a62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:04:23 localhost podman[317367]: 2025-11-28 10:04:23.92823913 +0000 UTC m=+0.099401869 container cleanup 5921a5dec60b5690f2ebdc26cd46bbcdc4c837261c9b15a4e6ffe8b1da0c7a62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack 
Kubernetes Operator team) Nov 28 05:04:23 localhost systemd[1]: libpod-conmon-5921a5dec60b5690f2ebdc26cd46bbcdc4c837261c9b15a4e6ffe8b1da0c7a62.scope: Deactivated successfully. Nov 28 05:04:23 localhost podman[317369]: 2025-11-28 10:04:23.982739552 +0000 UTC m=+0.142408702 container remove 5921a5dec60b5690f2ebdc26cd46bbcdc4c837261c9b15a4e6ffe8b1da0c7a62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3) Nov 28 05:04:24 localhost systemd[1]: var-lib-containers-storage-overlay-f8494eb7c4c1eacaab847e2fba5c6ba8e96cf0679e520f61c5dec5de3bf11885-merged.mount: Deactivated successfully. Nov 28 05:04:24 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5921a5dec60b5690f2ebdc26cd46bbcdc4c837261c9b15a4e6ffe8b1da0c7a62-userdata-shm.mount: Deactivated successfully. Nov 28 05:04:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 05:04:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. 
Nov 28 05:04:24 localhost neutron_sriov_agent[254147]: 2025-11-28 10:04:24.856 2 INFO neutron.agent.securitygroups_rpc [None req-5be6ae81-d286-424b-afd6-2b6865c77664 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:04:24 localhost podman[317427]: 2025-11-28 10:04:24.883229584 +0000 UTC m=+0.113622227 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:04:24 localhost podman[317427]: 2025-11-28 10:04:24.96965918 +0000 UTC m=+0.200051813 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent) Nov 28 05:04:24 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 05:04:25 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:04:25 localhost podman[317476]: Nov 28 05:04:25 localhost podman[317476]: 2025-11-28 10:04:25.029100254 +0000 UTC m=+0.110864858 container create 94fb03d5bbd0e112351572f1114fcf88da7f2878183421c43f856c6b3437eee1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3) Nov 28 05:04:25 localhost podman[317476]: 2025-11-28 10:04:24.972238124 +0000 UTC m=+0.054002748 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:04:25 localhost podman[317426]: 2025-11-28 10:04:25.028505636 +0000 UTC m=+0.259787614 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': 
'/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:04:25 localhost systemd[1]: Started libpod-conmon-94fb03d5bbd0e112351572f1114fcf88da7f2878183421c43f856c6b3437eee1.scope. Nov 28 05:04:25 localhost systemd[1]: Started libcrun container. 
Nov 28 05:04:25 localhost podman[317426]: 2025-11-28 10:04:25.131522969 +0000 UTC m=+0.362804987 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3) Nov 28 05:04:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87952802ebce232016498b37ed1bc68b69594690c353cd872596f34af65028ef/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:04:25 localhost podman[317476]: 2025-11-28 10:04:25.145354125 +0000 UTC m=+0.227118719 container init 94fb03d5bbd0e112351572f1114fcf88da7f2878183421c43f856c6b3437eee1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Nov 28 05:04:25 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. Nov 28 05:04:25 localhost systemd[1]: tmp-crun.9X1tl8.mount: Deactivated successfully. Nov 28 05:04:25 localhost podman[317476]: 2025-11-28 10:04:25.167132289 +0000 UTC m=+0.248896883 container start 94fb03d5bbd0e112351572f1114fcf88da7f2878183421c43f856c6b3437eee1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true) Nov 28 05:04:25 localhost dnsmasq[317512]: started, version 2.85 cachesize 150 Nov 28 05:04:25 localhost dnsmasq[317512]: DNS service limited to local subnets Nov 28 05:04:25 localhost dnsmasq[317512]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:04:25 localhost dnsmasq[317512]: warning: no upstream servers configured Nov 28 05:04:25 localhost dnsmasq-dhcp[317512]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 28 05:04:25 localhost dnsmasq[317512]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:04:25 localhost 
dnsmasq-dhcp[317512]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:04:25 localhost dnsmasq-dhcp[317512]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:04:25 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:25.230 261084 INFO neutron.agent.dhcp.agent [None req-4a879e38-f8b7-488f-a04a-96cd6030db32 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:24Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=4dd844d7-c4a5-45a5-8561-d62489aaa9e8, ip_allocation=immediate, mac_address=fa:16:3e:71:8f:32, name=tempest-NetworksTestDHCPv6-1013711471, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=31, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['1e4a40e1-a5dc-42b4-816e-f8554b6a9964', '6c8448d9-9093-4c59-8a0f-3d17b74604a1'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:04:21Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], standard_attr_id=1614, status=DOWN, tags=[], 
tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:04:24Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1#033[00m Nov 28 05:04:25 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:25.479 261084 INFO neutron.agent.dhcp.agent [None req-07dea47e-4fdd-4ae0-a660-9825ca60fb26 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '5fa5dbd0-889e-4133-9ba4-6f3810999535'} is completed#033[00m Nov 28 05:04:25 localhost dnsmasq[317512]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 2 addresses Nov 28 05:04:25 localhost dnsmasq-dhcp[317512]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:04:25 localhost dnsmasq-dhcp[317512]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:04:25 localhost podman[317531]: 2025-11-28 10:04:25.495138618 +0000 UTC m=+0.066754224 container kill 94fb03d5bbd0e112351572f1114fcf88da7f2878183421c43f856c6b3437eee1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:04:25 localhost nova_compute[279673]: 2025-11-28 10:04:25.511 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:25 localhost nova_compute[279673]: 2025-11-28 10:04:25.516 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:25 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:25.812 261084 INFO 
neutron.agent.dhcp.agent [None req-ee8500d6-7313-44a5-9e23-08e406f60d88 - - - - - -] DHCP configuration for ports {'4dd844d7-c4a5-45a5-8561-d62489aaa9e8'} is completed#033[00m Nov 28 05:04:25 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Nov 28 05:04:25 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:04:26 localhost neutron_sriov_agent[254147]: 2025-11-28 10:04:26.165 2 INFO neutron.agent.securitygroups_rpc [None req-f516662f-fbee-4914-9fae-83fcd2f7d639 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:04:26 localhost dnsmasq[317512]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:04:26 localhost dnsmasq-dhcp[317512]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:04:26 localhost dnsmasq-dhcp[317512]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:04:26 localhost podman[317573]: 2025-11-28 10:04:26.43051254 +0000 UTC m=+0.064455537 container kill 94fb03d5bbd0e112351572f1114fcf88da7f2878183421c43f856c6b3437eee1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:04:26 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:26.738 261084 INFO neutron.agent.linux.ip_lib [None req-843ba996-86a5-49b0-9513-4e771998a713 - - - - - -] 
Device tap404598dd-47 cannot be used as it has no MAC address#033[00m Nov 28 05:04:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 05:04:26 localhost nova_compute[279673]: 2025-11-28 10:04:26.803 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:26 localhost kernel: device tap404598dd-47 entered promiscuous mode Nov 28 05:04:26 localhost NetworkManager[5967]: [1764324266.8169] manager: (tap404598dd-47): new Generic device (/org/freedesktop/NetworkManager/Devices/45) Nov 28 05:04:26 localhost nova_compute[279673]: 2025-11-28 10:04:26.814 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:26 localhost ovn_controller[152322]: 2025-11-28T10:04:26Z|00260|binding|INFO|Claiming lport 404598dd-4706-4aa3-a857-56207d0fd483 for this chassis. Nov 28 05:04:26 localhost ovn_controller[152322]: 2025-11-28T10:04:26Z|00261|binding|INFO|404598dd-4706-4aa3-a857-56207d0fd483: Claiming unknown Nov 28 05:04:26 localhost systemd-udevd[317614]: Network interface NamePolicy= disabled on kernel command line. 
Nov 28 05:04:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:26.827 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-1a246530-be70-4846-9202-8f9cd6d862ae', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a246530-be70-4846-9202-8f9cd6d862ae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '50a1392ce96c4024bcd36a3df403ca29', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d9cc17b-2c39-4130-a2f2-9d12894eaf52, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=404598dd-4706-4aa3-a857-56207d0fd483) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:04:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:26.829 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 404598dd-4706-4aa3-a857-56207d0fd483 in datapath 1a246530-be70-4846-9202-8f9cd6d862ae bound to our chassis#033[00m Nov 28 05:04:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:26.831 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1a246530-be70-4846-9202-8f9cd6d862ae or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:04:26 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:26.832 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[9ae799ae-1384-4f78-9cb3-cf456e6640d5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:04:26 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:04:26 localhost journal[227875]: ethtool ioctl error on tap404598dd-47: No such device Nov 28 05:04:26 localhost ovn_controller[152322]: 2025-11-28T10:04:26Z|00262|binding|INFO|Setting lport 404598dd-4706-4aa3-a857-56207d0fd483 ovn-installed in OVS Nov 28 05:04:26 localhost ovn_controller[152322]: 2025-11-28T10:04:26Z|00263|binding|INFO|Setting lport 404598dd-4706-4aa3-a857-56207d0fd483 up in Southbound Nov 28 05:04:26 localhost journal[227875]: ethtool ioctl error on tap404598dd-47: No such device Nov 28 05:04:26 localhost nova_compute[279673]: 2025-11-28 10:04:26.862 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:26 localhost journal[227875]: ethtool ioctl error on tap404598dd-47: No such device Nov 28 05:04:26 localhost journal[227875]: ethtool ioctl error on tap404598dd-47: No such device Nov 28 05:04:26 localhost journal[227875]: ethtool ioctl error on tap404598dd-47: No such device Nov 28 05:04:26 localhost journal[227875]: ethtool ioctl error on tap404598dd-47: No such device Nov 28 05:04:26 localhost journal[227875]: ethtool ioctl error on tap404598dd-47: No such device Nov 28 05:04:26 localhost journal[227875]: ethtool ioctl error on tap404598dd-47: No such device Nov 28 05:04:26 localhost nova_compute[279673]: 2025-11-28 10:04:26.901 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:26 localhost podman[317604]: 
2025-11-28 10:04:26.922546538 +0000 UTC m=+0.150287946 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm) Nov 28 05:04:26 localhost podman[317604]: 2025-11-28 10:04:26.934868412 +0000 UTC m=+0.162609790 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 
(image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm) Nov 28 05:04:26 localhost nova_compute[279673]: 2025-11-28 10:04:26.949 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:26 localhost ovn_controller[152322]: 2025-11-28T10:04:26Z|00264|binding|INFO|Releasing lport 
3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:04:26 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. Nov 28 05:04:27 localhost dnsmasq[317512]: exiting on receipt of SIGTERM Nov 28 05:04:27 localhost podman[317681]: 2025-11-28 10:04:27.389133618 +0000 UTC m=+0.072341674 container kill 94fb03d5bbd0e112351572f1114fcf88da7f2878183421c43f856c6b3437eee1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3) Nov 28 05:04:27 localhost systemd[1]: libpod-94fb03d5bbd0e112351572f1114fcf88da7f2878183421c43f856c6b3437eee1.scope: Deactivated successfully. Nov 28 05:04:27 localhost podman[317697]: 2025-11-28 10:04:27.47887005 +0000 UTC m=+0.071468579 container died 94fb03d5bbd0e112351572f1114fcf88da7f2878183421c43f856c6b3437eee1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 28 05:04:27 localhost systemd[1]: tmp-crun.0eJiGp.mount: Deactivated successfully. 
Nov 28 05:04:27 localhost podman[317697]: 2025-11-28 10:04:27.526180305 +0000 UTC m=+0.118778784 container cleanup 94fb03d5bbd0e112351572f1114fcf88da7f2878183421c43f856c6b3437eee1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3) Nov 28 05:04:27 localhost systemd[1]: libpod-conmon-94fb03d5bbd0e112351572f1114fcf88da7f2878183421c43f856c6b3437eee1.scope: Deactivated successfully. Nov 28 05:04:27 localhost podman[317699]: 2025-11-28 10:04:27.614112025 +0000 UTC m=+0.199829817 container remove 94fb03d5bbd0e112351572f1114fcf88da7f2878183421c43f856c6b3437eee1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125) Nov 28 05:04:27 localhost nova_compute[279673]: 2025-11-28 10:04:27.757 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:27 localhost podman[317764]: Nov 28 05:04:27 localhost podman[317764]: 2025-11-28 10:04:27.91512023 +0000 UTC m=+0.100517091 container create 57ff87a1297b4078e9c2e8fdd70de8a40b0d134df264eb7de0e31194efa6adb5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-1a246530-be70-4846-9202-8f9cd6d862ae, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Nov 28 05:04:27 localhost podman[317764]: 2025-11-28 10:04:27.867111555 +0000 UTC m=+0.052508476 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:04:27 localhost systemd[1]: Started libpod-conmon-57ff87a1297b4078e9c2e8fdd70de8a40b0d134df264eb7de0e31194efa6adb5.scope. Nov 28 05:04:27 localhost systemd[1]: Started libcrun container. Nov 28 05:04:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f57866dc5c6162f1b2474906fd9f4e8b712b44716f63d14a9891a2098561e5e3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:04:28 localhost podman[317764]: 2025-11-28 10:04:28.003271375 +0000 UTC m=+0.188668226 container init 57ff87a1297b4078e9c2e8fdd70de8a40b0d134df264eb7de0e31194efa6adb5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1a246530-be70-4846-9202-8f9cd6d862ae, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:04:28 localhost podman[317764]: 2025-11-28 10:04:28.013960372 +0000 UTC m=+0.199357223 container start 57ff87a1297b4078e9c2e8fdd70de8a40b0d134df264eb7de0e31194efa6adb5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-1a246530-be70-4846-9202-8f9cd6d862ae, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Nov 28 05:04:28 localhost dnsmasq[317792]: started, version 2.85 cachesize 150 Nov 28 05:04:28 localhost dnsmasq[317792]: DNS service limited to local subnets Nov 28 05:04:28 localhost dnsmasq[317792]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:04:28 localhost dnsmasq[317792]: warning: no upstream servers configured Nov 28 05:04:28 localhost dnsmasq-dhcp[317792]: DHCPv6, static leases only on 2001:db8:1::, lease time 1d Nov 28 05:04:28 localhost dnsmasq[317792]: read /var/lib/neutron/dhcp/1a246530-be70-4846-9202-8f9cd6d862ae/addn_hosts - 0 addresses Nov 28 05:04:28 localhost dnsmasq-dhcp[317792]: read /var/lib/neutron/dhcp/1a246530-be70-4846-9202-8f9cd6d862ae/host Nov 28 05:04:28 localhost dnsmasq-dhcp[317792]: read /var/lib/neutron/dhcp/1a246530-be70-4846-9202-8f9cd6d862ae/opts Nov 28 05:04:28 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:28.111 261084 INFO neutron.agent.dhcp.agent [None req-843ba996-86a5-49b0-9513-4e771998a713 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:26Z, description=, device_id=1c1f4bc1-9864-4ab2-ab0b-3419e92647f6, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=7d15d485-b9d0-4a4c-92f9-e215283394d6, ip_allocation=immediate, mac_address=fa:16:3e:29:8f:09, name=, 
network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:04:23Z, description=, dns_domain=, id=1a246530-be70-4846-9202-8f9cd6d862ae, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-2019588082, port_security_enabled=True, project_id=50a1392ce96c4024bcd36a3df403ca29, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37301, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1609, status=ACTIVE, subnets=['64b56fea-cfed-4ba1-b5bf-14193e8cd8a5'], tags=[], tenant_id=50a1392ce96c4024bcd36a3df403ca29, updated_at=2025-11-28T10:04:25Z, vlan_transparent=None, network_id=1a246530-be70-4846-9202-8f9cd6d862ae, port_security_enabled=False, project_id=50a1392ce96c4024bcd36a3df403ca29, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1633, status=DOWN, tags=[], tenant_id=50a1392ce96c4024bcd36a3df403ca29, updated_at=2025-11-28T10:04:26Z on network 1a246530-be70-4846-9202-8f9cd6d862ae#033[00m Nov 28 05:04:28 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:28.298 261084 INFO neutron.agent.dhcp.agent [None req-323e6704-3489-427e-b1a6-bdf78087d826 - - - - - -] DHCP configuration for ports {'d1d8da55-f963-488f-b18a-dae5fc16078a'} is completed#033[00m Nov 28 05:04:28 localhost dnsmasq[317792]: read /var/lib/neutron/dhcp/1a246530-be70-4846-9202-8f9cd6d862ae/addn_hosts - 1 addresses Nov 28 05:04:28 localhost dnsmasq-dhcp[317792]: read /var/lib/neutron/dhcp/1a246530-be70-4846-9202-8f9cd6d862ae/host Nov 28 05:04:28 localhost dnsmasq-dhcp[317792]: read /var/lib/neutron/dhcp/1a246530-be70-4846-9202-8f9cd6d862ae/opts Nov 28 05:04:28 localhost podman[317816]: 2025-11-28 10:04:28.332372716 +0000 UTC m=+0.064789827 container kill 57ff87a1297b4078e9c2e8fdd70de8a40b0d134df264eb7de0e31194efa6adb5 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1a246530-be70-4846-9202-8f9cd6d862ae, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:04:28 localhost neutron_sriov_agent[254147]: 2025-11-28 10:04:28.461 2 INFO neutron.agent.securitygroups_rpc [None req-35c0af25-6cf6-4373-be02-f6ff138ff337 e7c9d49fbf5f41059b8d32426e1740a6 517e4cc7e34e4fe7b49313300e5db635 - - default default] Security group member updated ['d7bc25b0-2d77-4fba-a003-707609b573d6']#033[00m Nov 28 05:04:28 localhost systemd[1]: var-lib-containers-storage-overlay-87952802ebce232016498b37ed1bc68b69594690c353cd872596f34af65028ef-merged.mount: Deactivated successfully. Nov 28 05:04:28 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-94fb03d5bbd0e112351572f1114fcf88da7f2878183421c43f856c6b3437eee1-userdata-shm.mount: Deactivated successfully. 
Nov 28 05:04:28 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:28.499 261084 INFO neutron.agent.dhcp.agent [None req-843ba996-86a5-49b0-9513-4e771998a713 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:26Z, description=, device_id=1c1f4bc1-9864-4ab2-ab0b-3419e92647f6, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=7d15d485-b9d0-4a4c-92f9-e215283394d6, ip_allocation=immediate, mac_address=fa:16:3e:29:8f:09, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:04:23Z, description=, dns_domain=, id=1a246530-be70-4846-9202-8f9cd6d862ae, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-2019588082, port_security_enabled=True, project_id=50a1392ce96c4024bcd36a3df403ca29, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37301, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1609, status=ACTIVE, subnets=['64b56fea-cfed-4ba1-b5bf-14193e8cd8a5'], tags=[], tenant_id=50a1392ce96c4024bcd36a3df403ca29, updated_at=2025-11-28T10:04:25Z, vlan_transparent=None, network_id=1a246530-be70-4846-9202-8f9cd6d862ae, port_security_enabled=False, project_id=50a1392ce96c4024bcd36a3df403ca29, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1633, status=DOWN, tags=[], tenant_id=50a1392ce96c4024bcd36a3df403ca29, updated_at=2025-11-28T10:04:26Z on network 1a246530-be70-4846-9202-8f9cd6d862ae#033[00m Nov 28 05:04:28 localhost podman[317856]: Nov 28 05:04:28 localhost podman[317856]: 2025-11-28 10:04:28.552606456 +0000 UTC m=+0.098634407 container create 
dd76ff50412881725b1ad013e1aba74711e1b2dee39620f9091f578a4ef40f7d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:04:28 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:28.578 261084 INFO neutron.agent.dhcp.agent [None req-36b16de0-af32-493f-bf2a-d6ed00e46133 - - - - - -] DHCP configuration for ports {'7d15d485-b9d0-4a4c-92f9-e215283394d6'} is completed#033[00m Nov 28 05:04:28 localhost systemd[1]: Started libpod-conmon-dd76ff50412881725b1ad013e1aba74711e1b2dee39620f9091f578a4ef40f7d.scope. Nov 28 05:04:28 localhost systemd[1]: Started libcrun container. 
Nov 28 05:04:28 localhost podman[317856]: 2025-11-28 10:04:28.510628093 +0000 UTC m=+0.056656124 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:04:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e082c3b6a4213499386110c64e819aa13cccd28eb6ce3df98bc97cfa2e4eb5d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:04:28 localhost podman[317856]: 2025-11-28 10:04:28.624341712 +0000 UTC m=+0.170369693 container init dd76ff50412881725b1ad013e1aba74711e1b2dee39620f9091f578a4ef40f7d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true) Nov 28 05:04:28 localhost podman[317856]: 2025-11-28 10:04:28.634903315 +0000 UTC m=+0.180931286 container start dd76ff50412881725b1ad013e1aba74711e1b2dee39620f9091f578a4ef40f7d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0) Nov 28 05:04:28 localhost dnsmasq[317892]: started, version 2.85 cachesize 150 Nov 28 05:04:28 localhost dnsmasq[317892]: DNS service limited to local subnets Nov 28 05:04:28 localhost dnsmasq[317892]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:04:28 localhost dnsmasq[317892]: warning: no upstream servers configured Nov 28 05:04:28 localhost dnsmasq[317892]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:04:28 localhost dnsmasq[317792]: read /var/lib/neutron/dhcp/1a246530-be70-4846-9202-8f9cd6d862ae/addn_hosts - 1 addresses Nov 28 05:04:28 localhost dnsmasq-dhcp[317792]: read /var/lib/neutron/dhcp/1a246530-be70-4846-9202-8f9cd6d862ae/host Nov 28 05:04:28 localhost podman[317895]: 2025-11-28 10:04:28.789935457 +0000 UTC m=+0.068628157 container kill 57ff87a1297b4078e9c2e8fdd70de8a40b0d134df264eb7de0e31194efa6adb5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1a246530-be70-4846-9202-8f9cd6d862ae, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 28 05:04:28 localhost dnsmasq-dhcp[317792]: read /var/lib/neutron/dhcp/1a246530-be70-4846-9202-8f9cd6d862ae/opts Nov 28 05:04:28 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:28.885 261084 INFO neutron.agent.dhcp.agent [None req-c5abd135-c599-4696-8090-d8bdebb79c7f - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '5fa5dbd0-889e-4133-9ba4-6f3810999535'} is completed#033[00m Nov 28 05:04:29 localhost dnsmasq[317892]: exiting on receipt of SIGTERM Nov 28 05:04:29 localhost podman[317932]: 2025-11-28 10:04:29.023169109 +0000 UTC m=+0.069668987 container kill dd76ff50412881725b1ad013e1aba74711e1b2dee39620f9091f578a4ef40f7d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:04:29 localhost systemd[1]: libpod-dd76ff50412881725b1ad013e1aba74711e1b2dee39620f9091f578a4ef40f7d.scope: Deactivated successfully. Nov 28 05:04:29 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:29.074 261084 INFO neutron.agent.dhcp.agent [None req-d9a1fe49-64c0-483d-a122-e87042550d1f - - - - - -] DHCP configuration for ports {'7d15d485-b9d0-4a4c-92f9-e215283394d6'} is completed#033[00m Nov 28 05:04:29 localhost podman[317946]: 2025-11-28 10:04:29.09994762 +0000 UTC m=+0.061084401 container died dd76ff50412881725b1ad013e1aba74711e1b2dee39620f9091f578a4ef40f7d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS) Nov 28 05:04:29 localhost podman[317946]: 2025-11-28 10:04:29.13452451 +0000 UTC m=+0.095661241 container cleanup dd76ff50412881725b1ad013e1aba74711e1b2dee39620f9091f578a4ef40f7d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:04:29 localhost systemd[1]: libpod-conmon-dd76ff50412881725b1ad013e1aba74711e1b2dee39620f9091f578a4ef40f7d.scope: Deactivated successfully. Nov 28 05:04:29 localhost podman[317952]: 2025-11-28 10:04:29.188668862 +0000 UTC m=+0.136162412 container remove dd76ff50412881725b1ad013e1aba74711e1b2dee39620f9091f578a4ef40f7d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:04:29 localhost neutron_sriov_agent[254147]: 2025-11-28 10:04:29.311 2 INFO neutron.agent.securitygroups_rpc [None req-535fb80e-1678-409f-9e3d-b2eaa82a20b5 e7c9d49fbf5f41059b8d32426e1740a6 517e4cc7e34e4fe7b49313300e5db635 - - default default] Security group member updated ['d7bc25b0-2d77-4fba-a003-707609b573d6']#033[00m Nov 28 05:04:29 localhost systemd[1]: var-lib-containers-storage-overlay-4e082c3b6a4213499386110c64e819aa13cccd28eb6ce3df98bc97cfa2e4eb5d-merged.mount: Deactivated successfully. Nov 28 05:04:29 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dd76ff50412881725b1ad013e1aba74711e1b2dee39620f9091f578a4ef40f7d-userdata-shm.mount: Deactivated successfully. 
Nov 28 05:04:29 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:29.605 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ef9eb238-2b1e-49f7-8a0f-72efc8854e0f) old=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2 2001:db8::f816:3eff:fef8:ad87'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:04:29 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:29.608 158130 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ef9eb238-2b1e-49f7-8a0f-72efc8854e0f in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 updated#033[00m Nov 28 05:04:29 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:29.612 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 3fb0e974-4428-459d-a213-c32931747eec IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 28 05:04:29 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:29.613 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:04:29 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:29.614 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[2256f7f6-f854-4736-8d4a-e789867a1e01]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:04:30 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:04:30 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:30.499 261084 INFO neutron.agent.linux.ip_lib [None req-553b78a4-862c-404f-8023-7f862ca89787 - - - - - -] Device tapaad9e073-ac cannot be used as it has no MAC address#033[00m Nov 28 05:04:30 localhost nova_compute[279673]: 2025-11-28 10:04:30.535 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:30 localhost nova_compute[279673]: 2025-11-28 10:04:30.553 279685 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:30 localhost kernel: device tapaad9e073-ac entered promiscuous mode Nov 28 05:04:30 localhost NetworkManager[5967]: [1764324270.5599] manager: (tapaad9e073-ac): new Generic device (/org/freedesktop/NetworkManager/Devices/46) Nov 28 05:04:30 localhost ovn_controller[152322]: 2025-11-28T10:04:30Z|00265|binding|INFO|Claiming lport aad9e073-acbe-49ad-8b8e-e03f91cd53c9 for this chassis. Nov 28 05:04:30 localhost ovn_controller[152322]: 2025-11-28T10:04:30Z|00266|binding|INFO|aad9e073-acbe-49ad-8b8e-e03f91cd53c9: Claiming unknown Nov 28 05:04:30 localhost nova_compute[279673]: 2025-11-28 10:04:30.564 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:30 localhost systemd-udevd[318028]: Network interface NamePolicy= disabled on kernel command line. Nov 28 05:04:30 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:30.581 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:2::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-f6bc7039-ebcb-4d5c-bff1-81be4c2607bb', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f6bc7039-ebcb-4d5c-bff1-81be4c2607bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '50a1392ce96c4024bcd36a3df403ca29', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7becd79-4f22-46be-87af-f81de3e971b9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=aad9e073-acbe-49ad-8b8e-e03f91cd53c9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:04:30 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:30.584 158130 INFO neutron.agent.ovn.metadata.agent [-] Port aad9e073-acbe-49ad-8b8e-e03f91cd53c9 in datapath f6bc7039-ebcb-4d5c-bff1-81be4c2607bb bound to our chassis#033[00m Nov 28 05:04:30 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:30.586 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f6bc7039-ebcb-4d5c-bff1-81be4c2607bb or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:04:30 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:30.587 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[b33c8077-075a-40bb-a53b-3e91e35be371]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:04:30 localhost journal[227875]: ethtool ioctl error on tapaad9e073-ac: No such device Nov 28 05:04:30 localhost ovn_controller[152322]: 2025-11-28T10:04:30Z|00267|binding|INFO|Setting lport aad9e073-acbe-49ad-8b8e-e03f91cd53c9 ovn-installed in OVS Nov 28 05:04:30 localhost ovn_controller[152322]: 2025-11-28T10:04:30Z|00268|binding|INFO|Setting lport aad9e073-acbe-49ad-8b8e-e03f91cd53c9 up in Southbound Nov 28 05:04:30 localhost journal[227875]: ethtool ioctl error on tapaad9e073-ac: No such device Nov 28 05:04:30 localhost nova_compute[279673]: 2025-11-28 10:04:30.596 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:30 localhost journal[227875]: ethtool ioctl error on tapaad9e073-ac: No such device Nov 28 05:04:30 localhost journal[227875]: ethtool ioctl error on tapaad9e073-ac: No such device Nov 28 05:04:30 localhost journal[227875]: ethtool ioctl error on tapaad9e073-ac: No such device Nov 28 05:04:30 localhost journal[227875]: ethtool ioctl error on tapaad9e073-ac: No such device Nov 28 05:04:30 localhost journal[227875]: ethtool ioctl error on tapaad9e073-ac: No such device Nov 28 05:04:30 localhost journal[227875]: ethtool ioctl error on tapaad9e073-ac: No such device Nov 28 05:04:30 localhost nova_compute[279673]: 2025-11-28 10:04:30.654 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:30 localhost nova_compute[279673]: 2025-11-28 10:04:30.680 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:30 localhost podman[318064]: Nov 28 05:04:30 localhost podman[318064]: 2025-11-28 10:04:30.778523258 +0000 UTC m=+0.104002501 container create cc917b7b25b70227e828e17b9dcf42c71ef1878b4d8b2156d8f88edd6aed1124 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true) Nov 28 05:04:30 localhost systemd[1]: Started libpod-conmon-cc917b7b25b70227e828e17b9dcf42c71ef1878b4d8b2156d8f88edd6aed1124.scope. Nov 28 05:04:30 localhost systemd[1]: Started libcrun container. 
Nov 28 05:04:30 localhost podman[318064]: 2025-11-28 10:04:30.733180268 +0000 UTC m=+0.058659541 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:04:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05622c3ff6db343faa6e0358a4ee5a25704b2d77fda63eb773bb03dec5bdf99e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:04:30 localhost podman[318064]: 2025-11-28 10:04:30.844463006 +0000 UTC m=+0.169942249 container init cc917b7b25b70227e828e17b9dcf42c71ef1878b4d8b2156d8f88edd6aed1124 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:04:30 localhost podman[318064]: 2025-11-28 10:04:30.856600965 +0000 UTC m=+0.182080228 container start cc917b7b25b70227e828e17b9dcf42c71ef1878b4d8b2156d8f88edd6aed1124 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 28 05:04:30 localhost dnsmasq[318092]: started, version 2.85 cachesize 150 Nov 28 05:04:30 localhost dnsmasq[318092]: DNS service limited to local subnets Nov 28 05:04:30 localhost dnsmasq[318092]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:04:30 localhost dnsmasq[318092]: warning: no upstream servers configured Nov 28 05:04:30 localhost dnsmasq-dhcp[318092]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 28 05:04:30 localhost dnsmasq[318092]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:04:30 localhost dnsmasq-dhcp[318092]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:04:30 localhost dnsmasq-dhcp[318092]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:04:31 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:31.146 261084 INFO neutron.agent.dhcp.agent [None req-56b7b415-30d4-4ab6-9aa5-fde3ab2f2ff0 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '5fa5dbd0-889e-4133-9ba4-6f3810999535'} is completed#033[00m Nov 28 05:04:31 localhost podman[318130]: Nov 28 05:04:31 localhost podman[318130]: 2025-11-28 10:04:31.47660832 +0000 UTC m=+0.046224615 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:04:31 localhost podman[318130]: 2025-11-28 10:04:31.585659585 +0000 UTC m=+0.155275850 container create 10a06f70e7dec31415a1ef25bcab7edf7cf9e9c5cfc814d8d21d52652069f730 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f6bc7039-ebcb-4d5c-bff1-81be4c2607bb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true) Nov 28 05:04:31 localhost systemd[1]: Started libpod-conmon-10a06f70e7dec31415a1ef25bcab7edf7cf9e9c5cfc814d8d21d52652069f730.scope. 
Nov 28 05:04:31 localhost systemd[1]: Started libcrun container. Nov 28 05:04:31 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/578830e16dc254a672039252f1b4aca3106a0a33f8e3c5f9da3689144398b5d1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:04:31 localhost podman[318130]: 2025-11-28 10:04:31.651655827 +0000 UTC m=+0.221272082 container init 10a06f70e7dec31415a1ef25bcab7edf7cf9e9c5cfc814d8d21d52652069f730 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f6bc7039-ebcb-4d5c-bff1-81be4c2607bb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:04:31 localhost podman[318130]: 2025-11-28 10:04:31.66119732 +0000 UTC m=+0.230813585 container start 10a06f70e7dec31415a1ef25bcab7edf7cf9e9c5cfc814d8d21d52652069f730 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f6bc7039-ebcb-4d5c-bff1-81be4c2607bb, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 28 05:04:31 localhost dnsmasq[318149]: started, version 2.85 cachesize 150 Nov 28 05:04:31 localhost dnsmasq[318149]: DNS service limited to local subnets Nov 28 05:04:31 localhost dnsmasq[318149]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:04:31 
localhost dnsmasq[318149]: warning: no upstream servers configured Nov 28 05:04:31 localhost dnsmasq-dhcp[318149]: DHCPv6, static leases only on 2001:db8:2::, lease time 1d Nov 28 05:04:31 localhost dnsmasq[318149]: read /var/lib/neutron/dhcp/f6bc7039-ebcb-4d5c-bff1-81be4c2607bb/addn_hosts - 0 addresses Nov 28 05:04:31 localhost dnsmasq-dhcp[318149]: read /var/lib/neutron/dhcp/f6bc7039-ebcb-4d5c-bff1-81be4c2607bb/host Nov 28 05:04:31 localhost dnsmasq-dhcp[318149]: read /var/lib/neutron/dhcp/f6bc7039-ebcb-4d5c-bff1-81be4c2607bb/opts Nov 28 05:04:31 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:31.717 261084 INFO neutron.agent.dhcp.agent [None req-553b78a4-862c-404f-8023-7f862ca89787 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:30Z, description=, device_id=1c1f4bc1-9864-4ab2-ab0b-3419e92647f6, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=d578c17e-efea-42d9-9b0e-1fc0d3472d19, ip_allocation=immediate, mac_address=fa:16:3e:d4:f8:84, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:04:27Z, description=, dns_domain=, id=f6bc7039-ebcb-4d5c-bff1-81be4c2607bb, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1221052021, port_security_enabled=True, project_id=50a1392ce96c4024bcd36a3df403ca29, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9596, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1646, status=ACTIVE, subnets=['d7f1568e-313e-48e3-9b4d-32767b1bddcf'], tags=[], tenant_id=50a1392ce96c4024bcd36a3df403ca29, updated_at=2025-11-28T10:04:29Z, vlan_transparent=None, 
network_id=f6bc7039-ebcb-4d5c-bff1-81be4c2607bb, port_security_enabled=False, project_id=50a1392ce96c4024bcd36a3df403ca29, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1660, status=DOWN, tags=[], tenant_id=50a1392ce96c4024bcd36a3df403ca29, updated_at=2025-11-28T10:04:30Z on network f6bc7039-ebcb-4d5c-bff1-81be4c2607bb#033[00m Nov 28 05:04:31 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:31.803 261084 INFO neutron.agent.dhcp.agent [None req-ae08e445-9120-431b-88c1-1eef8075a573 - - - - - -] DHCP configuration for ports {'f17c271d-1e05-4708-b116-d1572df0ff8c'} is completed#033[00m Nov 28 05:04:31 localhost dnsmasq[318149]: read /var/lib/neutron/dhcp/f6bc7039-ebcb-4d5c-bff1-81be4c2607bb/addn_hosts - 1 addresses Nov 28 05:04:31 localhost dnsmasq-dhcp[318149]: read /var/lib/neutron/dhcp/f6bc7039-ebcb-4d5c-bff1-81be4c2607bb/host Nov 28 05:04:31 localhost podman[318168]: 2025-11-28 10:04:31.91911743 +0000 UTC m=+0.061640427 container kill 10a06f70e7dec31415a1ef25bcab7edf7cf9e9c5cfc814d8d21d52652069f730 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f6bc7039-ebcb-4d5c-bff1-81be4c2607bb, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS) Nov 28 05:04:31 localhost dnsmasq-dhcp[318149]: read /var/lib/neutron/dhcp/f6bc7039-ebcb-4d5c-bff1-81be4c2607bb/opts Nov 28 05:04:32 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:32.151 261084 INFO neutron.agent.dhcp.agent [None req-7716dda8-e09e-4c11-845f-b15d2294fa79 - - - - - -] DHCP configuration for ports {'d578c17e-efea-42d9-9b0e-1fc0d3472d19'} is completed#033[00m Nov 28 05:04:33 localhost 
ovn_metadata_agent[158125]: 2025-11-28 10:04:33.049 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2 2001:db8::f816:3eff:fef8:ad87'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ef9eb238-2b1e-49f7-8a0f-72efc8854e0f) old=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:04:33 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:33.052 158130 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ef9eb238-2b1e-49f7-8a0f-72efc8854e0f in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 updated#033[00m Nov 28 05:04:33 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:33.056 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 3fb0e974-4428-459d-a213-c32931747eec IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 28 05:04:33 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:33.056 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:04:33 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:33.058 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[5256cdbb-a39e-4137-8899-cf76d95fa840]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:04:33 localhost neutron_sriov_agent[254147]: 2025-11-28 10:04:33.170 2 INFO neutron.agent.securitygroups_rpc [None req-e7bb8635-ea1b-4f3c-951e-d95d847ad39e 4d2a0ee370da4e688f1d8f5c639f278d ac0f848cde7c47c998a8b80087a3b59d - - default default] Security group member updated ['6784caad-903b-4e5a-b2eb-6bb1d9f8710a']#033[00m Nov 28 05:04:33 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:33.295 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:30Z, description=, 
device_id=1c1f4bc1-9864-4ab2-ab0b-3419e92647f6, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=d578c17e-efea-42d9-9b0e-1fc0d3472d19, ip_allocation=immediate, mac_address=fa:16:3e:d4:f8:84, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:04:27Z, description=, dns_domain=, id=f6bc7039-ebcb-4d5c-bff1-81be4c2607bb, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1221052021, port_security_enabled=True, project_id=50a1392ce96c4024bcd36a3df403ca29, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9596, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1646, status=ACTIVE, subnets=['d7f1568e-313e-48e3-9b4d-32767b1bddcf'], tags=[], tenant_id=50a1392ce96c4024bcd36a3df403ca29, updated_at=2025-11-28T10:04:29Z, vlan_transparent=None, network_id=f6bc7039-ebcb-4d5c-bff1-81be4c2607bb, port_security_enabled=False, project_id=50a1392ce96c4024bcd36a3df403ca29, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1660, status=DOWN, tags=[], tenant_id=50a1392ce96c4024bcd36a3df403ca29, updated_at=2025-11-28T10:04:30Z on network f6bc7039-ebcb-4d5c-bff1-81be4c2607bb#033[00m Nov 28 05:04:33 localhost dnsmasq[318092]: exiting on receipt of SIGTERM Nov 28 05:04:33 localhost podman[318222]: 2025-11-28 10:04:33.526310403 +0000 UTC m=+0.085126871 container kill cc917b7b25b70227e828e17b9dcf42c71ef1878b4d8b2156d8f88edd6aed1124 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:04:33 localhost systemd[1]: tmp-crun.kwqwY2.mount: Deactivated successfully. Nov 28 05:04:33 localhost systemd[1]: libpod-cc917b7b25b70227e828e17b9dcf42c71ef1878b4d8b2156d8f88edd6aed1124.scope: Deactivated successfully. Nov 28 05:04:33 localhost dnsmasq[318149]: read /var/lib/neutron/dhcp/f6bc7039-ebcb-4d5c-bff1-81be4c2607bb/addn_hosts - 1 addresses Nov 28 05:04:33 localhost dnsmasq-dhcp[318149]: read /var/lib/neutron/dhcp/f6bc7039-ebcb-4d5c-bff1-81be4c2607bb/host Nov 28 05:04:33 localhost dnsmasq-dhcp[318149]: read /var/lib/neutron/dhcp/f6bc7039-ebcb-4d5c-bff1-81be4c2607bb/opts Nov 28 05:04:33 localhost podman[318234]: 2025-11-28 10:04:33.587664151 +0000 UTC m=+0.077095910 container kill 10a06f70e7dec31415a1ef25bcab7edf7cf9e9c5cfc814d8d21d52652069f730 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f6bc7039-ebcb-4d5c-bff1-81be4c2607bb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:04:33 localhost podman[318241]: 2025-11-28 10:04:33.599954853 +0000 UTC m=+0.057617351 container died cc917b7b25b70227e828e17b9dcf42c71ef1878b4d8b2156d8f88edd6aed1124 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, 
org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 28 05:04:33 localhost podman[318241]: 2025-11-28 10:04:33.775050421 +0000 UTC m=+0.232712919 container cleanup cc917b7b25b70227e828e17b9dcf42c71ef1878b4d8b2156d8f88edd6aed1124 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Nov 28 05:04:33 localhost systemd[1]: libpod-conmon-cc917b7b25b70227e828e17b9dcf42c71ef1878b4d8b2156d8f88edd6aed1124.scope: Deactivated successfully. Nov 28 05:04:33 localhost podman[318252]: 2025-11-28 10:04:33.803150436 +0000 UTC m=+0.239432202 container remove cc917b7b25b70227e828e17b9dcf42c71ef1878b4d8b2156d8f88edd6aed1124 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:04:33 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:33.807 261084 INFO neutron.agent.dhcp.agent [None req-9c86f485-3701-4964-8ca6-129bac63be7e - - - - - -] DHCP configuration for ports {'d578c17e-efea-42d9-9b0e-1fc0d3472d19'} is completed#033[00m Nov 28 05:04:34 localhost systemd[1]: 
var-lib-containers-storage-overlay-05622c3ff6db343faa6e0358a4ee5a25704b2d77fda63eb773bb03dec5bdf99e-merged.mount: Deactivated successfully. Nov 28 05:04:34 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cc917b7b25b70227e828e17b9dcf42c71ef1878b4d8b2156d8f88edd6aed1124-userdata-shm.mount: Deactivated successfully. Nov 28 05:04:34 localhost neutron_sriov_agent[254147]: 2025-11-28 10:04:34.522 2 INFO neutron.agent.securitygroups_rpc [None req-406cfd0d-88dd-4d36-9649-40665b36b8d2 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:04:34 localhost podman[318335]: Nov 28 05:04:34 localhost podman[318335]: 2025-11-28 10:04:34.810975763 +0000 UTC m=+0.099278955 container create a30265201545b0b4641951c085d3c996892c5ea517ae2f221afc27d821c8106b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:04:34 localhost systemd[1]: Started libpod-conmon-a30265201545b0b4641951c085d3c996892c5ea517ae2f221afc27d821c8106b.scope. Nov 28 05:04:34 localhost podman[318335]: 2025-11-28 10:04:34.76341806 +0000 UTC m=+0.051721302 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:04:34 localhost systemd[1]: Started libcrun container. 
Nov 28 05:04:34 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27c3d356aba2ea0d98747d934ff4c0b639ec6e4c8c998430d9479d7fef6a3cf4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:04:34 localhost podman[318335]: 2025-11-28 10:04:34.888400792 +0000 UTC m=+0.176703984 container init a30265201545b0b4641951c085d3c996892c5ea517ae2f221afc27d821c8106b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:04:34 localhost podman[318335]: 2025-11-28 10:04:34.899108339 +0000 UTC m=+0.187411531 container start a30265201545b0b4641951c085d3c996892c5ea517ae2f221afc27d821c8106b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:04:34 localhost dnsmasq[318354]: started, version 2.85 cachesize 150 Nov 28 05:04:34 localhost dnsmasq[318354]: DNS service limited to local subnets Nov 28 05:04:34 localhost dnsmasq[318354]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:04:34 localhost dnsmasq[318354]: warning: no upstream servers 
configured Nov 28 05:04:34 localhost dnsmasq-dhcp[318354]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 28 05:04:34 localhost dnsmasq-dhcp[318354]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 28 05:04:34 localhost dnsmasq[318354]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:04:34 localhost dnsmasq-dhcp[318354]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:04:34 localhost dnsmasq-dhcp[318354]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:04:34 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:34.956 261084 INFO neutron.agent.dhcp.agent [None req-616e4e52-5787-4518-8ed1-3f9d7ee5c0ab - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:33Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=cddc71e8-dfe8-4bc4-ac32-c8e62bc2fd52, ip_allocation=immediate, mac_address=fa:16:3e:ab:b5:4e, name=tempest-NetworksTestDHCPv6-1132459699, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=35, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['56931ec4-adb1-4ad8-98f9-465d3608c4a7', '7498cd52-8707-4758-a964-4aef6a4317ac'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, 
updated_at=2025-11-28T10:04:30Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], standard_attr_id=1683, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:04:34Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1#033[00m Nov 28 05:04:35 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:04:35 localhost dnsmasq[318354]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 2 addresses Nov 28 05:04:35 localhost dnsmasq-dhcp[318354]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:04:35 localhost dnsmasq-dhcp[318354]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:04:35 localhost podman[318373]: 2025-11-28 10:04:35.17309548 +0000 UTC m=+0.046154634 container kill a30265201545b0b4641951c085d3c996892c5ea517ae2f221afc27d821c8106b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125) Nov 28 05:04:35 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:35.236 261084 INFO neutron.agent.dhcp.agent [None req-fca8271d-824a-4f4a-9fc6-3b70da62dc94 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '5fa5dbd0-889e-4133-9ba4-6f3810999535'} 
is completed#033[00m Nov 28 05:04:35 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:35.472 261084 INFO neutron.agent.dhcp.agent [None req-ec1a19c5-4a32-42d6-a3a5-f85a251b619d - - - - - -] DHCP configuration for ports {'cddc71e8-dfe8-4bc4-ac32-c8e62bc2fd52'} is completed#033[00m Nov 28 05:04:35 localhost nova_compute[279673]: 2025-11-28 10:04:35.539 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:35 localhost neutron_sriov_agent[254147]: 2025-11-28 10:04:35.893 2 INFO neutron.agent.securitygroups_rpc [None req-e5e44e33-1445-4a9a-aa0f-e3e5f136c603 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:04:36 localhost dnsmasq[318354]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:04:36 localhost dnsmasq-dhcp[318354]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:04:36 localhost dnsmasq-dhcp[318354]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:04:36 localhost podman[318413]: 2025-11-28 10:04:36.148158529 +0000 UTC m=+0.062365218 container kill a30265201545b0b4641951c085d3c996892c5ea517ae2f221afc27d821c8106b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 28 05:04:36 localhost neutron_sriov_agent[254147]: 2025-11-28 10:04:36.670 2 INFO neutron.agent.securitygroups_rpc [None 
req-260c5785-8a89-47b1-924c-143d50af86e5 4d2a0ee370da4e688f1d8f5c639f278d ac0f848cde7c47c998a8b80087a3b59d - - default default] Security group member updated ['6784caad-903b-4e5a-b2eb-6bb1d9f8710a']#033[00m Nov 28 05:04:36 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:36.686 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:04:37 localhost dnsmasq[318354]: exiting on receipt of SIGTERM Nov 28 05:04:37 localhost systemd[1]: libpod-a30265201545b0b4641951c085d3c996892c5ea517ae2f221afc27d821c8106b.scope: Deactivated successfully. Nov 28 05:04:37 localhost podman[318454]: 2025-11-28 10:04:37.173125959 +0000 UTC m=+0.065126168 container kill a30265201545b0b4641951c085d3c996892c5ea517ae2f221afc27d821c8106b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:04:37 localhost podman[318467]: 2025-11-28 10:04:37.246682636 +0000 UTC m=+0.062910974 container died a30265201545b0b4641951c085d3c996892c5ea517ae2f221afc27d821c8106b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:04:37 localhost systemd[1]: tmp-crun.L4kkfN.mount: 
Deactivated successfully. Nov 28 05:04:37 localhost podman[318467]: 2025-11-28 10:04:37.295434143 +0000 UTC m=+0.111662421 container cleanup a30265201545b0b4641951c085d3c996892c5ea517ae2f221afc27d821c8106b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:04:37 localhost systemd[1]: libpod-conmon-a30265201545b0b4641951c085d3c996892c5ea517ae2f221afc27d821c8106b.scope: Deactivated successfully. Nov 28 05:04:37 localhost podman[318474]: 2025-11-28 10:04:37.334108191 +0000 UTC m=+0.134362681 container remove a30265201545b0b4641951c085d3c996892c5ea517ae2f221afc27d821c8106b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3) Nov 28 05:04:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 05:04:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. 
Nov 28 05:04:38 localhost podman[318525]: 2025-11-28 10:04:38.104364192 +0000 UTC m=+0.084018798 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, managed_by=edpm_ansible, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Nov 28 05:04:38 localhost systemd[1]: var-lib-containers-storage-overlay-27c3d356aba2ea0d98747d934ff4c0b639ec6e4c8c998430d9479d7fef6a3cf4-merged.mount: 
Deactivated successfully. Nov 28 05:04:38 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a30265201545b0b4641951c085d3c996892c5ea517ae2f221afc27d821c8106b-userdata-shm.mount: Deactivated successfully. Nov 28 05:04:38 localhost podman[318526]: 2025-11-28 10:04:38.172366821 +0000 UTC m=+0.145866771 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 05:04:38 localhost podman[318525]: 2025-11-28 10:04:38.249144601 +0000 UTC m=+0.228799157 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 
(image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:04:38 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 05:04:38 localhost podman[318526]: 2025-11-28 10:04:38.306891335 +0000 UTC m=+0.280391265 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 28 05:04:38 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 05:04:38 localhost podman[318588]: Nov 28 05:04:38 localhost podman[318588]: 2025-11-28 10:04:38.335922408 +0000 UTC m=+0.099334868 container create 34ef2e8e26f363da5521fa32c45211d9260281108154bd7307ddad892c879223 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Nov 28 05:04:38 localhost systemd[1]: Started libpod-conmon-34ef2e8e26f363da5521fa32c45211d9260281108154bd7307ddad892c879223.scope. Nov 28 05:04:38 localhost podman[318588]: 2025-11-28 10:04:38.283520646 +0000 UTC m=+0.046933156 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:04:38 localhost systemd[1]: Started libcrun container. 
Nov 28 05:04:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/088e71771c3215ab6b11832e0cd5acfbf62a86028c4cb9bd53ee87603664aa23/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:04:38 localhost podman[318588]: 2025-11-28 10:04:38.412927354 +0000 UTC m=+0.176339804 container init 34ef2e8e26f363da5521fa32c45211d9260281108154bd7307ddad892c879223 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2) Nov 28 05:04:38 localhost podman[318588]: 2025-11-28 10:04:38.42254482 +0000 UTC m=+0.185957260 container start 34ef2e8e26f363da5521fa32c45211d9260281108154bd7307ddad892c879223 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:04:38 localhost dnsmasq[318607]: started, version 2.85 cachesize 150 Nov 28 05:04:38 localhost dnsmasq[318607]: DNS service limited to local subnets Nov 28 05:04:38 localhost dnsmasq[318607]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:04:38 localhost dnsmasq[318607]: warning: no upstream servers 
configured Nov 28 05:04:38 localhost dnsmasq-dhcp[318607]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 28 05:04:38 localhost dnsmasq[318607]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:04:38 localhost dnsmasq-dhcp[318607]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:04:38 localhost dnsmasq-dhcp[318607]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:04:38 localhost dnsmasq[318607]: exiting on receipt of SIGTERM Nov 28 05:04:38 localhost podman[318625]: 2025-11-28 10:04:38.781970618 +0000 UTC m=+0.062987286 container kill 34ef2e8e26f363da5521fa32c45211d9260281108154bd7307ddad892c879223 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:04:38 localhost systemd[1]: libpod-34ef2e8e26f363da5521fa32c45211d9260281108154bd7307ddad892c879223.scope: Deactivated successfully. 
Nov 28 05:04:38 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:38.825 261084 INFO neutron.agent.dhcp.agent [None req-0bedb45d-e760-4fd5-877a-989c79e8bf6b - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '5fa5dbd0-889e-4133-9ba4-6f3810999535'} is completed#033[00m Nov 28 05:04:38 localhost podman[318637]: 2025-11-28 10:04:38.859582262 +0000 UTC m=+0.060452223 container died 34ef2e8e26f363da5521fa32c45211d9260281108154bd7307ddad892c879223 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:04:38 localhost podman[318637]: 2025-11-28 10:04:38.89547449 +0000 UTC m=+0.096344411 container cleanup 34ef2e8e26f363da5521fa32c45211d9260281108154bd7307ddad892c879223 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 28 05:04:38 localhost systemd[1]: libpod-conmon-34ef2e8e26f363da5521fa32c45211d9260281108154bd7307ddad892c879223.scope: Deactivated successfully. 
Nov 28 05:04:38 localhost podman[318639]: 2025-11-28 10:04:38.935840237 +0000 UTC m=+0.128037270 container remove 34ef2e8e26f363da5521fa32c45211d9260281108154bd7307ddad892c879223 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:04:39 localhost ovn_controller[152322]: 2025-11-28T10:04:39Z|00269|binding|INFO|Releasing lport 5fa5dbd0-889e-4133-9ba4-6f3810999535 from this chassis (sb_readonly=0) Nov 28 05:04:39 localhost ovn_controller[152322]: 2025-11-28T10:04:39Z|00270|binding|INFO|Setting lport 5fa5dbd0-889e-4133-9ba4-6f3810999535 down in Southbound Nov 28 05:04:39 localhost nova_compute[279673]: 2025-11-28 10:04:39.007 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:39 localhost kernel: device tap5fa5dbd0-88 left promiscuous mode Nov 28 05:04:39 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:39.016 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fe5a:389/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 
'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=5fa5dbd0-889e-4133-9ba4-6f3810999535) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:04:39 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:39.018 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 5fa5dbd0-889e-4133-9ba4-6f3810999535 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis#033[00m Nov 28 05:04:39 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:39.022 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:04:39 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:39.023 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[71752455-9ff2-42af-bdc4-fe84195c5469]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:04:39 localhost nova_compute[279673]: 2025-11-28 10:04:39.031 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:39 localhost systemd[1]: tmp-crun.TpxJiB.mount: Deactivated successfully. 
Nov 28 05:04:39 localhost systemd[1]: var-lib-containers-storage-overlay-088e71771c3215ab6b11832e0cd5acfbf62a86028c4cb9bd53ee87603664aa23-merged.mount: Deactivated successfully. Nov 28 05:04:39 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-34ef2e8e26f363da5521fa32c45211d9260281108154bd7307ddad892c879223-userdata-shm.mount: Deactivated successfully. Nov 28 05:04:39 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:39.262 261084 INFO neutron.agent.dhcp.agent [None req-e36f3119-506c-475f-a669-5900c821c31a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:04:39 localhost systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully. Nov 28 05:04:39 localhost neutron_sriov_agent[254147]: 2025-11-28 10:04:39.535 2 INFO neutron.agent.securitygroups_rpc [None req-cb662c54-c616-44d2-81c7-5ec9bf360652 4d2a0ee370da4e688f1d8f5c639f278d ac0f848cde7c47c998a8b80087a3b59d - - default default] Security group member updated ['6784caad-903b-4e5a-b2eb-6bb1d9f8710a']#033[00m Nov 28 05:04:39 localhost nova_compute[279673]: 2025-11-28 10:04:39.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:04:39 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:39.998 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:ad:87 2001:db8::f816:3eff:fef8:ad87'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': 
'2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '18', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ef9eb238-2b1e-49f7-8a0f-72efc8854e0f) old=Port_Binding(mac=['fa:16:3e:f8:ad:87 10.100.0.2 2001:db8::f816:3eff:fef8:ad87'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:04:40 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:40.000 158130 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ef9eb238-2b1e-49f7-8a0f-72efc8854e0f in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 updated#033[00m Nov 28 05:04:40 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:40.004 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the 
namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:04:40 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:40.005 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[f6085299-67af-48de-ae80-0a054aa20982]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:04:40 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:04:40 localhost podman[238687]: time="2025-11-28T10:04:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:04:40 localhost podman[238687]: @ - - [28/Nov/2025:10:04:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159326 "" "Go-http-client/1.1" Nov 28 05:04:40 localhost podman[238687]: @ - - [28/Nov/2025:10:04:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20219 "" "Go-http-client/1.1" Nov 28 05:04:40 localhost nova_compute[279673]: 2025-11-28 10:04:40.585 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:40 localhost nova_compute[279673]: 2025-11-28 10:04:40.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:04:40 localhost nova_compute[279673]: 2025-11-28 10:04:40.770 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 28 05:04:40 localhost nova_compute[279673]: 2025-11-28 10:04:40.777 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:40 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:40.890 261084 INFO neutron.agent.linux.ip_lib [None req-9a421c33-65be-46b0-adca-b28d3ed45bb5 - - - - - -] Device tap7ba9b167-5b cannot be used as it has no MAC address#033[00m Nov 28 05:04:40 localhost neutron_sriov_agent[254147]: 2025-11-28 10:04:40.902 2 INFO neutron.agent.securitygroups_rpc [None req-019d9e71-83ce-440c-bb58-c6c7df87e29f 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:04:40 localhost nova_compute[279673]: 2025-11-28 10:04:40.908 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:40 localhost kernel: device tap7ba9b167-5b entered promiscuous mode Nov 28 05:04:40 localhost ovn_controller[152322]: 2025-11-28T10:04:40Z|00271|binding|INFO|Claiming lport 7ba9b167-5b98-46f1-8868-b05a091f3734 for this chassis. Nov 28 05:04:40 localhost ovn_controller[152322]: 2025-11-28T10:04:40Z|00272|binding|INFO|7ba9b167-5b98-46f1-8868-b05a091f3734: Claiming unknown Nov 28 05:04:40 localhost nova_compute[279673]: 2025-11-28 10:04:40.917 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:40 localhost NetworkManager[5967]: [1764324280.9178] manager: (tap7ba9b167-5b): new Generic device (/org/freedesktop/NetworkManager/Devices/47) Nov 28 05:04:40 localhost systemd-udevd[318680]: Network interface NamePolicy= disabled on kernel command line. 
Nov 28 05:04:40 localhost ovn_controller[152322]: 2025-11-28T10:04:40Z|00273|binding|INFO|Setting lport 7ba9b167-5b98-46f1-8868-b05a091f3734 ovn-installed in OVS Nov 28 05:04:40 localhost ovn_controller[152322]: 2025-11-28T10:04:40Z|00274|binding|INFO|Setting lport 7ba9b167-5b98-46f1-8868-b05a091f3734 up in Southbound Nov 28 05:04:40 localhost nova_compute[279673]: 2025-11-28 10:04:40.929 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:40 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:40.929 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fefb:5081/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7ba9b167-5b98-46f1-8868-b05a091f3734) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:04:40 
localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:40.932 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 7ba9b167-5b98-46f1-8868-b05a091f3734 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis
Nov 28 05:04:40 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:40.936 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 4132ac1f-fd14-4d9d-a235-bdf90abef7f7 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 05:04:40 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:40.936 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 05:04:40 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:40.940 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[e402a50c-cc47-463d-8aa7-b5033f48119f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 05:04:40 localhost journal[227875]: ethtool ioctl error on tap7ba9b167-5b: No such device
Nov 28 05:04:40 localhost nova_compute[279673]: 2025-11-28 10:04:40.949 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:04:40 localhost journal[227875]: ethtool ioctl error on tap7ba9b167-5b: No such device
Nov 28 05:04:40 localhost journal[227875]: ethtool ioctl error on tap7ba9b167-5b: No such device
Nov 28 05:04:40 localhost journal[227875]: ethtool ioctl error on tap7ba9b167-5b: No such device
Nov 28 05:04:40 localhost journal[227875]: ethtool ioctl error on tap7ba9b167-5b: No such device
Nov 28 05:04:40 localhost journal[227875]: ethtool ioctl error on tap7ba9b167-5b: No such device
Nov 28 05:04:40 localhost journal[227875]: ethtool ioctl error on tap7ba9b167-5b: No such device
Nov 28 05:04:40 localhost journal[227875]: ethtool ioctl error on tap7ba9b167-5b: No such device
Nov 28 05:04:40 localhost nova_compute[279673]: 2025-11-28 10:04:40.986 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:04:41 localhost nova_compute[279673]: 2025-11-28 10:04:41.007 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:04:41 localhost neutron_sriov_agent[254147]: 2025-11-28 10:04:41.389 2 INFO neutron.agent.securitygroups_rpc [None req-bb6bd82f-3718-4e2c-b707-6af77a5385f7 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 05:04:41 localhost podman[318751]:
Nov 28 05:04:41 localhost podman[318751]: 2025-11-28 10:04:41.700933578 +0000 UTC m=+0.066494596 container create eaedbc688cf57c6d20f6699e169c24aa209259dd8157ce1da52cd5b02d2fa213 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 28 05:04:41 localhost systemd[1]: Started libpod-conmon-eaedbc688cf57c6d20f6699e169c24aa209259dd8157ce1da52cd5b02d2fa213.scope.
Nov 28 05:04:41 localhost systemd[1]: Started libcrun container.
Nov 28 05:04:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1b5412e2fb13d3391b4a358fa30deba9085121686461d41d600f84f99fbf2d44/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 05:04:41 localhost podman[318751]: 2025-11-28 10:04:41.747294146 +0000 UTC m=+0.112855164 container init eaedbc688cf57c6d20f6699e169c24aa209259dd8157ce1da52cd5b02d2fa213 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 05:04:41 localhost podman[318751]: 2025-11-28 10:04:41.754888214 +0000 UTC m=+0.120449222 container start eaedbc688cf57c6d20f6699e169c24aa209259dd8157ce1da52cd5b02d2fa213 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 28 05:04:41 localhost podman[318751]: 2025-11-28 10:04:41.667910922 +0000 UTC m=+0.033471980 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 05:04:41 localhost dnsmasq[318769]: started, version 2.85 cachesize 150
Nov 28 05:04:41 localhost dnsmasq[318769]: DNS service limited to local subnets
Nov 28 05:04:41 localhost dnsmasq[318769]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 05:04:41 localhost dnsmasq[318769]: warning: no upstream servers configured
Nov 28 05:04:41 localhost dnsmasq[318769]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 05:04:41 localhost nova_compute[279673]: 2025-11-28 10:04:41.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 05:04:41 localhost nova_compute[279673]: 2025-11-28 10:04:41.772 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 05:04:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:41.803 261084 INFO neutron.agent.dhcp.agent [None req-9a421c33-65be-46b0-adca-b28d3ed45bb5 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:40Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=84c9c7ae-4b7f-4790-858d-fbc75245c737, ip_allocation=immediate, mac_address=fa:16:3e:60:5e:59, name=tempest-NetworksTestDHCPv6-1618658599, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=38, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['75d6b4c5-411d-4659-ad69-2f7df9b2832a'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:04:39Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], standard_attr_id=1735, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:04:40Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1
Nov 28 05:04:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:41.905 261084 INFO neutron.agent.dhcp.agent [None req-10b65453-8fd6-440c-9cdd-4bffc9aa6417 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f'} is completed
Nov 28 05:04:41 localhost dnsmasq[318769]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 1 addresses
Nov 28 05:04:41 localhost podman[318786]: 2025-11-28 10:04:41.937080264 +0000 UTC m=+0.038496274 container kill eaedbc688cf57c6d20f6699e169c24aa209259dd8157ce1da52cd5b02d2fa213 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 05:04:42 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:42.121 261084 INFO neutron.agent.dhcp.agent [None req-43a8babd-8b8b-43e8-a67c-22b76e0db50c - - - - - -] DHCP configuration for ports {'84c9c7ae-4b7f-4790-858d-fbc75245c737'} is completed
Nov 28 05:04:42 localhost dnsmasq[318769]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 05:04:42 localhost podman[318824]: 2025-11-28 10:04:42.137124797 +0000 UTC m=+0.039068050 container kill eaedbc688cf57c6d20f6699e169c24aa209259dd8157ce1da52cd5b02d2fa213 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 05:04:42 localhost dnsmasq[318769]: exiting on receipt of SIGTERM
Nov 28 05:04:42 localhost podman[318862]: 2025-11-28 10:04:42.531738254 +0000 UTC m=+0.061447402 container kill eaedbc688cf57c6d20f6699e169c24aa209259dd8157ce1da52cd5b02d2fa213 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 05:04:42 localhost systemd[1]: libpod-eaedbc688cf57c6d20f6699e169c24aa209259dd8157ce1da52cd5b02d2fa213.scope: Deactivated successfully.
Nov 28 05:04:42 localhost podman[318876]: 2025-11-28 10:04:42.609320947 +0000 UTC m=+0.060718441 container died eaedbc688cf57c6d20f6699e169c24aa209259dd8157ce1da52cd5b02d2fa213 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 05:04:42 localhost podman[318876]: 2025-11-28 10:04:42.640307145 +0000 UTC m=+0.091704609 container cleanup eaedbc688cf57c6d20f6699e169c24aa209259dd8157ce1da52cd5b02d2fa213 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 05:04:42 localhost systemd[1]: libpod-conmon-eaedbc688cf57c6d20f6699e169c24aa209259dd8157ce1da52cd5b02d2fa213.scope: Deactivated successfully.
Nov 28 05:04:42 localhost podman[318877]: 2025-11-28 10:04:42.678957362 +0000 UTC m=+0.126838615 container remove eaedbc688cf57c6d20f6699e169c24aa209259dd8157ce1da52cd5b02d2fa213 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 05:04:42 localhost ovn_controller[152322]: 2025-11-28T10:04:42Z|00275|binding|INFO|Releasing lport 7ba9b167-5b98-46f1-8868-b05a091f3734 from this chassis (sb_readonly=0)
Nov 28 05:04:42 localhost ovn_controller[152322]: 2025-11-28T10:04:42Z|00276|binding|INFO|Setting lport 7ba9b167-5b98-46f1-8868-b05a091f3734 down in Southbound
Nov 28 05:04:42 localhost nova_compute[279673]: 2025-11-28 10:04:42.692 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:04:42 localhost kernel: device tap7ba9b167-5b left promiscuous mode
Nov 28 05:04:42 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:42.703 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fefb:5081/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7ba9b167-5b98-46f1-8868-b05a091f3734) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 05:04:42 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:42.706 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 7ba9b167-5b98-46f1-8868-b05a091f3734 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis
Nov 28 05:04:42 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:42.712 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 05:04:42 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:42.713 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[040f5bc3-b86f-4461-90df-105db279ea18]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 05:04:42 localhost nova_compute[279673]: 2025-11-28 10:04:42.715 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:04:42 localhost nova_compute[279673]: 2025-11-28 10:04:42.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 05:04:42 localhost systemd[1]: var-lib-containers-storage-overlay-1b5412e2fb13d3391b4a358fa30deba9085121686461d41d600f84f99fbf2d44-merged.mount: Deactivated successfully.
Nov 28 05:04:42 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-eaedbc688cf57c6d20f6699e169c24aa209259dd8157ce1da52cd5b02d2fa213-userdata-shm.mount: Deactivated successfully.
Nov 28 05:04:43 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:43.037 261084 INFO neutron.agent.dhcp.agent [None req-269b331e-c9a9-442b-b4be-4979b0d42bba - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 05:04:43 localhost systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully.
Nov 28 05:04:43 localhost nova_compute[279673]: 2025-11-28 10:04:43.767 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 05:04:43 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:43.857 261084 INFO neutron.agent.linux.ip_lib [None req-36f2721d-0343-4138-a16d-75e529a647c3 - - - - - -] Device tap8f633916-5b cannot be used as it has no MAC address
Nov 28 05:04:43 localhost nova_compute[279673]: 2025-11-28 10:04:43.909 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:04:43 localhost kernel: device tap8f633916-5b entered promiscuous mode
Nov 28 05:04:43 localhost NetworkManager[5967]: [1764324283.9147] manager: (tap8f633916-5b): new Generic device (/org/freedesktop/NetworkManager/Devices/48)
Nov 28 05:04:43 localhost ovn_controller[152322]: 2025-11-28T10:04:43Z|00277|binding|INFO|Claiming lport 8f633916-5bf0-443d-a81d-61c50c415e1a for this chassis.
Nov 28 05:04:43 localhost ovn_controller[152322]: 2025-11-28T10:04:43Z|00278|binding|INFO|8f633916-5bf0-443d-a81d-61c50c415e1a: Claiming unknown
Nov 28 05:04:43 localhost nova_compute[279673]: 2025-11-28 10:04:43.917 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:04:43 localhost ovn_controller[152322]: 2025-11-28T10:04:43Z|00279|binding|INFO|Setting lport 8f633916-5bf0-443d-a81d-61c50c415e1a ovn-installed in OVS
Nov 28 05:04:43 localhost ovn_controller[152322]: 2025-11-28T10:04:43Z|00280|binding|INFO|Setting lport 8f633916-5bf0-443d-a81d-61c50c415e1a up in Southbound
Nov 28 05:04:43 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:43.925 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe63:6927/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=8f633916-5bf0-443d-a81d-61c50c415e1a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 05:04:43 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:43.926 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 8f633916-5bf0-443d-a81d-61c50c415e1a in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis
Nov 28 05:04:43 localhost nova_compute[279673]: 2025-11-28 10:04:43.927 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:04:43 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:43.930 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 1148230f-ceb2-4a9b-9c0f-9849b9ab7a86 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Nov 28 05:04:43 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:43.930 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 05:04:43 localhost nova_compute[279673]: 2025-11-28 10:04:43.930 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:04:43 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:43.931 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[7b3942be-a134-4456-976a-1b0d99c4ea31]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 05:04:43 localhost nova_compute[279673]: 2025-11-28 10:04:43.956 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:04:43 localhost nova_compute[279673]: 2025-11-28 10:04:43.996 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:04:44 localhost nova_compute[279673]: 2025-11-28 10:04:44.023 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:04:44 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e148 do_prune osdmap full prune enabled
Nov 28 05:04:44 localhost neutron_sriov_agent[254147]: 2025-11-28 10:04:44.214 2 INFO neutron.agent.securitygroups_rpc [None req-364c5261-76b8-4bbe-a1a4-6cba1a17e718 4d2a0ee370da4e688f1d8f5c639f278d ac0f848cde7c47c998a8b80087a3b59d - - default default] Security group member updated ['6784caad-903b-4e5a-b2eb-6bb1d9f8710a']
Nov 28 05:04:44 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e149 e149: 6 total, 6 up, 6 in
Nov 28 05:04:44 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e149: 6 total, 6 up, 6 in
Nov 28 05:04:44 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:44.484 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 05:04:44 localhost neutron_sriov_agent[254147]: 2025-11-28 10:04:44.655 2 INFO neutron.agent.securitygroups_rpc [None req-32328951-8e5a-4539-abdc-e00522344c39 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']
Nov 28 05:04:44 localhost nova_compute[279673]: 2025-11-28 10:04:44.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 05:04:44 localhost nova_compute[279673]: 2025-11-28 10:04:44.799 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 05:04:44 localhost nova_compute[279673]: 2025-11-28 10:04:44.800 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 05:04:44 localhost nova_compute[279673]: 2025-11-28 10:04:44.801 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 05:04:44 localhost nova_compute[279673]: 2025-11-28 10:04:44.801 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 05:04:44 localhost nova_compute[279673]: 2025-11-28 10:04:44.802 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 05:04:44 localhost podman[318970]:
Nov 28 05:04:44 localhost podman[318970]: 2025-11-28 10:04:44.837183873 +0000 UTC m=+0.082142074 container create a6c9b6c83f08f3a073604ffe50805fca3acd72114a9f989864857c358b01a796 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 05:04:44 localhost podman[318970]: 2025-11-28 10:04:44.787909292 +0000 UTC m=+0.032867523 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 05:04:44 localhost systemd[1]: Started libpod-conmon-a6c9b6c83f08f3a073604ffe50805fca3acd72114a9f989864857c358b01a796.scope.
Nov 28 05:04:44 localhost systemd[1]: Started libcrun container.
Nov 28 05:04:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7f8555d1fc38767bef1d7de04762368666c8c66ba5ae9285326628ef97b61012/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 05:04:44 localhost podman[318970]: 2025-11-28 10:04:44.938655461 +0000 UTC m=+0.183613652 container init a6c9b6c83f08f3a073604ffe50805fca3acd72114a9f989864857c358b01a796 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 05:04:44 localhost podman[318970]: 2025-11-28 10:04:44.949059059 +0000 UTC m=+0.194017240 container start a6c9b6c83f08f3a073604ffe50805fca3acd72114a9f989864857c358b01a796 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 05:04:44 localhost dnsmasq[319005]: started, version 2.85 cachesize 150
Nov 28 05:04:44 localhost dnsmasq[319005]: DNS service limited to local subnets
Nov 28 05:04:44 localhost dnsmasq[319005]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 05:04:44 localhost dnsmasq[319005]: warning: no upstream servers configured
Nov 28 05:04:44 localhost dnsmasq-dhcp[319005]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Nov 28 05:04:44 localhost dnsmasq[319005]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses
Nov 28 05:04:44 localhost dnsmasq-dhcp[319005]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 05:04:44 localhost dnsmasq-dhcp[319005]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 05:04:45 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:45.010 261084 INFO neutron.agent.dhcp.agent [None req-36f2721d-0343-4138-a16d-75e529a647c3 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:43Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=4f2aa82b-68f8-4b54-9ba9-fd01c33f0c08, ip_allocation=immediate, mac_address=fa:16:3e:23:50:5c, name=tempest-NetworksTestDHCPv6-1248319796, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=40, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['f23fa24d-eb1f-4c78-8443-1b19f0eaff59'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:04:42Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], standard_attr_id=1756, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:04:44Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1
Nov 28 05:04:45 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 05:04:45 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:45.146 261084 INFO neutron.agent.dhcp.agent [None req-81fe2799-3cb2-4c61-8245-0786de8e7908 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f'} is completed
Nov 28 05:04:45 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:45.197 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 05:04:45 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:45.198 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 05:04:45 localhost nova_compute[279673]: 2025-11-28 10:04:45.207 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:04:45 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e149 do_prune osdmap full prune enabled
Nov 28 05:04:45 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e150 e150: 6 total, 6 up, 6 in
Nov 28 05:04:45 localhost dnsmasq[319005]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 1 addresses
Nov 28 05:04:45 localhost dnsmasq-dhcp[319005]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host
Nov 28 05:04:45 localhost dnsmasq-dhcp[319005]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts
Nov 28 05:04:45 localhost podman[319024]: 2025-11-28 10:04:45.247690226 +0000 UTC m=+0.107834712 container kill a6c9b6c83f08f3a073604ffe50805fca3acd72114a9f989864857c358b01a796 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 05:04:45 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e150: 6 total, 6 up, 6 in
Nov 28 05:04:45 localhost nova_compute[279673]: 2025-11-28 10:04:45.329 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.526s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 05:04:45 localhost nova_compute[279673]: 2025-11-28 10:04:45.403 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 05:04:45 localhost nova_compute[279673]: 2025-11-28 10:04:45.403 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 05:04:45 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:45.501 261084 INFO neutron.agent.dhcp.agent [None req-966550fc-4b5b-4ee5-84fb-5ee8b77fbe95 - - - - - -] DHCP configuration for ports {'4f2aa82b-68f8-4b54-9ba9-fd01c33f0c08'} is completed
Nov 28 05:04:45 localhost nova_compute[279673]: 2025-11-28 10:04:45.592 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:04:45 localhost nova_compute[279673]: 2025-11-28 10:04:45.626 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node.
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 05:04:45 localhost nova_compute[279673]: 2025-11-28 10:04:45.627 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11199MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": 
"7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 05:04:45 localhost nova_compute[279673]: 2025-11-28 10:04:45.628 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:04:45 localhost nova_compute[279673]: 2025-11-28 10:04:45.628 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:04:45 localhost nova_compute[279673]: 2025-11-28 10:04:45.732 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 05:04:45 localhost nova_compute[279673]: 2025-11-28 10:04:45.732 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 05:04:45 localhost nova_compute[279673]: 2025-11-28 10:04:45.733 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 05:04:45 localhost neutron_sriov_agent[254147]: 2025-11-28 10:04:45.746 2 INFO neutron.agent.securitygroups_rpc [None req-bac2ff8c-e0f4-436a-a2ef-9ffa5731fa74 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:04:45 localhost nova_compute[279673]: 2025-11-28 10:04:45.781 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:04:45 localhost dnsmasq[319005]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:04:45 localhost dnsmasq-dhcp[319005]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:04:45 localhost dnsmasq-dhcp[319005]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:04:45 localhost podman[319067]: 2025-11-28 
10:04:45.984755216 +0000 UTC m=+0.072402116 container kill a6c9b6c83f08f3a073604ffe50805fca3acd72114a9f989864857c358b01a796 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:04:46 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:04:46 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1907177804' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:04:46 localhost nova_compute[279673]: 2025-11-28 10:04:46.234 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:04:46 localhost nova_compute[279673]: 2025-11-28 10:04:46.242 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 05:04:46 localhost nova_compute[279673]: 2025-11-28 10:04:46.262 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 
'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 05:04:46 localhost nova_compute[279673]: 2025-11-28 10:04:46.265 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 05:04:46 localhost nova_compute[279673]: 2025-11-28 10:04:46.265 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.637s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:04:47 localhost dnsmasq[319005]: exiting on receipt of SIGTERM Nov 28 05:04:47 localhost podman[319127]: 2025-11-28 10:04:47.227166286 +0000 UTC m=+0.070227694 container kill a6c9b6c83f08f3a073604ffe50805fca3acd72114a9f989864857c358b01a796 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:04:47 localhost systemd[1]: tmp-crun.j2wArk.mount: Deactivated successfully. 
Nov 28 05:04:47 localhost systemd[1]: libpod-a6c9b6c83f08f3a073604ffe50805fca3acd72114a9f989864857c358b01a796.scope: Deactivated successfully. Nov 28 05:04:47 localhost nova_compute[279673]: 2025-11-28 10:04:47.266 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:04:47 localhost nova_compute[279673]: 2025-11-28 10:04:47.267 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 28 05:04:47 localhost nova_compute[279673]: 2025-11-28 10:04:47.267 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0. 
Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.300127) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46 Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324287300172, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 2391, "num_deletes": 268, "total_data_size": 2586776, "memory_usage": 2634144, "flush_reason": "Manual Compaction"} Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started Nov 28 05:04:47 localhost podman[319139]: 2025-11-28 10:04:47.309126775 +0000 UTC m=+0.065300803 container died a6c9b6c83f08f3a073604ffe50805fca3acd72114a9f989864857c358b01a796 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324287315906, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 2510994, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25621, "largest_seqno": 28011, "table_properties": {"data_size": 2500880, "index_size": 6491, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, 
"index_value_is_delta_encoded": 1, "filter_size": 2565, "raw_key_size": 22084, "raw_average_key_size": 21, "raw_value_size": 2480323, "raw_average_value_size": 2455, "num_data_blocks": 275, "num_entries": 1010, "num_filter_entries": 1010, "num_deletions": 268, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324137, "oldest_key_time": 1764324137, "file_creation_time": 1764324287, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}} Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 15858 microseconds, and 7044 cpu microseconds. Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.315978) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 2510994 bytes OK Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.316011) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.318321) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.318351) EVENT_LOG_v1 {"time_micros": 1764324287318343, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.318377) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 2576671, prev total WAL file size 2576671, number of live WAL files 2. Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.319460) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131373937' seq:72057594037927935, type:22 .. 
'7061786F73003132303439' seq:0, type:0; will stop at (end) Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(2452KB)], [45(15MB)] Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324287319500, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 18631561, "oldest_snapshot_seqno": -1} Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 12508 keys, 16635212 bytes, temperature: kUnknown Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324287406437, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 16635212, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16564008, "index_size": 38847, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31301, "raw_key_size": 334625, "raw_average_key_size": 26, "raw_value_size": 16351241, "raw_average_value_size": 1307, "num_data_blocks": 1476, "num_entries": 12508, "num_filter_entries": 12508, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; 
strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764324287, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}} Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 28 05:04:47 localhost podman[319139]: 2025-11-28 10:04:47.408646126 +0000 UTC m=+0.164820164 container cleanup a6c9b6c83f08f3a073604ffe50805fca3acd72114a9f989864857c358b01a796 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.407190) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 16635212 bytes Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.412135) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 214.1 rd, 191.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.4, 15.4 +0.0 blob) out(15.9 +0.0 blob), read-write-amplify(14.0) write-amplify(6.6) OK, records in: 13055, records dropped: 547 output_compression: NoCompression Nov 28 05:04:47 
localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.412182) EVENT_LOG_v1 {"time_micros": 1764324287412164, "job": 26, "event": "compaction_finished", "compaction_time_micros": 87038, "compaction_time_cpu_micros": 40633, "output_level": 6, "num_output_files": 1, "total_output_size": 16635212, "num_input_records": 13055, "num_output_records": 12508, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324287412637, "job": 26, "event": "table_file_deletion", "file_number": 47} Nov 28 05:04:47 localhost systemd[1]: libpod-conmon-a6c9b6c83f08f3a073604ffe50805fca3acd72114a9f989864857c358b01a796.scope: Deactivated successfully. 
Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324287414368, "job": 26, "event": "table_file_deletion", "file_number": 45} Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.319332) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.414466) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.414475) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.414479) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.414482) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:04:47 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:04:47.414484) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:04:47 localhost nova_compute[279673]: 2025-11-28 10:04:47.415 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 05:04:47 localhost nova_compute[279673]: 2025-11-28 10:04:47.415 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock 
"refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 05:04:47 localhost nova_compute[279673]: 2025-11-28 10:04:47.416 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 28 05:04:47 localhost nova_compute[279673]: 2025-11-28 10:04:47.416 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 05:04:47 localhost podman[319141]: 2025-11-28 10:04:47.433135998 +0000 UTC m=+0.184221490 container remove a6c9b6c83f08f3a073604ffe50805fca3acd72114a9f989864857c358b01a796 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true) Nov 28 05:04:47 localhost nova_compute[279673]: 2025-11-28 10:04:47.452 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:47 localhost ovn_controller[152322]: 2025-11-28T10:04:47Z|00281|binding|INFO|Releasing lport 8f633916-5bf0-443d-a81d-61c50c415e1a from this chassis (sb_readonly=0) Nov 28 05:04:47 localhost kernel: device tap8f633916-5b left promiscuous mode Nov 28 05:04:47 localhost 
ovn_controller[152322]: 2025-11-28T10:04:47Z|00282|binding|INFO|Setting lport 8f633916-5bf0-443d-a81d-61c50c415e1a down in Southbound Nov 28 05:04:47 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:47.462 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe63:6927/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=8f633916-5bf0-443d-a81d-61c50c415e1a) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:04:47 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:47.464 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 8f633916-5bf0-443d-a81d-61c50c415e1a in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis#033[00m Nov 28 05:04:47 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:47.467 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were 
found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:04:47 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:47.468 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[1401545f-128d-4253-b4d8-3ec921770db1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:04:47 localhost nova_compute[279673]: 2025-11-28 10:04:47.479 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:47 localhost nova_compute[279673]: 2025-11-28 10:04:47.483 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:47 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:47.755 261084 INFO neutron.agent.dhcp.agent [None req-fcee2cb3-8024-4735-b25a-a157dfa5c984 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:04:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. 
Nov 28 05:04:48 localhost openstack_network_exporter[240658]: ERROR 10:04:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 05:04:48 localhost openstack_network_exporter[240658]: ERROR 10:04:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 05:04:48 localhost openstack_network_exporter[240658]: Nov 28 05:04:48 localhost openstack_network_exporter[240658]: ERROR 10:04:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:04:48 localhost openstack_network_exporter[240658]: ERROR 10:04:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:04:48 localhost openstack_network_exporter[240658]: ERROR 10:04:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 05:04:48 localhost openstack_network_exporter[240658]: Nov 28 05:04:48 localhost podman[319166]: 2025-11-28 10:04:48.143123971 +0000 UTC m=+0.131302983 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly.) Nov 28 05:04:48 localhost podman[319166]: 2025-11-28 10:04:48.156432323 +0000 UTC m=+0.144611335 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, container_name=openstack_network_exporter, release=1755695350, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal) Nov 28 05:04:48 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. Nov 28 05:04:48 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:48.199 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:04:48 localhost systemd[1]: var-lib-containers-storage-overlay-7f8555d1fc38767bef1d7de04762368666c8c66ba5ae9285326628ef97b61012-merged.mount: Deactivated successfully. 
Nov 28 05:04:48 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a6c9b6c83f08f3a073604ffe50805fca3acd72114a9f989864857c358b01a796-userdata-shm.mount: Deactivated successfully. Nov 28 05:04:48 localhost systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully. Nov 28 05:04:48 localhost nova_compute[279673]: 2025-11-28 10:04:48.817 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 05:04:48 localhost nova_compute[279673]: 2025-11-28 10:04:48.832 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock 
"refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 05:04:48 localhost nova_compute[279673]: 2025-11-28 10:04:48.833 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 28 05:04:48 localhost nova_compute[279673]: 2025-11-28 10:04:48.833 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:04:49 localhost ovn_controller[152322]: 2025-11-28T10:04:49Z|00283|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:04:49 localhost nova_compute[279673]: 2025-11-28 10:04:49.202 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:49 localhost nova_compute[279673]: 2025-11-28 10:04:49.573 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:50 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:04:50 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e150 do_prune osdmap full prune enabled Nov 28 05:04:50 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e151 e151: 6 total, 6 up, 6 in Nov 28 05:04:50 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e151: 6 total, 6 up, 6 in Nov 28 05:04:50 localhost neutron_dhcp_agent[261080]: 2025-11-28 
10:04:50.285 261084 INFO neutron.agent.linux.ip_lib [None req-8ae4241f-8dc9-4b9b-8146-7b59c1764f43 - - - - - -] Device tapd9d6658c-69 cannot be used as it has no MAC address#033[00m Nov 28 05:04:50 localhost nova_compute[279673]: 2025-11-28 10:04:50.350 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:50 localhost kernel: device tapd9d6658c-69 entered promiscuous mode Nov 28 05:04:50 localhost ovn_controller[152322]: 2025-11-28T10:04:50Z|00284|binding|INFO|Claiming lport d9d6658c-69f3-434a-a139-9146d8ddb475 for this chassis. Nov 28 05:04:50 localhost ovn_controller[152322]: 2025-11-28T10:04:50Z|00285|binding|INFO|d9d6658c-69f3-434a-a139-9146d8ddb475: Claiming unknown Nov 28 05:04:50 localhost nova_compute[279673]: 2025-11-28 10:04:50.359 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:50 localhost NetworkManager[5967]: [1764324290.3618] manager: (tapd9d6658c-69): new Generic device (/org/freedesktop/NetworkManager/Devices/49) Nov 28 05:04:50 localhost systemd-udevd[319211]: Network interface NamePolicy= disabled on kernel command line. 
Nov 28 05:04:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:50.373 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.103.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-02cd8163-742c-4849-a0f3-35dad7f4a404', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02cd8163-742c-4849-a0f3-35dad7f4a404', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c66e098e4fb4a349dc2bb4293454135', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6fc78fcb-5ea1-409a-899f-c137c1b47b0b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d9d6658c-69f3-434a-a139-9146d8ddb475) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:04:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:50.377 158130 INFO neutron.agent.ovn.metadata.agent [-] Port d9d6658c-69f3-434a-a139-9146d8ddb475 in datapath 02cd8163-742c-4849-a0f3-35dad7f4a404 bound to our chassis#033[00m Nov 28 05:04:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:50.382 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 02cd8163-742c-4849-a0f3-35dad7f4a404 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:04:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:50.384 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[8088318f-a440-4158-9fdd-97194e04564a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:04:50 localhost ovn_controller[152322]: 2025-11-28T10:04:50Z|00286|binding|INFO|Setting lport d9d6658c-69f3-434a-a139-9146d8ddb475 ovn-installed in OVS Nov 28 05:04:50 localhost ovn_controller[152322]: 2025-11-28T10:04:50Z|00287|binding|INFO|Setting lport d9d6658c-69f3-434a-a139-9146d8ddb475 up in Southbound Nov 28 05:04:50 localhost nova_compute[279673]: 2025-11-28 10:04:50.392 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:50 localhost nova_compute[279673]: 2025-11-28 10:04:50.424 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:50 localhost dnsmasq[318149]: read /var/lib/neutron/dhcp/f6bc7039-ebcb-4d5c-bff1-81be4c2607bb/addn_hosts - 0 addresses Nov 28 05:04:50 localhost dnsmasq-dhcp[318149]: read /var/lib/neutron/dhcp/f6bc7039-ebcb-4d5c-bff1-81be4c2607bb/host Nov 28 05:04:50 localhost podman[319208]: 2025-11-28 10:04:50.456243952 +0000 UTC m=+0.083238746 container kill 10a06f70e7dec31415a1ef25bcab7edf7cf9e9c5cfc814d8d21d52652069f730 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f6bc7039-ebcb-4d5c-bff1-81be4c2607bb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 28 
05:04:50 localhost dnsmasq-dhcp[318149]: read /var/lib/neutron/dhcp/f6bc7039-ebcb-4d5c-bff1-81be4c2607bb/opts Nov 28 05:04:50 localhost nova_compute[279673]: 2025-11-28 10:04:50.476 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:50 localhost nova_compute[279673]: 2025-11-28 10:04:50.509 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:50 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:50.519 261084 INFO neutron.agent.linux.ip_lib [None req-5b788fcf-a1ed-4f86-9533-2f641daf2beb - - - - - -] Device tap4929710e-eb cannot be used as it has no MAC address#033[00m Nov 28 05:04:50 localhost nova_compute[279673]: 2025-11-28 10:04:50.581 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:50 localhost kernel: device tap4929710e-eb entered promiscuous mode Nov 28 05:04:50 localhost nova_compute[279673]: 2025-11-28 10:04:50.589 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:50 localhost NetworkManager[5967]: [1764324290.5908] manager: (tap4929710e-eb): new Generic device (/org/freedesktop/NetworkManager/Devices/50) Nov 28 05:04:50 localhost ovn_controller[152322]: 2025-11-28T10:04:50Z|00288|binding|INFO|Claiming lport 4929710e-eb4c-4144-9bca-64efc297e299 for this chassis. 
Nov 28 05:04:50 localhost ovn_controller[152322]: 2025-11-28T10:04:50Z|00289|binding|INFO|4929710e-eb4c-4144-9bca-64efc297e299: Claiming unknown Nov 28 05:04:50 localhost nova_compute[279673]: 2025-11-28 10:04:50.600 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:50.607 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-553c7f35-d914-4af1-9846-a8cbe21f53f3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-553c7f35-d914-4af1-9846-a8cbe21f53f3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'aa5be61eafca4d96976422f0e0103210', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=40693dd3-cde5-4c50-9ed5-4dc8ef3313af, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=4929710e-eb4c-4144-9bca-64efc297e299) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:04:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:50.611 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 4929710e-eb4c-4144-9bca-64efc297e299 in datapath 
553c7f35-d914-4af1-9846-a8cbe21f53f3 bound to our chassis#033[00m Nov 28 05:04:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:50.614 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 553c7f35-d914-4af1-9846-a8cbe21f53f3 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:04:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:50.616 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[41a2dfe4-7c23-43f2-8881-be684852fc4a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:04:50 localhost nova_compute[279673]: 2025-11-28 10:04:50.635 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:50 localhost ovn_controller[152322]: 2025-11-28T10:04:50Z|00290|binding|INFO|Setting lport 4929710e-eb4c-4144-9bca-64efc297e299 ovn-installed in OVS Nov 28 05:04:50 localhost ovn_controller[152322]: 2025-11-28T10:04:50Z|00291|binding|INFO|Setting lport 4929710e-eb4c-4144-9bca-64efc297e299 up in Southbound Nov 28 05:04:50 localhost nova_compute[279673]: 2025-11-28 10:04:50.640 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:50 localhost nova_compute[279673]: 2025-11-28 10:04:50.714 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:50 localhost nova_compute[279673]: 2025-11-28 10:04:50.757 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:50 localhost ovn_controller[152322]: 2025-11-28T10:04:50Z|00292|binding|INFO|Releasing lport 
aad9e073-acbe-49ad-8b8e-e03f91cd53c9 from this chassis (sb_readonly=1) Nov 28 05:04:50 localhost kernel: device tapaad9e073-ac left promiscuous mode Nov 28 05:04:50 localhost ovn_controller[152322]: 2025-11-28T10:04:50Z|00293|if_status|INFO|Not setting lport aad9e073-acbe-49ad-8b8e-e03f91cd53c9 down as sb is readonly Nov 28 05:04:50 localhost ovn_controller[152322]: 2025-11-28T10:04:50Z|00294|binding|INFO|Setting lport aad9e073-acbe-49ad-8b8e-e03f91cd53c9 down in Southbound Nov 28 05:04:50 localhost nova_compute[279673]: 2025-11-28 10:04:50.766 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:50.773 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:2::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-f6bc7039-ebcb-4d5c-bff1-81be4c2607bb', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f6bc7039-ebcb-4d5c-bff1-81be4c2607bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '50a1392ce96c4024bcd36a3df403ca29', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f7becd79-4f22-46be-87af-f81de3e971b9, chassis=[], tunnel_key=2, gateway_chassis=[], 
requested_chassis=[], logical_port=aad9e073-acbe-49ad-8b8e-e03f91cd53c9) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:04:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:50.774 158130 INFO neutron.agent.ovn.metadata.agent [-] Port aad9e073-acbe-49ad-8b8e-e03f91cd53c9 in datapath f6bc7039-ebcb-4d5c-bff1-81be4c2607bb unbound from our chassis#033[00m Nov 28 05:04:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:50.776 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f6bc7039-ebcb-4d5c-bff1-81be4c2607bb or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:04:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:50.776 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[09054bd0-c1ef-444d-93c1-d0a3c26b3a36]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:04:50 localhost nova_compute[279673]: 2025-11-28 10:04:50.784 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:50 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:50.841 261084 INFO neutron.agent.linux.ip_lib [None req-59b9ca3f-4ae0-4c11-9d2d-542918d9c063 - - - - - -] Device tap02d1d927-32 cannot be used as it has no MAC address#033[00m Nov 28 05:04:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:50.843 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:04:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:50.844 158130 DEBUG oslo_concurrency.lockutils [-] 
Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:04:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:50.844 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:04:50 localhost nova_compute[279673]: 2025-11-28 10:04:50.870 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:50 localhost kernel: device tap02d1d927-32 entered promiscuous mode Nov 28 05:04:50 localhost nova_compute[279673]: 2025-11-28 10:04:50.875 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:50 localhost ovn_controller[152322]: 2025-11-28T10:04:50Z|00295|binding|INFO|Claiming lport 02d1d927-321d-4f5a-aba3-0b3dba5bfaf4 for this chassis. 
Nov 28 05:04:50 localhost NetworkManager[5967]: [1764324290.8764] manager: (tap02d1d927-32): new Generic device (/org/freedesktop/NetworkManager/Devices/51) Nov 28 05:04:50 localhost ovn_controller[152322]: 2025-11-28T10:04:50Z|00296|binding|INFO|02d1d927-321d-4f5a-aba3-0b3dba5bfaf4: Claiming unknown Nov 28 05:04:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:50.892 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fed1:953f/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=02d1d927-321d-4f5a-aba3-0b3dba5bfaf4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:04:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:50.893 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 02d1d927-321d-4f5a-aba3-0b3dba5bfaf4 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis#033[00m 
Nov 28 05:04:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:50.896 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 37b04545-d7a0-418e-bffa-b4f816d3d9d5 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 28 05:04:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:50.896 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:04:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:50.897 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[1bee9c27-c373-4f51-a323-782ce99f1144]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:04:50 localhost nova_compute[279673]: 2025-11-28 10:04:50.906 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:50 localhost ovn_controller[152322]: 2025-11-28T10:04:50Z|00297|binding|INFO|Setting lport 02d1d927-321d-4f5a-aba3-0b3dba5bfaf4 ovn-installed in OVS Nov 28 05:04:50 localhost ovn_controller[152322]: 2025-11-28T10:04:50Z|00298|binding|INFO|Setting lport 02d1d927-321d-4f5a-aba3-0b3dba5bfaf4 up in Southbound Nov 28 05:04:50 localhost nova_compute[279673]: 2025-11-28 10:04:50.912 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:50 localhost nova_compute[279673]: 2025-11-28 10:04:50.913 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:50 localhost nova_compute[279673]: 2025-11-28 10:04:50.955 279685 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:50 localhost nova_compute[279673]: 2025-11-28 10:04:50.984 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:51 localhost neutron_sriov_agent[254147]: 2025-11-28 10:04:51.135 2 INFO neutron.agent.securitygroups_rpc [None req-54cf6d1c-df90-4434-9ef1-afe91707ca30 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:04:51 localhost ovn_controller[152322]: 2025-11-28T10:04:51Z|00299|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:04:51 localhost nova_compute[279673]: 2025-11-28 10:04:51.459 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:51 localhost dnsmasq[318149]: exiting on receipt of SIGTERM Nov 28 05:04:51 localhost podman[319359]: 2025-11-28 10:04:51.513841057 +0000 UTC m=+0.050974511 container kill 10a06f70e7dec31415a1ef25bcab7edf7cf9e9c5cfc814d8d21d52652069f730 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f6bc7039-ebcb-4d5c-bff1-81be4c2607bb, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3) Nov 28 05:04:51 localhost systemd[1]: libpod-10a06f70e7dec31415a1ef25bcab7edf7cf9e9c5cfc814d8d21d52652069f730.scope: Deactivated successfully. 
Nov 28 05:04:51 localhost podman[319388]: 2025-11-28 10:04:51.568840182 +0000 UTC m=+0.040478690 container died 10a06f70e7dec31415a1ef25bcab7edf7cf9e9c5cfc814d8d21d52652069f730 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f6bc7039-ebcb-4d5c-bff1-81be4c2607bb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:04:51 localhost systemd[1]: tmp-crun.rIHmMB.mount: Deactivated successfully. Nov 28 05:04:51 localhost podman[319388]: 2025-11-28 10:04:51.615468279 +0000 UTC m=+0.087106767 container cleanup 10a06f70e7dec31415a1ef25bcab7edf7cf9e9c5cfc814d8d21d52652069f730 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f6bc7039-ebcb-4d5c-bff1-81be4c2607bb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS) Nov 28 05:04:51 localhost systemd[1]: libpod-conmon-10a06f70e7dec31415a1ef25bcab7edf7cf9e9c5cfc814d8d21d52652069f730.scope: Deactivated successfully. 
Nov 28 05:04:51 localhost podman[319414]: Nov 28 05:04:51 localhost podman[319414]: 2025-11-28 10:04:51.644388617 +0000 UTC m=+0.084272446 container create 4a2296a67e0de640baea4a3d45180d4e83115603ecdbae322602dd4db3ff95c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 28 05:04:51 localhost systemd[1]: Started libpod-conmon-4a2296a67e0de640baea4a3d45180d4e83115603ecdbae322602dd4db3ff95c0.scope. Nov 28 05:04:51 localhost systemd[1]: Started libcrun container. Nov 28 05:04:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0e9304802cff54ff9919214d297ca735a4bd2139de3fddfd5405c9e013af228f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:04:51 localhost podman[319390]: 2025-11-28 10:04:51.702138422 +0000 UTC m=+0.170077154 container remove 10a06f70e7dec31415a1ef25bcab7edf7cf9e9c5cfc814d8d21d52652069f730 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f6bc7039-ebcb-4d5c-bff1-81be4c2607bb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true) Nov 28 05:04:51 localhost podman[319414]: 2025-11-28 10:04:51.61550121 +0000 UTC m=+0.055385069 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 
05:04:51 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:51.729 261084 INFO neutron.agent.dhcp.agent [None req-23531d2a-0141-488a-84c1-8f0df7621e66 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:04:51 localhost podman[319414]: 2025-11-28 10:04:51.739800521 +0000 UTC m=+0.179684380 container init 4a2296a67e0de640baea4a3d45180d4e83115603ecdbae322602dd4db3ff95c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:04:51 localhost podman[319414]: 2025-11-28 10:04:51.747629015 +0000 UTC m=+0.187512884 container start 4a2296a67e0de640baea4a3d45180d4e83115603ecdbae322602dd4db3ff95c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:04:51 localhost dnsmasq[319457]: started, version 2.85 cachesize 150 Nov 28 05:04:51 localhost dnsmasq[319457]: DNS service limited to local subnets Nov 28 05:04:51 localhost dnsmasq[319457]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:04:51 localhost dnsmasq[319457]: warning: no upstream servers 
configured Nov 28 05:04:51 localhost dnsmasq-dhcp[319457]: DHCP, static leases only on 10.103.0.0, lease time 1d Nov 28 05:04:51 localhost dnsmasq[319457]: read /var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/addn_hosts - 0 addresses Nov 28 05:04:51 localhost dnsmasq-dhcp[319457]: read /var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/host Nov 28 05:04:51 localhost dnsmasq-dhcp[319457]: read /var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/opts Nov 28 05:04:51 localhost dnsmasq[319457]: exiting on receipt of SIGTERM Nov 28 05:04:51 localhost systemd[1]: libpod-4a2296a67e0de640baea4a3d45180d4e83115603ecdbae322602dd4db3ff95c0.scope: Deactivated successfully. Nov 28 05:04:51 localhost podman[319466]: 2025-11-28 10:04:51.901143164 +0000 UTC m=+0.127010130 container died 4a2296a67e0de640baea4a3d45180d4e83115603ecdbae322602dd4db3ff95c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:04:51 localhost neutron_sriov_agent[254147]: 2025-11-28 10:04:51.909 2 INFO neutron.agent.securitygroups_rpc [None req-263d211e-775e-47b3-9274-70437dee437e 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:04:51 localhost podman[319466]: 2025-11-28 10:04:51.935578981 +0000 UTC m=+0.161445896 container cleanup 4a2296a67e0de640baea4a3d45180d4e83115603ecdbae322602dd4db3ff95c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Nov 28 05:04:51 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:51.939 261084 INFO neutron.agent.dhcp.agent [None req-4fbd5d11-3723-4a2f-89f6-06cd393da79f - - - - - -] DHCP configuration for ports {'7cbbd458-cac1-440b-b157-44ec4d7deea5'} is completed#033[00m Nov 28 05:04:51 localhost podman[319507]: 2025-11-28 10:04:51.988871908 +0000 UTC m=+0.083663859 container cleanup 4a2296a67e0de640baea4a3d45180d4e83115603ecdbae322602dd4db3ff95c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Nov 28 05:04:51 localhost systemd[1]: libpod-conmon-4a2296a67e0de640baea4a3d45180d4e83115603ecdbae322602dd4db3ff95c0.scope: Deactivated successfully. 
Nov 28 05:04:51 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:51.997 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:04:52 localhost podman[319497]: Nov 28 05:04:52 localhost podman[319497]: 2025-11-28 10:04:52.027225267 +0000 UTC m=+0.154854998 container create a3d71f781cb699f41bc11302acbeaa2d8e5c53cbc9f376de46a48a8241faf409 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 28 05:04:52 localhost podman[319524]: 2025-11-28 10:04:52.030313226 +0000 UTC m=+0.077017278 container remove 4a2296a67e0de640baea4a3d45180d4e83115603ecdbae322602dd4db3ff95c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 28 05:04:52 localhost systemd[1]: Started libpod-conmon-a3d71f781cb699f41bc11302acbeaa2d8e5c53cbc9f376de46a48a8241faf409.scope. Nov 28 05:04:52 localhost podman[319497]: 2025-11-28 10:04:51.977665867 +0000 UTC m=+0.105295688 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:04:52 localhost systemd[1]: Started libcrun container. 
Nov 28 05:04:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b0456f1e21b6dc3596ed21a7e347b4c98e62d6679920882846a52b9c4ebe74b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:04:52 localhost podman[319497]: 2025-11-28 10:04:52.100784175 +0000 UTC m=+0.228413926 container init a3d71f781cb699f41bc11302acbeaa2d8e5c53cbc9f376de46a48a8241faf409 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:04:52 localhost podman[319497]: 2025-11-28 10:04:52.109984249 +0000 UTC m=+0.237614000 container start a3d71f781cb699f41bc11302acbeaa2d8e5c53cbc9f376de46a48a8241faf409 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Nov 28 05:04:52 localhost dnsmasq[319562]: started, version 2.85 cachesize 150 Nov 28 05:04:52 localhost dnsmasq[319562]: DNS service limited to local subnets Nov 28 05:04:52 localhost dnsmasq[319562]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:04:52 localhost dnsmasq[319562]: warning: no upstream servers 
configured Nov 28 05:04:52 localhost dnsmasq[319562]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:04:52 localhost podman[319543]: Nov 28 05:04:52 localhost podman[319543]: 2025-11-28 10:04:52.137013073 +0000 UTC m=+0.086716166 container create 2be74bc48555f5ca9fda0c65974602d642dee270cf9293fdde68cc494d9336dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-553c7f35-d914-4af1-9846-a8cbe21f53f3, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:04:52 localhost systemd[1]: Started libpod-conmon-2be74bc48555f5ca9fda0c65974602d642dee270cf9293fdde68cc494d9336dd.scope. Nov 28 05:04:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. 
Nov 28 05:04:52 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:52.171 261084 INFO neutron.agent.dhcp.agent [None req-59b9ca3f-4ae0-4c11-9d2d-542918d9c063 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:50Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=f63a2be6-1492-465e-b2eb-d9c3e14e96a2, ip_allocation=immediate, mac_address=fa:16:3e:36:5b:fc, name=tempest-NetworksTestDHCPv6-1872098773, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=42, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['5fc650ff-2fcc-48db-bb2f-2b0928cc0382'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:04:47Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], standard_attr_id=1801, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:04:50Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1#033[00m Nov 28 05:04:52 localhost systemd[1]: Started libcrun container. 
Nov 28 05:04:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4bcc12744151ca7e94791368e387c8ab451571c56040070ed3ec7c54499fb7f3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:04:52 localhost podman[319543]: 2025-11-28 10:04:52.091294393 +0000 UTC m=+0.040997526 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:04:52 localhost podman[319543]: 2025-11-28 10:04:52.193064219 +0000 UTC m=+0.142767312 container init 2be74bc48555f5ca9fda0c65974602d642dee270cf9293fdde68cc494d9336dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-553c7f35-d914-4af1-9846-a8cbe21f53f3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Nov 28 05:04:52 localhost dnsmasq[319577]: started, version 2.85 cachesize 150 Nov 28 05:04:52 localhost dnsmasq[319577]: DNS service limited to local subnets Nov 28 05:04:52 localhost podman[319543]: 2025-11-28 10:04:52.213835765 +0000 UTC m=+0.163538828 container start 2be74bc48555f5ca9fda0c65974602d642dee270cf9293fdde68cc494d9336dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-553c7f35-d914-4af1-9846-a8cbe21f53f3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Nov 28 05:04:52 localhost dnsmasq[319577]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:04:52 localhost dnsmasq[319577]: warning: no upstream servers configured Nov 28 05:04:52 localhost dnsmasq-dhcp[319577]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 28 05:04:52 localhost dnsmasq[319577]: read /var/lib/neutron/dhcp/553c7f35-d914-4af1-9846-a8cbe21f53f3/addn_hosts - 0 addresses Nov 28 05:04:52 localhost dnsmasq-dhcp[319577]: read /var/lib/neutron/dhcp/553c7f35-d914-4af1-9846-a8cbe21f53f3/host Nov 28 05:04:52 localhost dnsmasq-dhcp[319577]: read /var/lib/neutron/dhcp/553c7f35-d914-4af1-9846-a8cbe21f53f3/opts Nov 28 05:04:52 localhost podman[319566]: 2025-11-28 10:04:52.272299709 +0000 UTC m=+0.088007362 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 05:04:52 localhost podman[319566]: 2025-11-28 10:04:52.278777515 +0000 UTC m=+0.094485198 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 28 05:04:52 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. 
Nov 28 05:04:52 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:52.293 261084 INFO neutron.agent.dhcp.agent [None req-b05cf7bc-5b05-478e-bd7a-cd72748cb9da - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f'} is completed#033[00m Nov 28 05:04:52 localhost dnsmasq[319562]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 1 addresses Nov 28 05:04:52 localhost podman[319610]: 2025-11-28 10:04:52.419865018 +0000 UTC m=+0.067811264 container kill a3d71f781cb699f41bc11302acbeaa2d8e5c53cbc9f376de46a48a8241faf409 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Nov 28 05:04:52 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:52.431 261084 INFO neutron.agent.dhcp.agent [None req-fa590fc1-dfab-4180-a82b-49a69d1c0c8c - - - - - -] DHCP configuration for ports {'06954665-2137-4b8e-888c-1d5516ae6541'} is completed#033[00m Nov 28 05:04:52 localhost systemd[1]: tmp-crun.xsm86W.mount: Deactivated successfully. Nov 28 05:04:52 localhost systemd[1]: var-lib-containers-storage-overlay-578830e16dc254a672039252f1b4aca3106a0a33f8e3c5f9da3689144398b5d1-merged.mount: Deactivated successfully. Nov 28 05:04:52 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-10a06f70e7dec31415a1ef25bcab7edf7cf9e9c5cfc814d8d21d52652069f730-userdata-shm.mount: Deactivated successfully. Nov 28 05:04:52 localhost systemd[1]: run-netns-qdhcp\x2df6bc7039\x2debcb\x2d4d5c\x2dbff1\x2d81be4c2607bb.mount: Deactivated successfully. 
Nov 28 05:04:52 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:52.646 261084 INFO neutron.agent.dhcp.agent [None req-c2992044-d336-44ca-b6d0-6ce8c8535f9a - - - - - -] DHCP configuration for ports {'f63a2be6-1492-465e-b2eb-d9c3e14e96a2'} is completed#033[00m Nov 28 05:04:52 localhost ovn_controller[152322]: 2025-11-28T10:04:52Z|00300|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:04:52 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:52.722 261084 ERROR neutron.agent.linux.external_process [-] dnsmasq for dhcp with uuid 02cd8163-742c-4849-a0f3-35dad7f4a404 not found. The process should not have died#033[00m Nov 28 05:04:52 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:52.722 261084 WARNING neutron.agent.linux.external_process [-] Respawning dnsmasq for uuid 02cd8163-742c-4849-a0f3-35dad7f4a404#033[00m Nov 28 05:04:52 localhost nova_compute[279673]: 2025-11-28 10:04:52.762 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:52 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:52.784 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:52Z, description=, device_id=65a5fc61-8378-41fb-8a6b-788254c76348, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ca552b8f-155d-47a1-be25-a7aeb0006de8, ip_allocation=immediate, mac_address=fa:16:3e:c1:e6:2e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:04:46Z, description=, dns_domain=, id=02cd8163-742c-4849-a0f3-35dad7f4a404, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, 
name=tempest-RoutersTest-162114118, port_security_enabled=True, project_id=8c66e098e4fb4a349dc2bb4293454135, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=25348, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1774, status=ACTIVE, subnets=['d24e0b52-5dd2-4d29-98da-71dd46882b44'], tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:04:49Z, vlan_transparent=None, network_id=02cd8163-742c-4849-a0f3-35dad7f4a404, port_security_enabled=False, project_id=8c66e098e4fb4a349dc2bb4293454135, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1810, status=DOWN, tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:04:52Z on network 02cd8163-742c-4849-a0f3-35dad7f4a404#033[00m Nov 28 05:04:52 localhost systemd[1]: tmp-crun.wXtWlT.mount: Deactivated successfully. Nov 28 05:04:52 localhost dnsmasq[319562]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:04:52 localhost podman[319648]: 2025-11-28 10:04:52.831721589 +0000 UTC m=+0.048748597 container kill a3d71f781cb699f41bc11302acbeaa2d8e5c53cbc9f376de46a48a8241faf409 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3) Nov 28 05:04:53 localhost podman[319716]: Nov 28 05:04:53 localhost podman[319716]: 2025-11-28 10:04:53.236518778 +0000 UTC m=+0.137905393 container create 0dd9774c60eeff113346934982dee32cfbdf8a631641d931487a6dcb5cf487e0 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:04:53 localhost podman[319746]: 2025-11-28 10:04:53.224651108 +0000 UTC m=+0.047762200 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:04:53 localhost podman[319716]: 2025-11-28 10:04:53.183661594 +0000 UTC m=+0.085048269 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:04:53 localhost systemd[1]: Started libpod-conmon-0dd9774c60eeff113346934982dee32cfbdf8a631641d931487a6dcb5cf487e0.scope. Nov 28 05:04:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.305 261084 ERROR neutron.agent.linux.utils [None req-1cdd1bc3-5801-4696-a775-dc6c954a1d08 - - - - - -] Exit code: 125; Cmd: ['ip', 'netns', 'exec', 'qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404', 'env', 'PROCESS_TAG=dnsmasq-02cd8163-742c-4849-a0f3-35dad7f4a404', 'dnsmasq', '--no-hosts', '--no-resolv', '--pid-file=/var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/pid', '--dhcp-hostsfile=/var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/host', '--addn-hosts=/var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/addn_hosts', '--dhcp-optsfile=/var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/opts', '--dhcp-leasefile=/var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/leases', '--dhcp-match=set:ipxe,175', '--dhcp-userclass=set:ipxe6,iPXE', '--local-service', '--bind-dynamic', '--dhcp-range=set:subnet-d24e0b52-5dd2-4d29-98da-71dd46882b44,10.103.0.0,static,255.255.255.240,86400s', 
'--dhcp-option-force=option:mtu,1442', '--dhcp-lease-max=16', '--conf-file=/dev/null', '--domain=openstacklocal']; Stdin: ; Stdout: Starting a new child container neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404 Nov 28 05:04:53 localhost neutron_dhcp_agent[261080]: ; Stderr: Error: creating container storage: the container name "neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404" is already in use by 0dd9774c60eeff113346934982dee32cfbdf8a631641d931487a6dcb5cf487e0. You have to remove that container to be able to reuse that name: that name is already in use Nov 28 05:04:53 localhost neutron_dhcp_agent[261080]: #033[00m Nov 28 05:04:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent [None req-1cdd1bc3-5801-4696-a775-dc6c954a1d08 - - - - - -] Unable to reload_allocations dhcp for 02cd8163-742c-4849-a0f3-35dad7f4a404.: neutron_lib.exceptions.ProcessExecutionError: Exit code: 125; Cmd: ['ip', 'netns', 'exec', 'qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404', 'env', 'PROCESS_TAG=dnsmasq-02cd8163-742c-4849-a0f3-35dad7f4a404', 'dnsmasq', '--no-hosts', '--no-resolv', '--pid-file=/var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/pid', '--dhcp-hostsfile=/var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/host', '--addn-hosts=/var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/addn_hosts', '--dhcp-optsfile=/var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/opts', '--dhcp-leasefile=/var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/leases', '--dhcp-match=set:ipxe,175', '--dhcp-userclass=set:ipxe6,iPXE', '--local-service', '--bind-dynamic', '--dhcp-range=set:subnet-d24e0b52-5dd2-4d29-98da-71dd46882b44,10.103.0.0,static,255.255.255.240,86400s', '--dhcp-option-force=option:mtu,1442', '--dhcp-lease-max=16', '--conf-file=/dev/null', '--domain=openstacklocal']; Stdin: ; Stdout: Starting a new child container neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404 Nov 28 
05:04:53 localhost neutron_dhcp_agent[261080]: ; Stderr: Error: creating container storage: the container name "neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404" is already in use by 0dd9774c60eeff113346934982dee32cfbdf8a631641d931487a6dcb5cf487e0. You have to remove that container to be able to reuse that name: that name is already in use Nov 28 05:04:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Nov 28 05:04:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Nov 28 05:04:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Nov 28 05:04:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 671, in reload_allocations Nov 28 05:04:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent self._spawn_or_reload_process(reload_with_HUP=True) Nov 28 05:04:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 603, in _spawn_or_reload_process Nov 28 05:04:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent pm.enable(reload_cfg=reload_with_HUP, ensure_active=True) Nov 28 05:04:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/external_process.py", line 105, in enable Nov 28 05:04:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent 
ip_wrapper.netns.execute(cmd, addl_env=self.cmd_addl_env, Nov 28 05:04:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 775, in execute Nov 28 05:04:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent return utils.execute(cmd, check_exit_code=check_exit_code, Nov 28 05:04:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py", line 156, in execute Nov 28 05:04:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent raise exceptions.ProcessExecutionError(msg, Nov 28 05:04:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent neutron_lib.exceptions.ProcessExecutionError: Exit code: 125; Cmd: ['ip', 'netns', 'exec', 'qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404', 'env', 'PROCESS_TAG=dnsmasq-02cd8163-742c-4849-a0f3-35dad7f4a404', 'dnsmasq', '--no-hosts', '--no-resolv', '--pid-file=/var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/pid', '--dhcp-hostsfile=/var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/host', '--addn-hosts=/var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/addn_hosts', '--dhcp-optsfile=/var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/opts', '--dhcp-leasefile=/var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/leases', '--dhcp-match=set:ipxe,175', '--dhcp-userclass=set:ipxe6,iPXE', '--local-service', '--bind-dynamic', '--dhcp-range=set:subnet-d24e0b52-5dd2-4d29-98da-71dd46882b44,10.103.0.0,static,255.255.255.240,86400s', '--dhcp-option-force=option:mtu,1442', '--dhcp-lease-max=16', '--conf-file=/dev/null', '--domain=openstacklocal']; Stdin: ; Stdout: Starting a new child container 
neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404 Nov 28 05:04:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent ; Stderr: Error: creating container storage: the container name "neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404" is already in use by 0dd9774c60eeff113346934982dee32cfbdf8a631641d931487a6dcb5cf487e0. You have to remove that container to be able to reuse that name: that name is already in use Nov 28 05:04:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent Nov 28 05:04:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.307 261084 ERROR neutron.agent.dhcp.agent #033[00m Nov 28 05:04:53 localhost dnsmasq[319562]: exiting on receipt of SIGTERM Nov 28 05:04:53 localhost podman[319759]: 2025-11-28 10:04:53.317634923 +0000 UTC m=+0.072109208 container kill a3d71f781cb699f41bc11302acbeaa2d8e5c53cbc9f376de46a48a8241faf409 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 28 05:04:53 localhost systemd[1]: Started libcrun container. Nov 28 05:04:53 localhost systemd[1]: libpod-a3d71f781cb699f41bc11302acbeaa2d8e5c53cbc9f376de46a48a8241faf409.scope: Deactivated successfully. 
Nov 28 05:04:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/22ad0fbf2ca34ae47ded5d35885e553d449cf886f83c141d08fd858e674bbcc6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:04:53 localhost podman[319716]: 2025-11-28 10:04:53.332713704 +0000 UTC m=+0.234100319 container init 0dd9774c60eeff113346934982dee32cfbdf8a631641d931487a6dcb5cf487e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 28 05:04:53 localhost podman[319716]: 2025-11-28 10:04:53.34236422 +0000 UTC m=+0.243750825 container start 0dd9774c60eeff113346934982dee32cfbdf8a631641d931487a6dcb5cf487e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0) Nov 28 05:04:53 localhost dnsmasq[319793]: started, version 2.85 cachesize 150 Nov 28 05:04:53 localhost dnsmasq[319793]: DNS service limited to local subnets Nov 28 05:04:53 localhost dnsmasq[319793]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:04:53 localhost dnsmasq[319793]: warning: no upstream servers 
configured Nov 28 05:04:53 localhost dnsmasq-dhcp[319793]: DHCP, static leases only on 10.103.0.0, lease time 1d Nov 28 05:04:53 localhost dnsmasq[319793]: read /var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/addn_hosts - 1 addresses Nov 28 05:04:53 localhost dnsmasq-dhcp[319793]: read /var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/host Nov 28 05:04:53 localhost dnsmasq-dhcp[319793]: read /var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/opts Nov 28 05:04:53 localhost podman[319786]: 2025-11-28 10:04:53.377742645 +0000 UTC m=+0.038966368 container died a3d71f781cb699f41bc11302acbeaa2d8e5c53cbc9f376de46a48a8241faf409 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:04:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.421 261084 INFO neutron.agent.dhcp.agent [None req-4b96f896-8a23-4d2e-9b85-484ff0d7515a - - - - - -] DHCP configuration for ports {'ca552b8f-155d-47a1-be25-a7aeb0006de8'} is completed#033[00m Nov 28 05:04:53 localhost podman[319786]: 2025-11-28 10:04:53.431774532 +0000 UTC m=+0.092998225 container remove a3d71f781cb699f41bc11302acbeaa2d8e5c53cbc9f376de46a48a8241faf409 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:04:53 localhost systemd[1]: libpod-conmon-a3d71f781cb699f41bc11302acbeaa2d8e5c53cbc9f376de46a48a8241faf409.scope: Deactivated successfully. Nov 28 05:04:53 localhost ovn_controller[152322]: 2025-11-28T10:04:53Z|00301|binding|INFO|Releasing lport 02d1d927-321d-4f5a-aba3-0b3dba5bfaf4 from this chassis (sb_readonly=0) Nov 28 05:04:53 localhost nova_compute[279673]: 2025-11-28 10:04:53.446 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:53 localhost ovn_controller[152322]: 2025-11-28T10:04:53Z|00302|binding|INFO|Setting lport 02d1d927-321d-4f5a-aba3-0b3dba5bfaf4 down in Southbound Nov 28 05:04:53 localhost kernel: device tap02d1d927-32 left promiscuous mode Nov 28 05:04:53 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:53.455 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fed1:953f/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, 
additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=02d1d927-321d-4f5a-aba3-0b3dba5bfaf4) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:04:53 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:53.457 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 02d1d927-321d-4f5a-aba3-0b3dba5bfaf4 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis#033[00m Nov 28 05:04:53 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:53.459 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:04:53 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:53.460 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[69571c45-a761-439a-8724-50402ddb0d51]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:04:53 localhost nova_compute[279673]: 2025-11-28 10:04:53.463 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:53 localhost systemd[1]: var-lib-containers-storage-overlay-7b0456f1e21b6dc3596ed21a7e347b4c98e62d6679920882846a52b9c4ebe74b-merged.mount: Deactivated successfully. Nov 28 05:04:53 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a3d71f781cb699f41bc11302acbeaa2d8e5c53cbc9f376de46a48a8241faf409-userdata-shm.mount: Deactivated successfully. 
Nov 28 05:04:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.726 261084 INFO neutron.agent.dhcp.agent [None req-3a4c8826-6eeb-4084-b6e5-81cd4ed1d4d8 - - - - - -] Synchronizing state#033[00m Nov 28 05:04:53 localhost systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully. Nov 28 05:04:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.968 261084 INFO neutron.agent.dhcp.agent [None req-8059cd0d-68ea-4676-82eb-987dfc66d573 - - - - - -] All active networks have been fetched through RPC.#033[00m Nov 28 05:04:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.969 261084 INFO neutron.agent.dhcp.agent [-] Starting network 02cd8163-742c-4849-a0f3-35dad7f4a404 dhcp configuration#033[00m Nov 28 05:04:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.973 261084 INFO neutron.agent.dhcp.agent [-] Starting network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 dhcp configuration#033[00m Nov 28 05:04:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.977 261084 INFO neutron.agent.dhcp.agent [-] Starting network c8ccf767-8631-44f9-8e16-74febe5b399d dhcp configuration#033[00m Nov 28 05:04:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:53.977 261084 INFO neutron.agent.dhcp.agent [-] Finished network c8ccf767-8631-44f9-8e16-74febe5b399d dhcp configuration#033[00m Nov 28 05:04:54 localhost dnsmasq[319793]: exiting on receipt of SIGTERM Nov 28 05:04:54 localhost systemd[1]: tmp-crun.upphEw.mount: Deactivated successfully. 
Nov 28 05:04:54 localhost podman[319823]: 2025-11-28 10:04:54.156256832 +0000 UTC m=+0.066639681 container kill 0dd9774c60eeff113346934982dee32cfbdf8a631641d931487a6dcb5cf487e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 28 05:04:54 localhost systemd[1]: libpod-0dd9774c60eeff113346934982dee32cfbdf8a631641d931487a6dcb5cf487e0.scope: Deactivated successfully. Nov 28 05:04:54 localhost podman[319837]: 2025-11-28 10:04:54.236318726 +0000 UTC m=+0.066526647 container died 0dd9774c60eeff113346934982dee32cfbdf8a631641d931487a6dcb5cf487e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:04:54 localhost podman[319837]: 2025-11-28 10:04:54.320178308 +0000 UTC m=+0.150386209 container cleanup 0dd9774c60eeff113346934982dee32cfbdf8a631641d931487a6dcb5cf487e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125) Nov 28 05:04:54 localhost systemd[1]: libpod-conmon-0dd9774c60eeff113346934982dee32cfbdf8a631641d931487a6dcb5cf487e0.scope: Deactivated successfully. Nov 28 05:04:54 localhost podman[319839]: 2025-11-28 10:04:54.342523828 +0000 UTC m=+0.161260781 container remove 0dd9774c60eeff113346934982dee32cfbdf8a631641d931487a6dcb5cf487e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 28 05:04:54 localhost systemd[1]: var-lib-containers-storage-overlay-22ad0fbf2ca34ae47ded5d35885e553d449cf886f83c141d08fd858e674bbcc6-merged.mount: Deactivated successfully. Nov 28 05:04:54 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0dd9774c60eeff113346934982dee32cfbdf8a631641d931487a6dcb5cf487e0-userdata-shm.mount: Deactivated successfully. 
Nov 28 05:04:54 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:54.713 261084 INFO neutron.agent.linux.ip_lib [None req-6b10c8c6-0c7b-4f9b-89b3-b17147669ef4 - - - - - -] Device tap46c16dfb-cb cannot be used as it has no MAC address#033[00m Nov 28 05:04:54 localhost nova_compute[279673]: 2025-11-28 10:04:54.754 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:54 localhost kernel: device tap46c16dfb-cb entered promiscuous mode Nov 28 05:04:54 localhost ovn_controller[152322]: 2025-11-28T10:04:54Z|00303|binding|INFO|Claiming lport 46c16dfb-cb51-4790-9470-74b6d8c3c674 for this chassis. Nov 28 05:04:54 localhost nova_compute[279673]: 2025-11-28 10:04:54.761 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:54 localhost ovn_controller[152322]: 2025-11-28T10:04:54Z|00304|binding|INFO|46c16dfb-cb51-4790-9470-74b6d8c3c674: Claiming unknown Nov 28 05:04:54 localhost NetworkManager[5967]: [1764324294.7631] manager: (tap46c16dfb-cb): new Generic device (/org/freedesktop/NetworkManager/Devices/52) Nov 28 05:04:54 localhost systemd-udevd[319897]: Network interface NamePolicy= disabled on kernel command line. 
Nov 28 05:04:54 localhost ovn_controller[152322]: 2025-11-28T10:04:54Z|00305|binding|INFO|Setting lport 46c16dfb-cb51-4790-9470-74b6d8c3c674 ovn-installed in OVS Nov 28 05:04:54 localhost ovn_controller[152322]: 2025-11-28T10:04:54Z|00306|binding|INFO|Setting lport 46c16dfb-cb51-4790-9470-74b6d8c3c674 up in Southbound Nov 28 05:04:54 localhost nova_compute[279673]: 2025-11-28 10:04:54.777 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:54 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:54.776 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe19:31/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=46c16dfb-cb51-4790-9470-74b6d8c3c674) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:04:54 
localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:54.784 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 46c16dfb-cb51-4790-9470-74b6d8c3c674 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis#033[00m Nov 28 05:04:54 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:54.789 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 7b0638af-ec04-4eda-8f5f-a54aa07bc574 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 28 05:04:54 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:54.789 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:04:54 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:54.790 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[e00d664e-c3b5-4ab7-9d82-4dcb511ab153]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:04:54 localhost nova_compute[279673]: 2025-11-28 10:04:54.799 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:54 localhost nova_compute[279673]: 2025-11-28 10:04:54.804 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:54 localhost nova_compute[279673]: 2025-11-28 10:04:54.847 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:54 localhost nova_compute[279673]: 2025-11-28 10:04:54.873 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:55 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:04:55 localhost podman[319956]: Nov 28 05:04:55 localhost podman[319956]: 2025-11-28 10:04:55.447662885 +0000 UTC m=+0.088138237 container create 44f2d086efef886c98e16013dc782e118ca35448399d85a6897ed0acd02024d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:04:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 05:04:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 05:04:55 localhost systemd[1]: Started libpod-conmon-44f2d086efef886c98e16013dc782e118ca35448399d85a6897ed0acd02024d9.scope. Nov 28 05:04:55 localhost podman[319956]: 2025-11-28 10:04:55.404755686 +0000 UTC m=+0.045231078 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:04:55 localhost systemd[1]: Started libcrun container. 
Nov 28 05:04:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4c235e185164438599917caaee605008003fd8fff25d4dac6b9b93fcc8e24479/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:04:55 localhost podman[319956]: 2025-11-28 10:04:55.538072476 +0000 UTC m=+0.178547838 container init 44f2d086efef886c98e16013dc782e118ca35448399d85a6897ed0acd02024d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Nov 28 05:04:55 localhost podman[319956]: 2025-11-28 10:04:55.548264758 +0000 UTC m=+0.188740110 container start 44f2d086efef886c98e16013dc782e118ca35448399d85a6897ed0acd02024d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125) Nov 28 05:04:55 localhost dnsmasq[320009]: started, version 2.85 cachesize 150 Nov 28 05:04:55 localhost dnsmasq[320009]: DNS service limited to local subnets Nov 28 05:04:55 localhost dnsmasq[320009]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:04:55 localhost dnsmasq[320009]: warning: no upstream servers 
configured Nov 28 05:04:55 localhost dnsmasq-dhcp[320009]: DHCP, static leases only on 10.103.0.0, lease time 1d Nov 28 05:04:55 localhost dnsmasq[320009]: read /var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/addn_hosts - 1 addresses Nov 28 05:04:55 localhost dnsmasq-dhcp[320009]: read /var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/host Nov 28 05:04:55 localhost dnsmasq-dhcp[320009]: read /var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/opts Nov 28 05:04:55 localhost podman[319970]: 2025-11-28 10:04:55.589556931 +0000 UTC m=+0.100802589 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 28 05:04:55 localhost podman[319970]: 2025-11-28 10:04:55.594097691 +0000 UTC m=+0.105343329 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, 
config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:04:55 localhost nova_compute[279673]: 2025-11-28 10:04:55.605 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:55 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 05:04:55 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:55.615 261084 INFO neutron.agent.dhcp.agent [None req-180a39a0-3291-4909-9e5b-bc144ab088b8 - - - - - -] Finished network 02cd8163-742c-4849-a0f3-35dad7f4a404 dhcp configuration#033[00m Nov 28 05:04:55 localhost nova_compute[279673]: 2025-11-28 10:04:55.669 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:55 localhost podman[319969]: 2025-11-28 10:04:55.686865969 +0000 UTC m=+0.200646650 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 28 05:04:55 localhost podman[319969]: 2025-11-28 10:04:55.771426303 +0000 UTC m=+0.285206984 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller) Nov 28 05:04:55 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. Nov 28 05:04:55 localhost nova_compute[279673]: 2025-11-28 10:04:55.803 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:55 localhost podman[320038]: Nov 28 05:04:55 localhost podman[320038]: 2025-11-28 10:04:55.871423957 +0000 UTC m=+0.145069368 container create 6cbf99081af1f9d3a86d645785b96571ba40d1392acc9dad46ae3cc33a2e09ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:04:55 localhost systemd[1]: Started libpod-conmon-6cbf99081af1f9d3a86d645785b96571ba40d1392acc9dad46ae3cc33a2e09ad.scope. Nov 28 05:04:55 localhost podman[320038]: 2025-11-28 10:04:55.827999084 +0000 UTC m=+0.101644535 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:04:55 localhost systemd[1]: Started libcrun container. 
Nov 28 05:04:55 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:55.930 261084 INFO neutron.agent.dhcp.agent [None req-96514ed1-b1f9-4fe3-955b-4da9a2ab1882 - - - - - -] DHCP configuration for ports {'ca552b8f-155d-47a1-be25-a7aeb0006de8', '7cbbd458-cac1-440b-b157-44ec4d7deea5', 'd9d6658c-69f3-434a-a139-9146d8ddb475'} is completed#033[00m Nov 28 05:04:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27bf9817321185a883a9c43ffe6b9dfc96f140d8976c2f56aa39ee4eb45f7b55/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:04:55 localhost podman[320038]: 2025-11-28 10:04:55.941940628 +0000 UTC m=+0.215586029 container init 6cbf99081af1f9d3a86d645785b96571ba40d1392acc9dad46ae3cc33a2e09ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:04:55 localhost podman[320038]: 2025-11-28 10:04:55.951848842 +0000 UTC m=+0.225494243 container start 6cbf99081af1f9d3a86d645785b96571ba40d1392acc9dad46ae3cc33a2e09ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:04:55 localhost dnsmasq[320057]: started, version 2.85 cachesize 150 Nov 28 05:04:55 
localhost dnsmasq[320057]: DNS service limited to local subnets Nov 28 05:04:55 localhost dnsmasq[320057]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:04:55 localhost dnsmasq[320057]: warning: no upstream servers configured Nov 28 05:04:55 localhost dnsmasq-dhcp[320057]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 28 05:04:55 localhost dnsmasq[320057]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:04:55 localhost dnsmasq-dhcp[320057]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:04:55 localhost dnsmasq-dhcp[320057]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.011 261084 INFO neutron.agent.dhcp.agent [None req-6b10c8c6-0c7b-4f9b-89b3-b17147669ef4 - - - - - -] Finished network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 dhcp configuration#033[00m Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.012 261084 INFO neutron.agent.dhcp.agent [None req-8059cd0d-68ea-4676-82eb-987dfc66d573 - - - - - -] Synchronizing state complete#033[00m Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.019 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:52Z, description=, device_id=65a5fc61-8378-41fb-8a6b-788254c76348, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ca552b8f-155d-47a1-be25-a7aeb0006de8, ip_allocation=immediate, mac_address=fa:16:3e:c1:e6:2e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], 
created_at=2025-11-28T10:04:46Z, description=, dns_domain=, id=02cd8163-742c-4849-a0f3-35dad7f4a404, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-162114118, port_security_enabled=True, project_id=8c66e098e4fb4a349dc2bb4293454135, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=25348, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1774, status=ACTIVE, subnets=['d24e0b52-5dd2-4d29-98da-71dd46882b44'], tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:04:49Z, vlan_transparent=None, network_id=02cd8163-742c-4849-a0f3-35dad7f4a404, port_security_enabled=False, project_id=8c66e098e4fb4a349dc2bb4293454135, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1810, status=DOWN, tags=[], tenant_id=8c66e098e4fb4a349dc2bb4293454135, updated_at=2025-11-28T10:04:52Z on network 02cd8163-742c-4849-a0f3-35dad7f4a404#033[00m Nov 28 05:04:56 localhost ovn_controller[152322]: 2025-11-28T10:04:56Z|00307|binding|INFO|Releasing lport 46c16dfb-cb51-4790-9470-74b6d8c3c674 from this chassis (sb_readonly=0) Nov 28 05:04:56 localhost ovn_controller[152322]: 2025-11-28T10:04:56Z|00308|binding|INFO|Setting lport 46c16dfb-cb51-4790-9470-74b6d8c3c674 down in Southbound Nov 28 05:04:56 localhost kernel: device tap46c16dfb-cb left promiscuous mode Nov 28 05:04:56 localhost nova_compute[279673]: 2025-11-28 10:04:56.055 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:56 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:56.062 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], 
port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe19:31/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=46c16dfb-cb51-4790-9470-74b6d8c3c674) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:04:56 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:56.065 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 46c16dfb-cb51-4790-9470-74b6d8c3c674 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis#033[00m Nov 28 05:04:56 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:56.069 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:04:56 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:56.071 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[741e55f9-e2b5-48fb-84e2-dcfd2a9a573f]: (4, False) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:04:56 localhost nova_compute[279673]: 2025-11-28 10:04:56.073 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:56 localhost dnsmasq[317792]: read /var/lib/neutron/dhcp/1a246530-be70-4846-9202-8f9cd6d862ae/addn_hosts - 0 addresses Nov 28 05:04:56 localhost dnsmasq-dhcp[317792]: read /var/lib/neutron/dhcp/1a246530-be70-4846-9202-8f9cd6d862ae/host Nov 28 05:04:56 localhost dnsmasq-dhcp[317792]: read /var/lib/neutron/dhcp/1a246530-be70-4846-9202-8f9cd6d862ae/opts Nov 28 05:04:56 localhost podman[320089]: 2025-11-28 10:04:56.213395376 +0000 UTC m=+0.068962976 container kill 57ff87a1297b4078e9c2e8fdd70de8a40b0d134df264eb7de0e31194efa6adb5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1a246530-be70-4846-9202-8f9cd6d862ae, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 28 05:04:56 localhost dnsmasq[320009]: read /var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/addn_hosts - 1 addresses Nov 28 05:04:56 localhost dnsmasq-dhcp[320009]: read /var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/host Nov 28 05:04:56 localhost podman[320105]: 2025-11-28 10:04:56.275797295 +0000 UTC m=+0.074899347 container kill 44f2d086efef886c98e16013dc782e118ca35448399d85a6897ed0acd02024d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:04:56 localhost dnsmasq-dhcp[320009]: read /var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/opts Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.355 261084 INFO neutron.agent.dhcp.agent [None req-b160f751-2d82-4274-8c9c-feef1c9f442c - - - - - -] DHCP configuration for ports {'ca552b8f-155d-47a1-be25-a7aeb0006de8', 'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '7cbbd458-cac1-440b-b157-44ec4d7deea5', 'd9d6658c-69f3-434a-a139-9146d8ddb475'} is completed#033[00m Nov 28 05:04:56 localhost dnsmasq[320057]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:04:56 localhost dnsmasq-dhcp[320057]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:04:56 localhost podman[320144]: 2025-11-28 10:04:56.433394991 +0000 UTC m=+0.074169477 container kill 6cbf99081af1f9d3a86d645785b96571ba40d1392acc9dad46ae3cc33a2e09ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3) Nov 28 05:04:56 localhost dnsmasq-dhcp[320057]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent [None req-a9d75015-13fd-417c-ada1-435f3853c34d - - - - - -] Unable to reload_allocations dhcp for 
8642adde-54ae-4fc2-b997-bf1962c6c7f1.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap46c16dfb-cb not found in namespace qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1. Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent 
File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw) Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent File 
"/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent return fut.result() Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent return self.__get_result() Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent raise self._exception Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__ Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs, Nov 28 05:04:56 localhost 
neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent raise exc_type(*result[2]) Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap46c16dfb-cb not found in namespace qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1. Nov 28 05:04:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:56.469 261084 ERROR neutron.agent.dhcp.agent #033[00m Nov 28 05:04:56 localhost systemd[1]: tmp-crun.PPVIQv.mount: Deactivated successfully. Nov 28 05:04:56 localhost ovn_controller[152322]: 2025-11-28T10:04:56Z|00309|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:04:57 localhost nova_compute[279673]: 2025-11-28 10:04:57.023 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:57 localhost nova_compute[279673]: 2025-11-28 10:04:57.025 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:57 localhost neutron_sriov_agent[254147]: 2025-11-28 10:04:57.057 2 INFO neutron.agent.securitygroups_rpc [None req-1b122d45-69b6-44ae-9f44-6255649c2a99 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:04:57 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:57.082 261084 INFO neutron.agent.dhcp.agent [None req-b4f5f919-ee1a-4a3b-8b11-6d4d21b4686b - - - - - -] DHCP configuration for ports {'ca552b8f-155d-47a1-be25-a7aeb0006de8', 
'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '46c16dfb-cb51-4790-9470-74b6d8c3c674'} is completed#033[00m Nov 28 05:04:57 localhost ovn_controller[152322]: 2025-11-28T10:04:57Z|00310|binding|INFO|Removing iface tap404598dd-47 ovn-installed in OVS Nov 28 05:04:57 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:57.098 158130 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 8dc10fc1-17cf-4461-b700-c1adb49acf0d with type ""#033[00m Nov 28 05:04:57 localhost ovn_controller[152322]: 2025-11-28T10:04:57Z|00311|binding|INFO|Removing lport 404598dd-4706-4aa3-a857-56207d0fd483 ovn-installed in OVS Nov 28 05:04:57 localhost nova_compute[279673]: 2025-11-28 10:04:57.100 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:57 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:57.102 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-1a246530-be70-4846-9202-8f9cd6d862ae', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a246530-be70-4846-9202-8f9cd6d862ae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '50a1392ce96c4024bcd36a3df403ca29', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], 
additional_encap=[], encap=[], mirror_rules=[], datapath=0d9cc17b-2c39-4130-a2f2-9d12894eaf52, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=404598dd-4706-4aa3-a857-56207d0fd483) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:04:57 localhost nova_compute[279673]: 2025-11-28 10:04:57.104 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:57 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:57.107 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 404598dd-4706-4aa3-a857-56207d0fd483 in datapath 1a246530-be70-4846-9202-8f9cd6d862ae unbound from our chassis#033[00m Nov 28 05:04:57 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:57.109 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1a246530-be70-4846-9202-8f9cd6d862ae or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:04:57 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:57.110 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[0b615385-1609-435a-8639-1bcc1e8791dd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:04:57 localhost dnsmasq[317792]: exiting on receipt of SIGTERM Nov 28 05:04:57 localhost podman[320180]: 2025-11-28 10:04:57.219961279 +0000 UTC m=+0.048373308 container kill 57ff87a1297b4078e9c2e8fdd70de8a40b0d134df264eb7de0e31194efa6adb5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1a246530-be70-4846-9202-8f9cd6d862ae, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 28 05:04:57 localhost systemd[1]: libpod-57ff87a1297b4078e9c2e8fdd70de8a40b0d134df264eb7de0e31194efa6adb5.scope: Deactivated successfully. Nov 28 05:04:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 05:04:57 localhost podman[320194]: 2025-11-28 10:04:57.298102267 +0000 UTC m=+0.061272206 container died 57ff87a1297b4078e9c2e8fdd70de8a40b0d134df264eb7de0e31194efa6adb5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1a246530-be70-4846-9202-8f9cd6d862ae, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:04:57 localhost podman[320194]: 2025-11-28 10:04:57.335133449 +0000 UTC m=+0.098303338 container cleanup 57ff87a1297b4078e9c2e8fdd70de8a40b0d134df264eb7de0e31194efa6adb5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1a246530-be70-4846-9202-8f9cd6d862ae, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3) Nov 28 05:04:57 localhost systemd[1]: libpod-conmon-57ff87a1297b4078e9c2e8fdd70de8a40b0d134df264eb7de0e31194efa6adb5.scope: Deactivated successfully. 
Nov 28 05:04:57 localhost podman[320195]: 2025-11-28 10:04:57.37739737 +0000 UTC m=+0.137979884 container remove 57ff87a1297b4078e9c2e8fdd70de8a40b0d134df264eb7de0e31194efa6adb5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1a246530-be70-4846-9202-8f9cd6d862ae, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:04:57 localhost nova_compute[279673]: 2025-11-28 10:04:57.390 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:57 localhost kernel: device tap404598dd-47 left promiscuous mode Nov 28 05:04:57 localhost nova_compute[279673]: 2025-11-28 10:04:57.406 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:57 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:57.422 261084 INFO neutron.agent.dhcp.agent [None req-8059cd0d-68ea-4676-82eb-987dfc66d573 - - - - - -] Synchronizing state#033[00m Nov 28 05:04:57 localhost podman[320206]: 2025-11-28 10:04:57.444491263 +0000 UTC m=+0.191993223 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 
'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Nov 28 05:04:57 localhost podman[320206]: 2025-11-28 10:04:57.459483062 +0000 UTC m=+0.206985022 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 
'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Nov 28 05:04:57 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. Nov 28 05:04:57 localhost systemd[1]: var-lib-containers-storage-overlay-f57866dc5c6162f1b2474906fd9f4e8b712b44716f63d14a9891a2098561e5e3-merged.mount: Deactivated successfully. Nov 28 05:04:57 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-57ff87a1297b4078e9c2e8fdd70de8a40b0d134df264eb7de0e31194efa6adb5-userdata-shm.mount: Deactivated successfully. Nov 28 05:04:57 localhost systemd[1]: run-netns-qdhcp\x2d1a246530\x2dbe70\x2d4846\x2d9202\x2d8f9cd6d862ae.mount: Deactivated successfully. 
Nov 28 05:04:57 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:57.642 261084 INFO neutron.agent.dhcp.agent [None req-ade9d9ba-7777-4bec-957e-af24ecb3c5ae - - - - - -] All active networks have been fetched through RPC.#033[00m Nov 28 05:04:57 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:57.643 261084 INFO neutron.agent.dhcp.agent [-] Starting network 1a246530-be70-4846-9202-8f9cd6d862ae dhcp configuration#033[00m Nov 28 05:04:57 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:57.644 261084 INFO neutron.agent.dhcp.agent [-] Finished network 1a246530-be70-4846-9202-8f9cd6d862ae dhcp configuration#033[00m Nov 28 05:04:57 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:57.644 261084 INFO neutron.agent.dhcp.agent [-] Starting network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 dhcp configuration#033[00m Nov 28 05:04:57 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:57.648 261084 INFO neutron.agent.dhcp.agent [-] Starting network c8ccf767-8631-44f9-8e16-74febe5b399d dhcp configuration#033[00m Nov 28 05:04:57 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:57.649 261084 INFO neutron.agent.dhcp.agent [-] Finished network c8ccf767-8631-44f9-8e16-74febe5b399d dhcp configuration#033[00m Nov 28 05:04:57 localhost dnsmasq[320057]: exiting on receipt of SIGTERM Nov 28 05:04:57 localhost podman[320257]: 2025-11-28 10:04:57.826437587 +0000 UTC m=+0.064435848 container kill 6cbf99081af1f9d3a86d645785b96571ba40d1392acc9dad46ae3cc33a2e09ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 28 05:04:57 localhost 
systemd[1]: libpod-6cbf99081af1f9d3a86d645785b96571ba40d1392acc9dad46ae3cc33a2e09ad.scope: Deactivated successfully. Nov 28 05:04:57 localhost podman[320270]: 2025-11-28 10:04:57.902901747 +0000 UTC m=+0.057655392 container died 6cbf99081af1f9d3a86d645785b96571ba40d1392acc9dad46ae3cc33a2e09ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:04:57 localhost podman[320270]: 2025-11-28 10:04:57.937946332 +0000 UTC m=+0.092699947 container cleanup 6cbf99081af1f9d3a86d645785b96571ba40d1392acc9dad46ae3cc33a2e09ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:04:57 localhost systemd[1]: libpod-conmon-6cbf99081af1f9d3a86d645785b96571ba40d1392acc9dad46ae3cc33a2e09ad.scope: Deactivated successfully. 
Nov 28 05:04:57 localhost podman[320271]: 2025-11-28 10:04:57.979861953 +0000 UTC m=+0.128087371 container remove 6cbf99081af1f9d3a86d645785b96571ba40d1392acc9dad46ae3cc33a2e09ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 28 05:04:58 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:58.028 261084 INFO neutron.agent.linux.ip_lib [-] Device tap46c16dfb-cb cannot be used as it has no MAC address#033[00m Nov 28 05:04:58 localhost nova_compute[279673]: 2025-11-28 10:04:58.086 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:58 localhost kernel: device tap46c16dfb-cb entered promiscuous mode Nov 28 05:04:58 localhost nova_compute[279673]: 2025-11-28 10:04:58.097 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:58 localhost NetworkManager[5967]: [1764324298.0979] manager: (tap46c16dfb-cb): new Generic device (/org/freedesktop/NetworkManager/Devices/53) Nov 28 05:04:58 localhost ovn_controller[152322]: 2025-11-28T10:04:58Z|00312|binding|INFO|Claiming lport 46c16dfb-cb51-4790-9470-74b6d8c3c674 for this chassis. Nov 28 05:04:58 localhost ovn_controller[152322]: 2025-11-28T10:04:58Z|00313|binding|INFO|46c16dfb-cb51-4790-9470-74b6d8c3c674: Claiming unknown Nov 28 05:04:58 localhost systemd-udevd[320303]: Network interface NamePolicy= disabled on kernel command line. 
Nov 28 05:04:58 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:58.111 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe19:31/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=46c16dfb-cb51-4790-9470-74b6d8c3c674) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:04:58 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:58.113 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 46c16dfb-cb51-4790-9470-74b6d8c3c674 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis#033[00m Nov 28 05:04:58 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:58.116 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 7b0638af-ec04-4eda-8f5f-a54aa07bc574 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 28 05:04:58 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:58.117 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:04:58 localhost ovn_metadata_agent[158125]: 2025-11-28 10:04:58.117 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[d44c8b46-168f-4009-8f50-ff43eb3b543c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:04:58 localhost journal[227875]: ethtool ioctl error on tap46c16dfb-cb: No such device Nov 28 05:04:58 localhost nova_compute[279673]: 2025-11-28 10:04:58.135 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:58 localhost journal[227875]: ethtool ioctl error on tap46c16dfb-cb: No such device Nov 28 05:04:58 localhost ovn_controller[152322]: 2025-11-28T10:04:58Z|00314|binding|INFO|Setting lport 46c16dfb-cb51-4790-9470-74b6d8c3c674 ovn-installed in OVS Nov 28 05:04:58 localhost ovn_controller[152322]: 2025-11-28T10:04:58Z|00315|binding|INFO|Setting lport 46c16dfb-cb51-4790-9470-74b6d8c3c674 up in Southbound Nov 28 05:04:58 localhost nova_compute[279673]: 2025-11-28 10:04:58.140 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:58 localhost journal[227875]: ethtool ioctl error on tap46c16dfb-cb: No such device Nov 28 05:04:58 localhost journal[227875]: ethtool ioctl error on tap46c16dfb-cb: No such device Nov 28 05:04:58 localhost journal[227875]: ethtool ioctl error on tap46c16dfb-cb: No such device Nov 28 05:04:58 localhost journal[227875]: ethtool ioctl error on tap46c16dfb-cb: No 
such device Nov 28 05:04:58 localhost journal[227875]: ethtool ioctl error on tap46c16dfb-cb: No such device Nov 28 05:04:58 localhost journal[227875]: ethtool ioctl error on tap46c16dfb-cb: No such device Nov 28 05:04:58 localhost nova_compute[279673]: 2025-11-28 10:04:58.182 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:58 localhost nova_compute[279673]: 2025-11-28 10:04:58.217 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:58 localhost ovn_controller[152322]: 2025-11-28T10:04:58Z|00316|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:04:58 localhost systemd[1]: tmp-crun.yVnVTR.mount: Deactivated successfully. Nov 28 05:04:58 localhost systemd[1]: var-lib-containers-storage-overlay-27bf9817321185a883a9c43ffe6b9dfc96f140d8976c2f56aa39ee4eb45f7b55-merged.mount: Deactivated successfully. Nov 28 05:04:58 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6cbf99081af1f9d3a86d645785b96571ba40d1392acc9dad46ae3cc33a2e09ad-userdata-shm.mount: Deactivated successfully. 
Nov 28 05:04:58 localhost nova_compute[279673]: 2025-11-28 10:04:58.517 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:04:58 localhost snmpd[66832]: empty variable list in _query Nov 28 05:04:58 localhost neutron_sriov_agent[254147]: 2025-11-28 10:04:58.754 2 INFO neutron.agent.securitygroups_rpc [None req-d1d37070-b21e-47b1-9333-9a0acdf29e79 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:04:59 localhost podman[320373]: Nov 28 05:04:59 localhost podman[320373]: 2025-11-28 10:04:59.083804545 +0000 UTC m=+0.095131757 container create 4ec086a463832b18710147d541cd316e0389a79a07b3a2ecfba282c9d00d6bfc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:04:59 localhost systemd[1]: Started libpod-conmon-4ec086a463832b18710147d541cd316e0389a79a07b3a2ecfba282c9d00d6bfc.scope. Nov 28 05:04:59 localhost podman[320373]: 2025-11-28 10:04:59.03722524 +0000 UTC m=+0.048552472 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:04:59 localhost systemd[1]: tmp-crun.q1Abty.mount: Deactivated successfully. Nov 28 05:04:59 localhost systemd[1]: Started libcrun container. 
Nov 28 05:04:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f1a859774efe97f31726e746b14ef1f2253fd05145ded52792ece98c18205a30/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:04:59 localhost podman[320373]: 2025-11-28 10:04:59.178625022 +0000 UTC m=+0.189952244 container init 4ec086a463832b18710147d541cd316e0389a79a07b3a2ecfba282c9d00d6bfc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:04:59 localhost podman[320373]: 2025-11-28 10:04:59.187330442 +0000 UTC m=+0.198657654 container start 4ec086a463832b18710147d541cd316e0389a79a07b3a2ecfba282c9d00d6bfc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:04:59 localhost dnsmasq[320391]: started, version 2.85 cachesize 150 Nov 28 05:04:59 localhost dnsmasq[320391]: DNS service limited to local subnets Nov 28 05:04:59 localhost dnsmasq[320391]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:04:59 localhost dnsmasq[320391]: warning: no upstream servers 
configured Nov 28 05:04:59 localhost dnsmasq-dhcp[320391]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 28 05:04:59 localhost dnsmasq[320391]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 1 addresses Nov 28 05:04:59 localhost dnsmasq-dhcp[320391]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:04:59 localhost dnsmasq-dhcp[320391]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:04:59 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:59.257 261084 INFO neutron.agent.dhcp.agent [-] Finished network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 dhcp configuration#033[00m Nov 28 05:04:59 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:59.257 261084 INFO neutron.agent.dhcp.agent [None req-ade9d9ba-7777-4bec-957e-af24ecb3c5ae - - - - - -] Synchronizing state complete#033[00m Nov 28 05:04:59 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:59.259 261084 INFO neutron.agent.dhcp.agent [None req-d0f1f13b-6b00-42f2-b13c-f49ab1a924af - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:04:59 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:59.260 261084 INFO neutron.agent.dhcp.agent [None req-d0f1f13b-6b00-42f2-b13c-f49ab1a924af - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:04:59 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:59.260 261084 INFO neutron.agent.dhcp.agent [None req-d0f1f13b-6b00-42f2-b13c-f49ab1a924af - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:04:59 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:59.261 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:56Z, description=, device_id=, 
device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=584ff574-1899-4b39-a9e3-57a01ba01ba1, ip_allocation=immediate, mac_address=fa:16:3e:0b:06:78, name=tempest-NetworksTestDHCPv6-1476856072, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=44, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['848b098f-2f4e-4153-b931-bae96c03b751'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:04:53Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], standard_attr_id=1828, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:04:56Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1#033[00m Nov 28 05:04:59 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:59.274 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:04:57Z, description=, device_id=0f485132-8e02-47a3-b2f5-564423c3ef9b, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], 
id=8d532c0c-347a-4459-8005-d390d68f5b23, ip_allocation=immediate, mac_address=fa:16:3e:b7:1c:89, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:04:47Z, description=, dns_domain=, id=553c7f35-d914-4af1-9846-a8cbe21f53f3, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-1486257592-network, port_security_enabled=True, project_id=aa5be61eafca4d96976422f0e0103210, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=15189, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1778, status=ACTIVE, subnets=['c1ca5641-5630-4c33-a102-f6b9f86bd61c'], tags=[], tenant_id=aa5be61eafca4d96976422f0e0103210, updated_at=2025-11-28T10:04:49Z, vlan_transparent=None, network_id=553c7f35-d914-4af1-9846-a8cbe21f53f3, port_security_enabled=False, project_id=aa5be61eafca4d96976422f0e0103210, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1834, status=DOWN, tags=[], tenant_id=aa5be61eafca4d96976422f0e0103210, updated_at=2025-11-28T10:04:57Z on network 553c7f35-d914-4af1-9846-a8cbe21f53f3#033[00m Nov 28 05:04:59 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:59.354 261084 INFO neutron.agent.dhcp.agent [None req-9edbe1b1-8a4d-493b-877d-766168ff95e6 - - - - - -] DHCP configuration for ports {'584ff574-1899-4b39-a9e3-57a01ba01ba1', 'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '46c16dfb-cb51-4790-9470-74b6d8c3c674'} is completed#033[00m Nov 28 05:04:59 localhost dnsmasq[320391]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 1 addresses Nov 28 05:04:59 localhost dnsmasq-dhcp[320391]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:04:59 localhost dnsmasq-dhcp[320391]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:04:59 
localhost podman[320425]: 2025-11-28 10:04:59.493679659 +0000 UTC m=+0.064108248 container kill 4ec086a463832b18710147d541cd316e0389a79a07b3a2ecfba282c9d00d6bfc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Nov 28 05:04:59 localhost podman[320433]: 2025-11-28 10:04:59.516459542 +0000 UTC m=+0.060495354 container kill 2be74bc48555f5ca9fda0c65974602d642dee270cf9293fdde68cc494d9336dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-553c7f35-d914-4af1-9846-a8cbe21f53f3, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 28 05:04:59 localhost dnsmasq[319577]: read /var/lib/neutron/dhcp/553c7f35-d914-4af1-9846-a8cbe21f53f3/addn_hosts - 1 addresses Nov 28 05:04:59 localhost dnsmasq-dhcp[319577]: read /var/lib/neutron/dhcp/553c7f35-d914-4af1-9846-a8cbe21f53f3/host Nov 28 05:04:59 localhost dnsmasq-dhcp[319577]: read /var/lib/neutron/dhcp/553c7f35-d914-4af1-9846-a8cbe21f53f3/opts Nov 28 05:04:59 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:59.825 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, 
created_at=2025-11-28T10:04:57Z, description=, device_id=0f485132-8e02-47a3-b2f5-564423c3ef9b, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=8d532c0c-347a-4459-8005-d390d68f5b23, ip_allocation=immediate, mac_address=fa:16:3e:b7:1c:89, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:04:47Z, description=, dns_domain=, id=553c7f35-d914-4af1-9846-a8cbe21f53f3, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-1486257592-network, port_security_enabled=True, project_id=aa5be61eafca4d96976422f0e0103210, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=15189, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1778, status=ACTIVE, subnets=['c1ca5641-5630-4c33-a102-f6b9f86bd61c'], tags=[], tenant_id=aa5be61eafca4d96976422f0e0103210, updated_at=2025-11-28T10:04:49Z, vlan_transparent=None, network_id=553c7f35-d914-4af1-9846-a8cbe21f53f3, port_security_enabled=False, project_id=aa5be61eafca4d96976422f0e0103210, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1834, status=DOWN, tags=[], tenant_id=aa5be61eafca4d96976422f0e0103210, updated_at=2025-11-28T10:04:57Z on network 553c7f35-d914-4af1-9846-a8cbe21f53f3#033[00m Nov 28 05:04:59 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:04:59.836 261084 INFO neutron.agent.dhcp.agent [None req-25fa700c-a8a5-4f94-8fd2-934793928fac - - - - - -] DHCP configuration for ports {'584ff574-1899-4b39-a9e3-57a01ba01ba1', '8d532c0c-347a-4459-8005-d390d68f5b23'} is completed#033[00m Nov 28 05:05:00 localhost dnsmasq[320391]: exiting on receipt of SIGTERM Nov 28 05:05:00 localhost systemd[1]: libpod-4ec086a463832b18710147d541cd316e0389a79a07b3a2ecfba282c9d00d6bfc.scope: Deactivated successfully. 
Nov 28 05:05:00 localhost podman[320493]: 2025-11-28 10:05:00.008135301 +0000 UTC m=+0.075693560 container kill 4ec086a463832b18710147d541cd316e0389a79a07b3a2ecfba282c9d00d6bfc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Nov 28 05:05:00 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:05:00 localhost podman[320512]: 2025-11-28 10:05:00.073255947 +0000 UTC m=+0.073304692 container kill 2be74bc48555f5ca9fda0c65974602d642dee270cf9293fdde68cc494d9336dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-553c7f35-d914-4af1-9846-a8cbe21f53f3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true) Nov 28 05:05:00 localhost dnsmasq[319577]: read /var/lib/neutron/dhcp/553c7f35-d914-4af1-9846-a8cbe21f53f3/addn_hosts - 1 addresses Nov 28 05:05:00 localhost dnsmasq-dhcp[319577]: read /var/lib/neutron/dhcp/553c7f35-d914-4af1-9846-a8cbe21f53f3/host Nov 28 05:05:00 localhost dnsmasq-dhcp[319577]: read /var/lib/neutron/dhcp/553c7f35-d914-4af1-9846-a8cbe21f53f3/opts Nov 28 05:05:00 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:00.091 2 INFO neutron.agent.securitygroups_rpc [None 
req-f8f0dbe9-4862-4719-841c-e92cc8d478e0 6aa2d31d87d44bd68f3306827a96cb84 7163b69fa8fc4d998fb494edfa303457 - - default default] Security group member updated ['d5ac7cb5-5e8f-446f-ac61-9c9e90d707c1']#033[00m Nov 28 05:05:00 localhost podman[320519]: 2025-11-28 10:05:00.0970976 +0000 UTC m=+0.072246801 container died 4ec086a463832b18710147d541cd316e0389a79a07b3a2ecfba282c9d00d6bfc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 28 05:05:00 localhost podman[320519]: 2025-11-28 10:05:00.182483036 +0000 UTC m=+0.157632207 container cleanup 4ec086a463832b18710147d541cd316e0389a79a07b3a2ecfba282c9d00d6bfc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 28 05:05:00 localhost systemd[1]: libpod-conmon-4ec086a463832b18710147d541cd316e0389a79a07b3a2ecfba282c9d00d6bfc.scope: Deactivated successfully. 
Nov 28 05:05:00 localhost podman[320524]: 2025-11-28 10:05:00.212121736 +0000 UTC m=+0.177596410 container remove 4ec086a463832b18710147d541cd316e0389a79a07b3a2ecfba282c9d00d6bfc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3) Nov 28 05:05:00 localhost ovn_controller[152322]: 2025-11-28T10:05:00Z|00317|binding|INFO|Releasing lport 46c16dfb-cb51-4790-9470-74b6d8c3c674 from this chassis (sb_readonly=0) Nov 28 05:05:00 localhost nova_compute[279673]: 2025-11-28 10:05:00.269 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:00 localhost kernel: device tap46c16dfb-cb left promiscuous mode Nov 28 05:05:00 localhost ovn_controller[152322]: 2025-11-28T10:05:00Z|00318|binding|INFO|Setting lport 46c16dfb-cb51-4790-9470-74b6d8c3c674 down in Southbound Nov 28 05:05:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:00.279 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe19:31/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 
'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=46c16dfb-cb51-4790-9470-74b6d8c3c674) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:05:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:00.281 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 46c16dfb-cb51-4790-9470-74b6d8c3c674 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis#033[00m Nov 28 05:05:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:00.284 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:05:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:00.285 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[c1b66fbf-2eda-4992-9239-1571b910e761]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:05:00 localhost nova_compute[279673]: 2025-11-28 10:05:00.296 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:00 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:00.452 261084 INFO neutron.agent.dhcp.agent [None req-cf608d3b-1a14-4e77-9c42-108b9264096c - - - - - -] 
DHCP configuration for ports {'8d532c0c-347a-4459-8005-d390d68f5b23'} is completed#033[00m Nov 28 05:05:00 localhost systemd[1]: var-lib-containers-storage-overlay-f1a859774efe97f31726e746b14ef1f2253fd05145ded52792ece98c18205a30-merged.mount: Deactivated successfully. Nov 28 05:05:00 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4ec086a463832b18710147d541cd316e0389a79a07b3a2ecfba282c9d00d6bfc-userdata-shm.mount: Deactivated successfully. Nov 28 05:05:00 localhost systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully. Nov 28 05:05:00 localhost nova_compute[279673]: 2025-11-28 10:05:00.607 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:00 localhost nova_compute[279673]: 2025-11-28 10:05:00.672 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:01 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:01.175 261084 INFO neutron.agent.linux.ip_lib [None req-b7029a3b-7f98-4652-bc32-843c78e0ba8d - - - - - -] Device tapca3c90c8-d0 cannot be used as it has no MAC address#033[00m Nov 28 05:05:01 localhost nova_compute[279673]: 2025-11-28 10:05:01.201 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:01 localhost kernel: device tapca3c90c8-d0 entered promiscuous mode Nov 28 05:05:01 localhost ovn_controller[152322]: 2025-11-28T10:05:01Z|00319|binding|INFO|Claiming lport ca3c90c8-d07c-44d0-a7ec-af787d805dd0 for this chassis. 
Nov 28 05:05:01 localhost nova_compute[279673]: 2025-11-28 10:05:01.208 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:01 localhost ovn_controller[152322]: 2025-11-28T10:05:01Z|00320|binding|INFO|ca3c90c8-d07c-44d0-a7ec-af787d805dd0: Claiming unknown Nov 28 05:05:01 localhost NetworkManager[5967]: [1764324301.2123] manager: (tapca3c90c8-d0): new Generic device (/org/freedesktop/NetworkManager/Devices/54) Nov 28 05:05:01 localhost systemd-udevd[320573]: Network interface NamePolicy= disabled on kernel command line. Nov 28 05:05:01 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:01.216 2 INFO neutron.agent.securitygroups_rpc [None req-8e254d5a-cb3d-4d3c-8c34-ca57b16025b1 6aa2d31d87d44bd68f3306827a96cb84 7163b69fa8fc4d998fb494edfa303457 - - default default] Security group member updated ['d5ac7cb5-5e8f-446f-ac61-9c9e90d707c1']#033[00m Nov 28 05:05:01 localhost ovn_controller[152322]: 2025-11-28T10:05:01Z|00321|binding|INFO|Setting lport ca3c90c8-d07c-44d0-a7ec-af787d805dd0 ovn-installed in OVS Nov 28 05:05:01 localhost ovn_controller[152322]: 2025-11-28T10:05:01Z|00322|binding|INFO|Setting lport ca3c90c8-d07c-44d0-a7ec-af787d805dd0 up in Southbound Nov 28 05:05:01 localhost nova_compute[279673]: 2025-11-28 10:05:01.220 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:01.217 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], 
external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ca3c90c8-d07c-44d0-a7ec-af787d805dd0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:05:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:01.219 158130 INFO neutron.agent.ovn.metadata.agent [-] Port ca3c90c8-d07c-44d0-a7ec-af787d805dd0 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis#033[00m Nov 28 05:05:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:01.223 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port fff83133-53ad-4b5f-9ec4-5a9c4ae5262d IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 28 05:05:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:01.223 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:05:01 localhost nova_compute[279673]: 2025-11-28 10:05:01.225 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] 
on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:01.224 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[9f9b720b-f842-4f76-a283-897b95302510]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:05:01 localhost nova_compute[279673]: 2025-11-28 10:05:01.257 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:01 localhost nova_compute[279673]: 2025-11-28 10:05:01.340 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:01 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:01.964 2 INFO neutron.agent.securitygroups_rpc [None req-70ec53ac-b8b3-4633-a977-c5aaaab920ff 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:05:02 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:02.175 261084 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.dhcp_release_cmd', '--privsep_sock_path', '/tmp/tmpdn9o7yfg/privsep.sock']#033[00m Nov 28 05:05:02 localhost podman[320628]: Nov 28 05:05:02 localhost podman[320628]: 2025-11-28 10:05:02.240157098 +0000 UTC m=+0.101765128 container create 8346a586b4f9e87658264afbef3780ccde907177cea02e23a2c2da4cc168dad3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Nov 28 05:05:02 localhost systemd[1]: Started libpod-conmon-8346a586b4f9e87658264afbef3780ccde907177cea02e23a2c2da4cc168dad3.scope. Nov 28 05:05:02 localhost podman[320628]: 2025-11-28 10:05:02.190735651 +0000 UTC m=+0.052343721 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:05:02 localhost systemd[1]: Started libcrun container. Nov 28 05:05:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/263de5eac5bcb424e9045572c27154229aa4a2d3b22ed18705a18fbb1cd475fa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:05:02 localhost podman[320628]: 2025-11-28 10:05:02.347297278 +0000 UTC m=+0.208905318 container init 8346a586b4f9e87658264afbef3780ccde907177cea02e23a2c2da4cc168dad3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Nov 28 05:05:02 localhost podman[320628]: 2025-11-28 10:05:02.360970529 +0000 UTC m=+0.222578559 container start 8346a586b4f9e87658264afbef3780ccde907177cea02e23a2c2da4cc168dad3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, 
org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:05:02 localhost dnsmasq[320649]: started, version 2.85 cachesize 150 Nov 28 05:05:02 localhost dnsmasq[320649]: DNS service limited to local subnets Nov 28 05:05:02 localhost dnsmasq[320649]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:05:02 localhost dnsmasq[320649]: warning: no upstream servers configured Nov 28 05:05:02 localhost dnsmasq-dhcp[320649]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 28 05:05:02 localhost dnsmasq[320649]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:05:02 localhost dnsmasq-dhcp[320649]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:05:02 localhost dnsmasq-dhcp[320649]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:05:02 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:02.429 261084 INFO neutron.agent.dhcp.agent [None req-b7029a3b-7f98-4652-bc32-843c78e0ba8d - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:05:01Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=25b54e74-3d8f-4eda-9085-11cfc4264ad0, ip_allocation=immediate, mac_address=fa:16:3e:91:7b:bc, name=tempest-NetworksTestDHCPv6-1477056337, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, 
ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=46, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['c2382461-c6a6-404b-bdb2-c92b918cec3f'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:05:00Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], standard_attr_id=1841, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:05:01Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1#033[00m Nov 28 05:05:02 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:02.504 261084 INFO neutron.agent.dhcp.agent [None req-07be24c6-1979-4309-8cfa-faced2519064 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f'} is completed#033[00m Nov 28 05:05:02 localhost podman[320669]: 2025-11-28 10:05:02.629976097 +0000 UTC m=+0.067261548 container kill 8346a586b4f9e87658264afbef3780ccde907177cea02e23a2c2da4cc168dad3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true) Nov 28 05:05:02 localhost dnsmasq[320649]: read 
/var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 1 addresses Nov 28 05:05:02 localhost dnsmasq-dhcp[320649]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:05:02 localhost dnsmasq-dhcp[320649]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:05:02 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:02.746 2 INFO neutron.agent.securitygroups_rpc [None req-472369ad-637e-463f-8142-3dff7a706106 6aa2d31d87d44bd68f3306827a96cb84 7163b69fa8fc4d998fb494edfa303457 - - default default] Security group member updated ['d5ac7cb5-5e8f-446f-ac61-9c9e90d707c1']#033[00m Nov 28 05:05:02 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:02.851 261084 INFO neutron.agent.dhcp.agent [None req-488aa661-66d8-441d-8869-df90f7f4f979 - - - - - -] DHCP configuration for ports {'25b54e74-3d8f-4eda-9085-11cfc4264ad0'} is completed#033[00m Nov 28 05:05:02 localhost kernel: device tapca3c90c8-d0 left promiscuous mode Nov 28 05:05:02 localhost nova_compute[279673]: 2025-11-28 10:05:02.876 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:02 localhost ovn_controller[152322]: 2025-11-28T10:05:02Z|00323|binding|INFO|Releasing lport ca3c90c8-d07c-44d0-a7ec-af787d805dd0 from this chassis (sb_readonly=0) Nov 28 05:05:02 localhost ovn_controller[152322]: 2025-11-28T10:05:02Z|00324|binding|INFO|Setting lport ca3c90c8-d07c-44d0-a7ec-af787d805dd0 down in Southbound Nov 28 05:05:02 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:02.888 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], 
requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ca3c90c8-d07c-44d0-a7ec-af787d805dd0) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:05:02 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:02.890 158130 INFO neutron.agent.ovn.metadata.agent [-] Port ca3c90c8-d07c-44d0-a7ec-af787d805dd0 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis#033[00m Nov 28 05:05:02 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:02.893 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:05:02 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:02.895 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[5350c6fa-94c5-4051-a214-df074ff967b9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:05:02 localhost nova_compute[279673]: 2025-11-28 10:05:02.901 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:02 localhost nova_compute[279673]: 2025-11-28 10:05:02.902 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:02 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:02.919 261084 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m Nov 28 05:05:02 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:02.795 320689 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Nov 28 05:05:02 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:02.800 320689 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Nov 28 05:05:02 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:02.804 320689 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m Nov 28 05:05:02 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:02.804 320689 INFO oslo.privsep.daemon [-] privsep daemon running as pid 320689#033[00m Nov 28 05:05:02 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:02.967 2 INFO neutron.agent.securitygroups_rpc [None req-a8642c38-ef8d-4c54-aca2-47d1d691b2fe 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:05:03 localhost dnsmasq-dhcp[320009]: DHCPRELEASE(tapd9d6658c-69) 10.103.0.1 fa:16:3e:c1:e6:2e Nov 28 05:05:03 localhost dnsmasq[320649]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:05:03 localhost dnsmasq-dhcp[320649]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:05:03 localhost dnsmasq-dhcp[320649]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:05:03 localhost 
podman[320713]: 2025-11-28 10:05:03.215537986 +0000 UTC m=+0.069123862 container kill 8346a586b4f9e87658264afbef3780ccde907177cea02e23a2c2da4cc168dad3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent [-] Unable to reload_allocations dhcp for 8642adde-54ae-4fc2-b997-bf1962c6c7f1.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapca3c90c8-d0 not found in namespace qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1. Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 
ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent File 
"/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw) Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent return fut.result() Nov 28 05:05:03 localhost systemd[1]: tmp-crun.L5x7lN.mount: Deactivated successfully. 
Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent return self.__get_result() Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent raise self._exception Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__ Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs, Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent raise exc_type(*result[2]) Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapca3c90c8-d0 not found in namespace 
qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1. Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.242 261084 ERROR neutron.agent.dhcp.agent #033[00m Nov 28 05:05:03 localhost dnsmasq[320009]: read /var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/addn_hosts - 0 addresses Nov 28 05:05:03 localhost dnsmasq-dhcp[320009]: read /var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/host Nov 28 05:05:03 localhost dnsmasq-dhcp[320009]: read /var/lib/neutron/dhcp/02cd8163-742c-4849-a0f3-35dad7f4a404/opts Nov 28 05:05:03 localhost systemd[1]: tmp-crun.8WSddx.mount: Deactivated successfully. Nov 28 05:05:03 localhost podman[320745]: 2025-11-28 10:05:03.699152943 +0000 UTC m=+0.071901991 container kill 44f2d086efef886c98e16013dc782e118ca35448399d85a6897ed0acd02024d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:05:03 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:03.766 2 INFO neutron.agent.securitygroups_rpc [None req-551a0a6f-26c0-4f59-8a71-37fd214c141c 6aa2d31d87d44bd68f3306827a96cb84 7163b69fa8fc4d998fb494edfa303457 - - default default] Security group member updated ['d5ac7cb5-5e8f-446f-ac61-9c9e90d707c1']#033[00m Nov 28 05:05:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:03.852 261084 INFO neutron.agent.dhcp.agent [None req-ade9d9ba-7777-4bec-957e-af24ecb3c5ae - - - - - -] Synchronizing state#033[00m Nov 28 05:05:04 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:04.216 261084 INFO neutron.agent.dhcp.agent [None req-6b0cd380-e608-4081-8967-4a3f74b64491 - - - - - -] All 
active networks have been fetched through RPC.#033[00m Nov 28 05:05:04 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:04.218 261084 INFO neutron.agent.dhcp.agent [-] Starting network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 dhcp configuration#033[00m Nov 28 05:05:04 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:04.219 261084 INFO neutron.agent.dhcp.agent [-] Finished network 8642adde-54ae-4fc2-b997-bf1962c6c7f1 dhcp configuration#033[00m Nov 28 05:05:04 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:04.219 261084 INFO neutron.agent.dhcp.agent [-] Starting network c8ccf767-8631-44f9-8e16-74febe5b399d dhcp configuration#033[00m Nov 28 05:05:04 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:04.220 261084 INFO neutron.agent.dhcp.agent [-] Finished network c8ccf767-8631-44f9-8e16-74febe5b399d dhcp configuration#033[00m Nov 28 05:05:04 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:04.221 261084 INFO neutron.agent.dhcp.agent [None req-6b0cd380-e608-4081-8967-4a3f74b64491 - - - - - -] Synchronizing state complete#033[00m Nov 28 05:05:04 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:04.321 261084 INFO neutron.agent.dhcp.agent [None req-cd78bbba-0ba2-4bd9-aa83-ffdb71baf0d8 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f'} is completed#033[00m Nov 28 05:05:04 localhost dnsmasq[320649]: exiting on receipt of SIGTERM Nov 28 05:05:04 localhost systemd[1]: libpod-8346a586b4f9e87658264afbef3780ccde907177cea02e23a2c2da4cc168dad3.scope: Deactivated successfully. 
Nov 28 05:05:04 localhost podman[320794]: 2025-11-28 10:05:04.516986797 +0000 UTC m=+0.072735985 container kill 8346a586b4f9e87658264afbef3780ccde907177cea02e23a2c2da4cc168dad3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 28 05:05:04 localhost dnsmasq[320009]: exiting on receipt of SIGTERM Nov 28 05:05:04 localhost podman[320808]: 2025-11-28 10:05:04.563959483 +0000 UTC m=+0.067178716 container kill 44f2d086efef886c98e16013dc782e118ca35448399d85a6897ed0acd02024d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:05:04 localhost systemd[1]: libpod-44f2d086efef886c98e16013dc782e118ca35448399d85a6897ed0acd02024d9.scope: Deactivated successfully. 
Nov 28 05:05:04 localhost ovn_controller[152322]: 2025-11-28T10:05:04Z|00325|binding|INFO|Removing iface tapd9d6658c-69 ovn-installed in OVS Nov 28 05:05:04 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:04.593 158130 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 01ac1c86-3a21-4ed2-98b0-bdb2d1c9caf2 with type ""#033[00m Nov 28 05:05:04 localhost podman[320822]: 2025-11-28 10:05:04.594363904 +0000 UTC m=+0.050295842 container died 8346a586b4f9e87658264afbef3780ccde907177cea02e23a2c2da4cc168dad3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 28 05:05:04 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:04.595 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.103.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-02cd8163-742c-4849-a0f3-35dad7f4a404', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02cd8163-742c-4849-a0f3-35dad7f4a404', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c66e098e4fb4a349dc2bb4293454135', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6fc78fcb-5ea1-409a-899f-c137c1b47b0b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d9d6658c-69f3-434a-a139-9146d8ddb475) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:05:04 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:04.597 158130 INFO neutron.agent.ovn.metadata.agent [-] Port d9d6658c-69f3-434a-a139-9146d8ddb475 in datapath 02cd8163-742c-4849-a0f3-35dad7f4a404 unbound from our chassis#033[00m Nov 28 05:05:04 localhost ovn_controller[152322]: 2025-11-28T10:05:04Z|00326|binding|INFO|Removing lport d9d6658c-69f3-434a-a139-9146d8ddb475 ovn-installed in OVS Nov 28 05:05:04 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:04.601 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 02cd8163-742c-4849-a0f3-35dad7f4a404, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:05:04 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:04.627 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[deab7a6e-3026-4761-8c1b-6511280ecdec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:05:04 localhost nova_compute[279673]: 2025-11-28 10:05:04.629 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:04 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8346a586b4f9e87658264afbef3780ccde907177cea02e23a2c2da4cc168dad3-userdata-shm.mount: Deactivated successfully. 
Nov 28 05:05:04 localhost podman[320822]: 2025-11-28 10:05:04.671033621 +0000 UTC m=+0.126965559 container remove 8346a586b4f9e87658264afbef3780ccde907177cea02e23a2c2da4cc168dad3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 28 05:05:04 localhost podman[320846]: 2025-11-28 10:05:04.680795311 +0000 UTC m=+0.099061989 container died 44f2d086efef886c98e16013dc782e118ca35448399d85a6897ed0acd02024d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true) Nov 28 05:05:04 localhost systemd[1]: libpod-conmon-8346a586b4f9e87658264afbef3780ccde907177cea02e23a2c2da4cc168dad3.scope: Deactivated successfully. 
Nov 28 05:05:04 localhost podman[320846]: 2025-11-28 10:05:04.723470044 +0000 UTC m=+0.141736682 container remove 44f2d086efef886c98e16013dc782e118ca35448399d85a6897ed0acd02024d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02cd8163-742c-4849-a0f3-35dad7f4a404, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 28 05:05:04 localhost systemd[1]: libpod-conmon-44f2d086efef886c98e16013dc782e118ca35448399d85a6897ed0acd02024d9.scope: Deactivated successfully. Nov 28 05:05:04 localhost nova_compute[279673]: 2025-11-28 10:05:04.736 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:04 localhost kernel: device tapd9d6658c-69 left promiscuous mode Nov 28 05:05:04 localhost nova_compute[279673]: 2025-11-28 10:05:04.759 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:04 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:04.783 261084 INFO neutron.agent.dhcp.agent [None req-b449a3b4-435b-49ad-a640-df2d3f2d856c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:05:04 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:04.784 261084 INFO neutron.agent.dhcp.agent [None req-b449a3b4-435b-49ad-a640-df2d3f2d856c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:05:04 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:04.900 261084 INFO neutron.agent.dhcp.agent [None req-6db7547d-570c-4aa1-acb2-96d3bc174ee7 - - - - - -] Network not present, 
action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:05:05 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:05:05 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:05.223 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:05:05 localhost systemd[1]: var-lib-containers-storage-overlay-263de5eac5bcb424e9045572c27154229aa4a2d3b22ed18705a18fbb1cd475fa-merged.mount: Deactivated successfully. Nov 28 05:05:05 localhost systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully. Nov 28 05:05:05 localhost systemd[1]: var-lib-containers-storage-overlay-4c235e185164438599917caaee605008003fd8fff25d4dac6b9b93fcc8e24479-merged.mount: Deactivated successfully. Nov 28 05:05:05 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-44f2d086efef886c98e16013dc782e118ca35448399d85a6897ed0acd02024d9-userdata-shm.mount: Deactivated successfully. Nov 28 05:05:05 localhost systemd[1]: run-netns-qdhcp\x2d02cd8163\x2d742c\x2d4849\x2da0f3\x2d35dad7f4a404.mount: Deactivated successfully. 
Nov 28 05:05:05 localhost nova_compute[279673]: 2025-11-28 10:05:05.609 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:05 localhost ovn_controller[152322]: 2025-11-28T10:05:05Z|00327|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:05:05 localhost nova_compute[279673]: 2025-11-28 10:05:05.708 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:05 localhost nova_compute[279673]: 2025-11-28 10:05:05.762 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:07 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:07.308 261084 INFO neutron.agent.linux.ip_lib [None req-400d09d0-dbfd-4c26-b2f8-8d9d865d4979 - - - - - -] Device tap56bccdb5-6f cannot be used as it has no MAC address#033[00m Nov 28 05:05:07 localhost nova_compute[279673]: 2025-11-28 10:05:07.341 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:07 localhost kernel: device tap56bccdb5-6f entered promiscuous mode Nov 28 05:05:07 localhost NetworkManager[5967]: [1764324307.3503] manager: (tap56bccdb5-6f): new Generic device (/org/freedesktop/NetworkManager/Devices/55) Nov 28 05:05:07 localhost ovn_controller[152322]: 2025-11-28T10:05:07Z|00328|binding|INFO|Claiming lport 56bccdb5-6fe9-44a9-92c9-93e6e7a30192 for this chassis. 
Nov 28 05:05:07 localhost ovn_controller[152322]: 2025-11-28T10:05:07Z|00329|binding|INFO|56bccdb5-6fe9-44a9-92c9-93e6e7a30192: Claiming unknown Nov 28 05:05:07 localhost nova_compute[279673]: 2025-11-28 10:05:07.353 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:07 localhost systemd-udevd[320887]: Network interface NamePolicy= disabled on kernel command line. Nov 28 05:05:07 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:07.367 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe3d:5072/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=56bccdb5-6fe9-44a9-92c9-93e6e7a30192) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:05:07 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:07.370 
158130 INFO neutron.agent.ovn.metadata.agent [-] Port 56bccdb5-6fe9-44a9-92c9-93e6e7a30192 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis#033[00m Nov 28 05:05:07 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:07.373 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 980ba5c9-99ee-41f5-8394-98a27123bb4d IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 28 05:05:07 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:07.373 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:05:07 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:07.374 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[9b8318cc-3444-42a4-91f4-515908b5a752]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:05:07 localhost journal[227875]: ethtool ioctl error on tap56bccdb5-6f: No such device Nov 28 05:05:07 localhost ovn_controller[152322]: 2025-11-28T10:05:07Z|00330|binding|INFO|Setting lport 56bccdb5-6fe9-44a9-92c9-93e6e7a30192 ovn-installed in OVS Nov 28 05:05:07 localhost journal[227875]: ethtool ioctl error on tap56bccdb5-6f: No such device Nov 28 05:05:07 localhost ovn_controller[152322]: 2025-11-28T10:05:07Z|00331|binding|INFO|Setting lport 56bccdb5-6fe9-44a9-92c9-93e6e7a30192 up in Southbound Nov 28 05:05:07 localhost nova_compute[279673]: 2025-11-28 10:05:07.400 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:07 localhost journal[227875]: ethtool ioctl error on tap56bccdb5-6f: No such device Nov 28 05:05:07 localhost journal[227875]: ethtool ioctl 
error on tap56bccdb5-6f: No such device Nov 28 05:05:07 localhost journal[227875]: ethtool ioctl error on tap56bccdb5-6f: No such device Nov 28 05:05:07 localhost journal[227875]: ethtool ioctl error on tap56bccdb5-6f: No such device Nov 28 05:05:07 localhost journal[227875]: ethtool ioctl error on tap56bccdb5-6f: No such device Nov 28 05:05:07 localhost journal[227875]: ethtool ioctl error on tap56bccdb5-6f: No such device Nov 28 05:05:07 localhost nova_compute[279673]: 2025-11-28 10:05:07.457 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:07 localhost nova_compute[279673]: 2025-11-28 10:05:07.489 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:08 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:08.235 2 INFO neutron.agent.securitygroups_rpc [None req-687de3fc-aa2f-4499-970c-3ba0a56c0388 4d2a0ee370da4e688f1d8f5c639f278d ac0f848cde7c47c998a8b80087a3b59d - - default default] Security group member updated ['6784caad-903b-4e5a-b2eb-6bb1d9f8710a']#033[00m Nov 28 05:05:08 localhost podman[320958]: Nov 28 05:05:08 localhost podman[320958]: 2025-11-28 10:05:08.379190523 +0000 UTC m=+0.088496897 container create 0cf4e56e8217bf4cf9c9e2f90f0b7996959dbb50a2a9fcfea8e0c2bee8fcb38f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 28 05:05:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 05:05:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 05:05:08 localhost systemd[1]: Started libpod-conmon-0cf4e56e8217bf4cf9c9e2f90f0b7996959dbb50a2a9fcfea8e0c2bee8fcb38f.scope. Nov 28 05:05:08 localhost podman[320958]: 2025-11-28 10:05:08.336310605 +0000 UTC m=+0.045617029 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:05:08 localhost systemd[1]: tmp-crun.M8Dvxf.mount: Deactivated successfully. Nov 28 05:05:08 localhost systemd[1]: Started libcrun container. Nov 28 05:05:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b6069e3726f8d077e724937be22b6056e6a37f5666120f5627dd7e6f3d4cdecf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:05:08 localhost podman[320958]: 2025-11-28 10:05:08.464643113 +0000 UTC m=+0.173949497 container init 0cf4e56e8217bf4cf9c9e2f90f0b7996959dbb50a2a9fcfea8e0c2bee8fcb38f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:05:08 localhost podman[320958]: 2025-11-28 10:05:08.475310887 +0000 UTC m=+0.184617271 container start 0cf4e56e8217bf4cf9c9e2f90f0b7996959dbb50a2a9fcfea8e0c2bee8fcb38f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, 
maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125) Nov 28 05:05:08 localhost dnsmasq[320998]: started, version 2.85 cachesize 150 Nov 28 05:05:08 localhost dnsmasq[320998]: DNS service limited to local subnets Nov 28 05:05:08 localhost dnsmasq[320998]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:05:08 localhost dnsmasq[320998]: warning: no upstream servers configured Nov 28 05:05:08 localhost dnsmasq[320998]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:05:08 localhost podman[320972]: 2025-11-28 10:05:08.515730436 +0000 UTC m=+0.087770927 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', 
'--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 28 05:05:08 localhost podman[320972]: 2025-11-28 10:05:08.524729634 +0000 UTC m=+0.096770195 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 05:05:08 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 05:05:08 localhost podman[320973]: 2025-11-28 10:05:08.569399784 +0000 UTC m=+0.138748467 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd) Nov 28 05:05:08 localhost podman[320973]: 2025-11-28 10:05:08.579172373 +0000 UTC m=+0.148521076 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible) Nov 28 05:05:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:08.586 261084 INFO neutron.agent.dhcp.agent [None req-5d88e603-6df3-4f8d-8add-470612fe3b7f - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f'} is completed#033[00m Nov 28 05:05:08 localhost 
systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. Nov 28 05:05:08 localhost dnsmasq[320998]: exiting on receipt of SIGTERM Nov 28 05:05:08 localhost podman[321036]: 2025-11-28 10:05:08.891877514 +0000 UTC m=+0.065700433 container kill 0cf4e56e8217bf4cf9c9e2f90f0b7996959dbb50a2a9fcfea8e0c2bee8fcb38f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 28 05:05:08 localhost systemd[1]: libpod-0cf4e56e8217bf4cf9c9e2f90f0b7996959dbb50a2a9fcfea8e0c2bee8fcb38f.scope: Deactivated successfully. Nov 28 05:05:08 localhost podman[321050]: 2025-11-28 10:05:08.967086049 +0000 UTC m=+0.057006634 container died 0cf4e56e8217bf4cf9c9e2f90f0b7996959dbb50a2a9fcfea8e0c2bee8fcb38f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 28 05:05:09 localhost podman[321050]: 2025-11-28 10:05:09.000936839 +0000 UTC m=+0.090857454 container cleanup 0cf4e56e8217bf4cf9c9e2f90f0b7996959dbb50a2a9fcfea8e0c2bee8fcb38f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:05:09 localhost systemd[1]: libpod-conmon-0cf4e56e8217bf4cf9c9e2f90f0b7996959dbb50a2a9fcfea8e0c2bee8fcb38f.scope: Deactivated successfully. Nov 28 05:05:09 localhost podman[321051]: 2025-11-28 10:05:09.045930588 +0000 UTC m=+0.131722775 container remove 0cf4e56e8217bf4cf9c9e2f90f0b7996959dbb50a2a9fcfea8e0c2bee8fcb38f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0) Nov 28 05:05:09 localhost systemd[1]: var-lib-containers-storage-overlay-b6069e3726f8d077e724937be22b6056e6a37f5666120f5627dd7e6f3d4cdecf-merged.mount: Deactivated successfully. Nov 28 05:05:09 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0cf4e56e8217bf4cf9c9e2f90f0b7996959dbb50a2a9fcfea8e0c2bee8fcb38f-userdata-shm.mount: Deactivated successfully. 
Nov 28 05:05:10 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:05:10 localhost podman[238687]: time="2025-11-28T10:05:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:05:10 localhost podman[238687]: @ - - [28/Nov/2025:10:05:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157512 "" "Go-http-client/1.1" Nov 28 05:05:10 localhost podman[238687]: @ - - [28/Nov/2025:10:05:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19744 "" "Go-http-client/1.1" Nov 28 05:05:10 localhost podman[321127]: Nov 28 05:05:10 localhost podman[321127]: 2025-11-28 10:05:10.331808814 +0000 UTC m=+0.095732744 container create d6cef915e2969ba56885daaa83243a723b58023ba557435116d97136d2d8deb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 28 05:05:10 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:10.357 2 INFO neutron.agent.securitygroups_rpc [None req-20bc108e-ba14-415c-a7d5-38bda2943a27 a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']#033[00m Nov 28 05:05:10 localhost systemd[1]: Started libpod-conmon-d6cef915e2969ba56885daaa83243a723b58023ba557435116d97136d2d8deb4.scope. Nov 28 05:05:10 localhost systemd[1]: Started libcrun container. 
Nov 28 05:05:10 localhost podman[321127]: 2025-11-28 10:05:10.286056013 +0000 UTC m=+0.049979973 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:05:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd81c9126a3e041d11bddaaea13e7a044ce378612bdfad66b863e26327980a74/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:05:10 localhost podman[321127]: 2025-11-28 10:05:10.399917875 +0000 UTC m=+0.163841815 container init d6cef915e2969ba56885daaa83243a723b58023ba557435116d97136d2d8deb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:05:10 localhost podman[321127]: 2025-11-28 10:05:10.410511179 +0000 UTC m=+0.174435099 container start d6cef915e2969ba56885daaa83243a723b58023ba557435116d97136d2d8deb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Nov 28 05:05:10 localhost dnsmasq[321145]: started, version 2.85 cachesize 150 Nov 28 05:05:10 localhost dnsmasq[321145]: DNS service limited to local subnets Nov 28 05:05:10 localhost dnsmasq[321145]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:05:10 localhost dnsmasq[321145]: warning: no upstream servers configured Nov 28 05:05:10 localhost dnsmasq-dhcp[321145]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Nov 28 05:05:10 localhost dnsmasq[321145]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:05:10 localhost dnsmasq-dhcp[321145]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:05:10 localhost dnsmasq-dhcp[321145]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:05:10 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:10.455 2 INFO neutron.agent.securitygroups_rpc [None req-dee3de84-6e3e-4d2c-b4d4-a44f07424a62 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:05:10 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:10.538 261084 INFO neutron.agent.dhcp.agent [None req-7cdb70de-6dff-4cbc-b711-8dce3cd0e260 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '56bccdb5-6fe9-44a9-92c9-93e6e7a30192'} is completed#033[00m Nov 28 05:05:10 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:10.569 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:05:09Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=ef68a835-f86a-4989-8b1b-ac699035eeaa, ip_allocation=immediate, mac_address=fa:16:3e:24:c7:20, name=tempest-NetworksTestDHCPv6-452102375, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], 
created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=49, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['4a85d5d0-7333-4fd9-9755-4bb7ab639c94', 'c845e4fa-0400-41ee-a338-1ba8f32c337c'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:05:08Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], standard_attr_id=1873, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:05:10Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1#033[00m Nov 28 05:05:10 localhost nova_compute[279673]: 2025-11-28 10:05:10.618 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:10 localhost nova_compute[279673]: 2025-11-28 10:05:10.759 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:10 localhost dnsmasq[321145]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 2 addresses Nov 28 05:05:10 localhost dnsmasq-dhcp[321145]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:05:10 localhost dnsmasq-dhcp[321145]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:05:10 localhost podman[321165]: 
2025-11-28 10:05:10.799873986 +0000 UTC m=+0.068321209 container kill d6cef915e2969ba56885daaa83243a723b58023ba557435116d97136d2d8deb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 28 05:05:11 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:11.073 261084 INFO neutron.agent.dhcp.agent [None req-b7d10e26-9fc9-45e3-94dc-7691847df9c0 - - - - - -] DHCP configuration for ports {'ef68a835-f86a-4989-8b1b-ac699035eeaa'} is completed#033[00m Nov 28 05:05:11 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:11.361 2 INFO neutron.agent.securitygroups_rpc [None req-dada4afd-3538-47ee-91fc-3d547e9c2d44 a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']#033[00m Nov 28 05:05:11 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:11.648 2 INFO neutron.agent.securitygroups_rpc [None req-3c61a2a4-46f5-45d3-8f70-86c422c843ed 4d2a0ee370da4e688f1d8f5c639f278d ac0f848cde7c47c998a8b80087a3b59d - - default default] Security group member updated ['6784caad-903b-4e5a-b2eb-6bb1d9f8710a']#033[00m Nov 28 05:05:11 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:11.704 2 INFO neutron.agent.securitygroups_rpc [None req-bb3cde64-53ac-43b6-992c-b149fee8302c 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:05:11 localhost nova_compute[279673]: 2025-11-28 10:05:11.830 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:11 localhost podman[321202]: 2025-11-28 10:05:11.942113224 +0000 UTC m=+0.052223797 container kill d6cef915e2969ba56885daaa83243a723b58023ba557435116d97136d2d8deb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:05:11 localhost dnsmasq[321145]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:05:11 localhost dnsmasq-dhcp[321145]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:05:11 localhost dnsmasq-dhcp[321145]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:05:11 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:11.965 2 INFO neutron.agent.securitygroups_rpc [None req-2c04ab47-4041-48cd-a577-3ad3edc1e57b a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']#033[00m Nov 28 05:05:12 localhost dnsmasq[321145]: exiting on receipt of SIGTERM Nov 28 05:05:12 localhost podman[321240]: 2025-11-28 10:05:12.990140644 +0000 UTC m=+0.060983409 container kill d6cef915e2969ba56885daaa83243a723b58023ba557435116d97136d2d8deb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:05:12 localhost systemd[1]: libpod-d6cef915e2969ba56885daaa83243a723b58023ba557435116d97136d2d8deb4.scope: Deactivated successfully. Nov 28 05:05:13 localhost podman[321254]: 2025-11-28 10:05:13.07340909 +0000 UTC m=+0.064874670 container died d6cef915e2969ba56885daaa83243a723b58023ba557435116d97136d2d8deb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3) Nov 28 05:05:13 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d6cef915e2969ba56885daaa83243a723b58023ba557435116d97136d2d8deb4-userdata-shm.mount: Deactivated successfully. 
Nov 28 05:05:13 localhost podman[321254]: 2025-11-28 10:05:13.105415917 +0000 UTC m=+0.096881457 container cleanup d6cef915e2969ba56885daaa83243a723b58023ba557435116d97136d2d8deb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:05:13 localhost systemd[1]: libpod-conmon-d6cef915e2969ba56885daaa83243a723b58023ba557435116d97136d2d8deb4.scope: Deactivated successfully. Nov 28 05:05:13 localhost podman[321255]: 2025-11-28 10:05:13.145207647 +0000 UTC m=+0.131010734 container remove d6cef915e2969ba56885daaa83243a723b58023ba557435116d97136d2d8deb4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:05:13 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:13.285 2 INFO neutron.agent.securitygroups_rpc [None req-865841f9-78fc-4d59-bed2-9c90df3d7ecb a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']#033[00m Nov 28 05:05:13 localhost systemd[1]: var-lib-containers-storage-overlay-bd81c9126a3e041d11bddaaea13e7a044ce378612bdfad66b863e26327980a74-merged.mount: Deactivated successfully. 
Nov 28 05:05:14 localhost podman[321333]: Nov 28 05:05:14 localhost podman[321333]: 2025-11-28 10:05:14.028135457 +0000 UTC m=+0.087248962 container create 24c1e771cd71fd364067cf1ba78873c4dcc2e919b0f1b1f00343a6d218a15cca (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:05:14 localhost systemd[1]: Started libpod-conmon-24c1e771cd71fd364067cf1ba78873c4dcc2e919b0f1b1f00343a6d218a15cca.scope. Nov 28 05:05:14 localhost systemd[1]: Started libcrun container. Nov 28 05:05:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3804d8dee1d9232b07d6f91fedd0b9049c42d7f50f3153a52c55a3ee897b276c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:05:14 localhost podman[321333]: 2025-11-28 10:05:13.987537563 +0000 UTC m=+0.046651118 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:05:14 localhost podman[321333]: 2025-11-28 10:05:14.12491447 +0000 UTC m=+0.184027975 container init 24c1e771cd71fd364067cf1ba78873c4dcc2e919b0f1b1f00343a6d218a15cca (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2) Nov 28 
05:05:14 localhost podman[321333]: 2025-11-28 10:05:14.141249208 +0000 UTC m=+0.200362723 container start 24c1e771cd71fd364067cf1ba78873c4dcc2e919b0f1b1f00343a6d218a15cca (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 28 05:05:14 localhost dnsmasq[321352]: started, version 2.85 cachesize 150 Nov 28 05:05:14 localhost dnsmasq[321352]: DNS service limited to local subnets Nov 28 05:05:14 localhost dnsmasq[321352]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:05:14 localhost dnsmasq[321352]: warning: no upstream servers configured Nov 28 05:05:14 localhost dnsmasq[321352]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:05:14 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:14.378 261084 INFO neutron.agent.dhcp.agent [None req-b20126a7-5ad8-4f4a-8aea-f023fd1f289c - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', '56bccdb5-6fe9-44a9-92c9-93e6e7a30192'} is completed#033[00m Nov 28 05:05:14 localhost dnsmasq[321352]: exiting on receipt of SIGTERM Nov 28 05:05:14 localhost podman[321370]: 2025-11-28 10:05:14.493528552 +0000 UTC m=+0.064468828 container kill 24c1e771cd71fd364067cf1ba78873c4dcc2e919b0f1b1f00343a6d218a15cca (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.name=CentOS 
Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:05:14 localhost systemd[1]: libpod-24c1e771cd71fd364067cf1ba78873c4dcc2e919b0f1b1f00343a6d218a15cca.scope: Deactivated successfully. Nov 28 05:05:14 localhost podman[321385]: 2025-11-28 10:05:14.576585012 +0000 UTC m=+0.066761775 container died 24c1e771cd71fd364067cf1ba78873c4dcc2e919b0f1b1f00343a6d218a15cca (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Nov 28 05:05:14 localhost podman[321385]: 2025-11-28 10:05:14.616282829 +0000 UTC m=+0.106459552 container cleanup 24c1e771cd71fd364067cf1ba78873c4dcc2e919b0f1b1f00343a6d218a15cca (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:05:14 localhost systemd[1]: libpod-conmon-24c1e771cd71fd364067cf1ba78873c4dcc2e919b0f1b1f00343a6d218a15cca.scope: Deactivated successfully. 
Nov 28 05:05:14 localhost podman[321387]: 2025-11-28 10:05:14.671317546 +0000 UTC m=+0.155602059 container remove 24c1e771cd71fd364067cf1ba78873c4dcc2e919b0f1b1f00343a6d218a15cca (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:05:14 localhost nova_compute[279673]: 2025-11-28 10:05:14.686 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:14 localhost kernel: device tap56bccdb5-6f left promiscuous mode Nov 28 05:05:14 localhost ovn_controller[152322]: 2025-11-28T10:05:14Z|00332|binding|INFO|Releasing lport 56bccdb5-6fe9-44a9-92c9-93e6e7a30192 from this chassis (sb_readonly=0) Nov 28 05:05:14 localhost ovn_controller[152322]: 2025-11-28T10:05:14Z|00333|binding|INFO|Setting lport 56bccdb5-6fe9-44a9-92c9-93e6e7a30192 down in Southbound Nov 28 05:05:14 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:14.696 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1::2/64 2001:db8::f816:3eff:fe3d:5072/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 
'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=56bccdb5-6fe9-44a9-92c9-93e6e7a30192) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:05:14 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:14.698 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 56bccdb5-6fe9-44a9-92c9-93e6e7a30192 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis#033[00m Nov 28 05:05:14 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:14.702 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:05:14 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:14.703 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[bf5f7241-ecc9-4b0e-a681-f4217f3bff0c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:05:14 localhost nova_compute[279673]: 2025-11-28 10:05:14.705 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:14 localhost nova_compute[279673]: 2025-11-28 10:05:14.707 279685 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:14 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:14.919 261084 INFO neutron.agent.dhcp.agent [None req-7495b5dd-61b5-4d0e-b866-d4f43ab0d53d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:05:14 localhost systemd[1]: var-lib-containers-storage-overlay-3804d8dee1d9232b07d6f91fedd0b9049c42d7f50f3153a52c55a3ee897b276c-merged.mount: Deactivated successfully. Nov 28 05:05:14 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-24c1e771cd71fd364067cf1ba78873c4dcc2e919b0f1b1f00343a6d218a15cca-userdata-shm.mount: Deactivated successfully. Nov 28 05:05:14 localhost systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully. Nov 28 05:05:15 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:05:15 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:15.474 2 INFO neutron.agent.securitygroups_rpc [None req-95a3885e-d25b-4b35-9841-10ba9cd222bb a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']#033[00m Nov 28 05:05:15 localhost nova_compute[279673]: 2025-11-28 10:05:15.620 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:15 localhost nova_compute[279673]: 2025-11-28 10:05:15.803 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:16 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:16.732 2 INFO neutron.agent.securitygroups_rpc [None req-8e314c79-26fe-4adc-939f-e587c4e62ad2 
a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']#033[00m Nov 28 05:05:17 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:17.030 261084 INFO neutron.agent.linux.ip_lib [None req-c8307bfa-67b4-44b4-bd88-df6a413c4b68 - - - - - -] Device tapeb25319d-f0 cannot be used as it has no MAC address#033[00m Nov 28 05:05:17 localhost nova_compute[279673]: 2025-11-28 10:05:17.099 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:17 localhost kernel: device tapeb25319d-f0 entered promiscuous mode Nov 28 05:05:17 localhost NetworkManager[5967]: [1764324317.1082] manager: (tapeb25319d-f0): new Generic device (/org/freedesktop/NetworkManager/Devices/56) Nov 28 05:05:17 localhost ovn_controller[152322]: 2025-11-28T10:05:17Z|00334|binding|INFO|Claiming lport eb25319d-f07e-4bef-a7f6-ca024599d184 for this chassis. Nov 28 05:05:17 localhost ovn_controller[152322]: 2025-11-28T10:05:17Z|00335|binding|INFO|eb25319d-f07e-4bef-a7f6-ca024599d184: Claiming unknown Nov 28 05:05:17 localhost nova_compute[279673]: 2025-11-28 10:05:17.112 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:17 localhost systemd-udevd[321424]: Network interface NamePolicy= disabled on kernel command line. 
Nov 28 05:05:17 localhost ovn_controller[152322]: 2025-11-28T10:05:17Z|00336|binding|INFO|Setting lport eb25319d-f07e-4bef-a7f6-ca024599d184 ovn-installed in OVS Nov 28 05:05:17 localhost nova_compute[279673]: 2025-11-28 10:05:17.118 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:17 localhost ovn_controller[152322]: 2025-11-28T10:05:17Z|00337|binding|INFO|Setting lport eb25319d-f07e-4bef-a7f6-ca024599d184 up in Southbound Nov 28 05:05:17 localhost nova_compute[279673]: 2025-11-28 10:05:17.122 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:17 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:17.121 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:febd:20cb/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, 
gateway_chassis=[], requested_chassis=[], logical_port=eb25319d-f07e-4bef-a7f6-ca024599d184) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:05:17 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:17.123 158130 INFO neutron.agent.ovn.metadata.agent [-] Port eb25319d-f07e-4bef-a7f6-ca024599d184 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 bound to our chassis#033[00m Nov 28 05:05:17 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:17.126 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 034720ca-968c-4c23-b6b3-fb448c9725c8 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 28 05:05:17 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:17.126 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:05:17 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:17.127 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[9cd157a1-2d35-4f5c-b80c-7718e668f3ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:05:17 localhost journal[227875]: ethtool ioctl error on tapeb25319d-f0: No such device Nov 28 05:05:17 localhost journal[227875]: ethtool ioctl error on tapeb25319d-f0: No such device Nov 28 05:05:17 localhost nova_compute[279673]: 2025-11-28 10:05:17.151 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:17 localhost journal[227875]: ethtool ioctl error on tapeb25319d-f0: No such device Nov 28 05:05:17 localhost journal[227875]: ethtool ioctl error on tapeb25319d-f0: No such 
device Nov 28 05:05:17 localhost journal[227875]: ethtool ioctl error on tapeb25319d-f0: No such device Nov 28 05:05:17 localhost journal[227875]: ethtool ioctl error on tapeb25319d-f0: No such device Nov 28 05:05:17 localhost journal[227875]: ethtool ioctl error on tapeb25319d-f0: No such device Nov 28 05:05:17 localhost journal[227875]: ethtool ioctl error on tapeb25319d-f0: No such device Nov 28 05:05:17 localhost nova_compute[279673]: 2025-11-28 10:05:17.195 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:17 localhost nova_compute[279673]: 2025-11-28 10:05:17.231 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:18 localhost podman[321495]: Nov 28 05:05:18 localhost openstack_network_exporter[240658]: ERROR 10:05:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 05:05:18 localhost openstack_network_exporter[240658]: ERROR 10:05:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:05:18 localhost openstack_network_exporter[240658]: ERROR 10:05:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:05:18 localhost openstack_network_exporter[240658]: ERROR 10:05:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 05:05:18 localhost openstack_network_exporter[240658]: Nov 28 05:05:18 localhost openstack_network_exporter[240658]: ERROR 10:05:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 05:05:18 localhost openstack_network_exporter[240658]: Nov 28 05:05:18 localhost podman[321495]: 2025-11-28 10:05:18.103625925 +0000 UTC m=+0.092348617 container create 
493424886c595aa6ffa9620d7a7213b71b14d5bf1ae32cac4298c71b921b95d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Nov 28 05:05:18 localhost podman[321495]: 2025-11-28 10:05:18.056111624 +0000 UTC m=+0.044834316 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:05:18 localhost systemd[1]: Started libpod-conmon-493424886c595aa6ffa9620d7a7213b71b14d5bf1ae32cac4298c71b921b95d2.scope. Nov 28 05:05:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 05:05:18 localhost systemd[1]: Started libcrun container. 
Nov 28 05:05:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9dd9c406a4caf236febbbc1bd2d17248727709a1c53c4b98aa26be3cd147678/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:05:18 localhost podman[321495]: 2025-11-28 10:05:18.214488042 +0000 UTC m=+0.203210714 container init 493424886c595aa6ffa9620d7a7213b71b14d5bf1ae32cac4298c71b921b95d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125) Nov 28 05:05:18 localhost podman[321495]: 2025-11-28 10:05:18.223308165 +0000 UTC m=+0.212030837 container start 493424886c595aa6ffa9620d7a7213b71b14d5bf1ae32cac4298c71b921b95d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0) Nov 28 05:05:18 localhost dnsmasq[321520]: started, version 2.85 cachesize 150 Nov 28 05:05:18 localhost dnsmasq[321520]: DNS service limited to local subnets Nov 28 05:05:18 localhost dnsmasq[321520]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:05:18 localhost dnsmasq[321520]: warning: no upstream servers 
configured Nov 28 05:05:18 localhost dnsmasq-dhcp[321520]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 28 05:05:18 localhost dnsmasq[321520]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:05:18 localhost dnsmasq-dhcp[321520]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:05:18 localhost dnsmasq-dhcp[321520]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:05:18 localhost podman[321513]: 2025-11-28 10:05:18.297266554 +0000 UTC m=+0.092338816 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, vcs-type=git, architecture=x86_64, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible) Nov 28 05:05:18 localhost podman[321513]: 2025-11-28 10:05:18.31353001 +0000 UTC m=+0.108602252 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7) Nov 28 05:05:18 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 05:05:18 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:18.334 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:05:18 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:18.411 261084 INFO neutron.agent.dhcp.agent [None req-57221227-0637-48ff-a646-1518f824b861 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f'} is completed#033[00m Nov 28 05:05:18 localhost dnsmasq[321520]: exiting on receipt of SIGTERM Nov 28 05:05:18 localhost systemd[1]: libpod-493424886c595aa6ffa9620d7a7213b71b14d5bf1ae32cac4298c71b921b95d2.scope: Deactivated successfully. Nov 28 05:05:18 localhost podman[321553]: 2025-11-28 10:05:18.700251541 +0000 UTC m=+0.084967965 container kill 493424886c595aa6ffa9620d7a7213b71b14d5bf1ae32cac4298c71b921b95d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 28 05:05:18 localhost podman[321566]: 2025-11-28 10:05:18.777459294 +0000 UTC m=+0.063492851 container died 493424886c595aa6ffa9620d7a7213b71b14d5bf1ae32cac4298c71b921b95d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125) Nov 28 
05:05:18 localhost podman[321566]: 2025-11-28 10:05:18.819166139 +0000 UTC m=+0.105199646 container cleanup 493424886c595aa6ffa9620d7a7213b71b14d5bf1ae32cac4298c71b921b95d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:05:18 localhost systemd[1]: libpod-conmon-493424886c595aa6ffa9620d7a7213b71b14d5bf1ae32cac4298c71b921b95d2.scope: Deactivated successfully. Nov 28 05:05:18 localhost nova_compute[279673]: 2025-11-28 10:05:18.826 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:18 localhost podman[321571]: 2025-11-28 10:05:18.868336417 +0000 UTC m=+0.137093399 container remove 493424886c595aa6ffa9620d7a7213b71b14d5bf1ae32cac4298c71b921b95d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:05:19 localhost systemd[1]: var-lib-containers-storage-overlay-f9dd9c406a4caf236febbbc1bd2d17248727709a1c53c4b98aa26be3cd147678-merged.mount: Deactivated successfully. 
Nov 28 05:05:19 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-493424886c595aa6ffa9620d7a7213b71b14d5bf1ae32cac4298c71b921b95d2-userdata-shm.mount: Deactivated successfully. Nov 28 05:05:19 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:19.181 2 INFO neutron.agent.securitygroups_rpc [None req-8fd6935d-936a-4ee4-abda-f9675ff09d60 a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']#033[00m Nov 28 05:05:19 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:19.799 2 INFO neutron.agent.securitygroups_rpc [None req-5f6fc136-bb5c-4f44-93cc-8b8331d1b8c7 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:05:20 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:05:20 localhost podman[321647]: Nov 28 05:05:20 localhost podman[321647]: 2025-11-28 10:05:20.370282213 +0000 UTC m=+0.094963871 container create 7a79177c367286746847b4c28dc2291742deeb0c34a2199b825787c87c15c4b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 28 05:05:20 localhost systemd[1]: Started libpod-conmon-7a79177c367286746847b4c28dc2291742deeb0c34a2199b825787c87c15c4b4.scope. 
Nov 28 05:05:20 localhost podman[321647]: 2025-11-28 10:05:20.326260623 +0000 UTC m=+0.050942321 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:05:20 localhost systemd[1]: Started libcrun container. Nov 28 05:05:20 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:20.445 2 INFO neutron.agent.securitygroups_rpc [None req-08674235-5e89-42a7-aefb-bb66d7c4db90 a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']#033[00m Nov 28 05:05:20 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b03c89dcf0faf8109c5c785fa3ed5032c1fb0ecf301a75e745f5f4a7b8eb2d71/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:05:20 localhost podman[321647]: 2025-11-28 10:05:20.460057187 +0000 UTC m=+0.184738845 container init 7a79177c367286746847b4c28dc2291742deeb0c34a2199b825787c87c15c4b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS) Nov 28 05:05:20 localhost podman[321647]: 2025-11-28 10:05:20.470379522 +0000 UTC m=+0.195061180 container start 7a79177c367286746847b4c28dc2291742deeb0c34a2199b825787c87c15c4b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, 
org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true) Nov 28 05:05:20 localhost dnsmasq[321665]: started, version 2.85 cachesize 150 Nov 28 05:05:20 localhost dnsmasq[321665]: DNS service limited to local subnets Nov 28 05:05:20 localhost dnsmasq[321665]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:05:20 localhost dnsmasq[321665]: warning: no upstream servers configured Nov 28 05:05:20 localhost dnsmasq-dhcp[321665]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Nov 28 05:05:20 localhost dnsmasq-dhcp[321665]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 28 05:05:20 localhost dnsmasq[321665]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:05:20 localhost dnsmasq-dhcp[321665]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:05:20 localhost dnsmasq-dhcp[321665]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:05:20 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:20.543 261084 INFO neutron.agent.dhcp.agent [None req-2b6e1a79-43d2-4680-90fa-c549dc6edda9 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:05:19Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=a1a7bb08-d189-4970-b6b2-630de75a7603, ip_allocation=immediate, mac_address=fa:16:3e:6a:01:fa, name=tempest-NetworksTestDHCPv6-636257890, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, 
id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=53, router:external=False, shared=False, standard_attr_id=1266, status=ACTIVE, subnets=['58dd9090-d868-4db4-bc1b-d85af9f4c5b9', '5a9525ff-24a1-498c-a69f-7eb8d2a6b3f0'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:05:18Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], standard_attr_id=1911, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:05:19Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1#033[00m Nov 28 05:05:20 localhost nova_compute[279673]: 2025-11-28 10:05:20.650 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:20 localhost dnsmasq[321665]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 2 addresses Nov 28 05:05:20 localhost podman[321684]: 2025-11-28 10:05:20.753758932 +0000 UTC m=+0.052368531 container kill 7a79177c367286746847b4c28dc2291742deeb0c34a2199b825787c87c15c4b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 28 05:05:20 localhost dnsmasq-dhcp[321665]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:05:20 localhost dnsmasq-dhcp[321665]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:05:20 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:20.777 261084 INFO neutron.agent.dhcp.agent [None req-6103797e-56b6-4637-9222-0c100f3c856f - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', 'eb25319d-f07e-4bef-a7f6-ca024599d184'} is completed#033[00m Nov 28 05:05:20 localhost nova_compute[279673]: 2025-11-28 10:05:20.807 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:20 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:20.934 2 INFO neutron.agent.securitygroups_rpc [None req-ff8acaea-6e1e-4ede-81e1-a7305639837a 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:05:21 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:21.000 261084 INFO neutron.agent.dhcp.agent [None req-08f71709-211c-40a5-a39d-ca60eb6b9f7a - - - - - -] DHCP configuration for ports {'a1a7bb08-d189-4970-b6b2-630de75a7603'} is completed#033[00m Nov 28 05:05:21 localhost dnsmasq[321665]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:05:21 localhost dnsmasq-dhcp[321665]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:05:21 localhost podman[321722]: 2025-11-28 10:05:21.179303155 +0000 UTC m=+0.068164824 container kill 7a79177c367286746847b4c28dc2291742deeb0c34a2199b825787c87c15c4b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:05:21 localhost dnsmasq-dhcp[321665]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:05:21 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:21.291 2 INFO neutron.agent.securitygroups_rpc [None req-f5c2db0c-7ccc-47c0-8119-58c735104780 a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']#033[00m Nov 28 05:05:21 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:21.317 2 INFO neutron.agent.securitygroups_rpc [None req-baf56349-7def-4de1-b8e1-b12f76677166 4d2a0ee370da4e688f1d8f5c639f278d ac0f848cde7c47c998a8b80087a3b59d - - default default] Security group member updated ['6784caad-903b-4e5a-b2eb-6bb1d9f8710a']#033[00m Nov 28 05:05:21 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:21.350 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:05:21 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:21.420 2 INFO neutron.agent.securitygroups_rpc [None req-1ffc7eeb-b258-4bda-bcd4-9d4ef92a7b1f 48f54094604448288052e0203d18d8df 45fdbe27569f45449de58f1d1899ceea - - default default] Security group member updated ['0eb534d2-8077-4b6c-8550-0df9f702073c']#033[00m Nov 28 05:05:21 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:21.532 2 INFO neutron.agent.securitygroups_rpc [None req-1ffc7eeb-b258-4bda-bcd4-9d4ef92a7b1f 48f54094604448288052e0203d18d8df 45fdbe27569f45449de58f1d1899ceea - - default default] Security group member updated 
['0eb534d2-8077-4b6c-8550-0df9f702073c']#033[00m Nov 28 05:05:21 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 28 05:05:21 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3152367006' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 28 05:05:21 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 28 05:05:21 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3152367006' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 28 05:05:22 localhost dnsmasq[321665]: exiting on receipt of SIGTERM Nov 28 05:05:22 localhost systemd[1]: libpod-7a79177c367286746847b4c28dc2291742deeb0c34a2199b825787c87c15c4b4.scope: Deactivated successfully. Nov 28 05:05:22 localhost podman[321809]: 2025-11-28 10:05:22.00971203 +0000 UTC m=+0.075011860 container kill 7a79177c367286746847b4c28dc2291742deeb0c34a2199b825787c87c15c4b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:05:22 localhost podman[321826]: 2025-11-28 10:05:22.087901981 +0000 UTC m=+0.061391930 container died 7a79177c367286746847b4c28dc2291742deeb0c34a2199b825787c87c15c4b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:05:22 localhost systemd[1]: tmp-crun.UNUyEQ.mount: Deactivated successfully. Nov 28 05:05:22 localhost podman[321826]: 2025-11-28 10:05:22.220081558 +0000 UTC m=+0.193571527 container cleanup 7a79177c367286746847b4c28dc2291742deeb0c34a2199b825787c87c15c4b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:05:22 localhost systemd[1]: libpod-conmon-7a79177c367286746847b4c28dc2291742deeb0c34a2199b825787c87c15c4b4.scope: Deactivated successfully. 
Nov 28 05:05:22 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Nov 28 05:05:22 localhost podman[321828]: 2025-11-28 10:05:22.244947041 +0000 UTC m=+0.206603911 container remove 7a79177c367286746847b4c28dc2291742deeb0c34a2199b825787c87c15c4b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:05:22 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:05:22 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:22.270 2 INFO neutron.agent.securitygroups_rpc [None req-a89ecc9f-78c7-44a1-98c4-c8700080c0c0 4d2a0ee370da4e688f1d8f5c639f278d ac0f848cde7c47c998a8b80087a3b59d - - default default] Security group member updated ['6784caad-903b-4e5a-b2eb-6bb1d9f8710a']#033[00m Nov 28 05:05:22 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:22.299 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:05:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 05:05:22 localhost systemd[1]: var-lib-containers-storage-overlay-b03c89dcf0faf8109c5c785fa3ed5032c1fb0ecf301a75e745f5f4a7b8eb2d71-merged.mount: Deactivated successfully. Nov 28 05:05:22 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7a79177c367286746847b4c28dc2291742deeb0c34a2199b825787c87c15c4b4-userdata-shm.mount: Deactivated successfully. 
Nov 28 05:05:22 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 05:05:22 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:05:22 localhost podman[321892]: 2025-11-28 10:05:22.459671133 +0000 UTC m=+0.081600909 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 28 05:05:22 localhost podman[321892]: 2025-11-28 10:05:22.494681657 +0000 UTC m=+0.116611443 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 
'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 05:05:22 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 05:05:22 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:22.551 2 INFO neutron.agent.securitygroups_rpc [None req-d94caea8-60f5-4f9f-b771-ee8db26744a5 48f54094604448288052e0203d18d8df 45fdbe27569f45449de58f1d1899ceea - - default default] Security group member updated ['0eb534d2-8077-4b6c-8550-0df9f702073c']#033[00m Nov 28 05:05:22 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:22.949 2 INFO neutron.agent.securitygroups_rpc [None req-e4fadf30-c514-41d2-a5af-51c101b5ad33 a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']#033[00m Nov 28 05:05:23 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:23.171 2 INFO neutron.agent.securitygroups_rpc [None req-e4e98f94-fcd2-4011-84de-9dbf12942472 48f54094604448288052e0203d18d8df 45fdbe27569f45449de58f1d1899ceea - - default default] Security group member updated ['0eb534d2-8077-4b6c-8550-0df9f702073c']#033[00m Nov 28 05:05:23 localhost podman[321960]: Nov 28 05:05:23 localhost podman[321960]: 2025-11-28 10:05:23.218446826 +0000 UTC m=+0.091077611 container create 90ceb0d343071c884e99daae5c2d90e744f7ccd543266fb4543aa9ba4147d430 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, 
io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:05:23 localhost systemd[1]: Started libpod-conmon-90ceb0d343071c884e99daae5c2d90e744f7ccd543266fb4543aa9ba4147d430.scope. Nov 28 05:05:23 localhost podman[321960]: 2025-11-28 10:05:23.176851624 +0000 UTC m=+0.049482509 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:05:23 localhost systemd[1]: Started libcrun container. Nov 28 05:05:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb9a48c4685e36add903d5c9ab17afdcaa556ea1a03aa31fa67d0b96b3f2a184/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:05:23 localhost podman[321960]: 2025-11-28 10:05:23.298065477 +0000 UTC m=+0.170696273 container init 90ceb0d343071c884e99daae5c2d90e744f7ccd543266fb4543aa9ba4147d430 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 28 05:05:23 localhost podman[321960]: 2025-11-28 10:05:23.306761887 +0000 UTC m=+0.179392672 container start 90ceb0d343071c884e99daae5c2d90e744f7ccd543266fb4543aa9ba4147d430 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, 
org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:05:23 localhost dnsmasq[321979]: started, version 2.85 cachesize 150 Nov 28 05:05:23 localhost dnsmasq[321979]: DNS service limited to local subnets Nov 28 05:05:23 localhost dnsmasq[321979]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:05:23 localhost dnsmasq[321979]: warning: no upstream servers configured Nov 28 05:05:23 localhost dnsmasq-dhcp[321979]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Nov 28 05:05:23 localhost dnsmasq[321979]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:05:23 localhost dnsmasq-dhcp[321979]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:05:23 localhost dnsmasq-dhcp[321979]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:05:23 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:23.412 2 INFO neutron.agent.securitygroups_rpc [None req-ab356ba5-c3b5-409b-9ecc-7c688fb94166 a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']#033[00m Nov 28 05:05:23 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:23.539 261084 INFO neutron.agent.dhcp.agent [None req-32f4d057-3903-4553-9801-7d3c217fd3ad - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', 'eb25319d-f07e-4bef-a7f6-ca024599d184'} is completed#033[00m Nov 28 05:05:23 localhost dnsmasq[321979]: exiting on receipt of SIGTERM Nov 28 05:05:23 localhost podman[321995]: 2025-11-28 10:05:23.661733987 +0000 UTC m=+0.068146903 container kill 
90ceb0d343071c884e99daae5c2d90e744f7ccd543266fb4543aa9ba4147d430 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 28 05:05:23 localhost systemd[1]: libpod-90ceb0d343071c884e99daae5c2d90e744f7ccd543266fb4543aa9ba4147d430.scope: Deactivated successfully. Nov 28 05:05:23 localhost podman[322007]: 2025-11-28 10:05:23.737434796 +0000 UTC m=+0.058942869 container died 90ceb0d343071c884e99daae5c2d90e744f7ccd543266fb4543aa9ba4147d430 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Nov 28 05:05:23 localhost podman[322007]: 2025-11-28 10:05:23.77105573 +0000 UTC m=+0.092563773 container cleanup 90ceb0d343071c884e99daae5c2d90e744f7ccd543266fb4543aa9ba4147d430 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, 
org.label-schema.license=GPLv2) Nov 28 05:05:23 localhost systemd[1]: libpod-conmon-90ceb0d343071c884e99daae5c2d90e744f7ccd543266fb4543aa9ba4147d430.scope: Deactivated successfully. Nov 28 05:05:23 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:23.800 2 INFO neutron.agent.securitygroups_rpc [None req-237dd594-9b00-4d69-9190-5464bb1ce820 a22fc7f25dd74349be1fe8842a517a9e e8063417cfc540ba8948734a0d2952d7 - - default default] Security group member updated ['f26ff442-2d58-414e-b33d-dcdad9bae629']#033[00m Nov 28 05:05:23 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:23.816 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:05:23 localhost podman[322009]: 2025-11-28 10:05:23.822791593 +0000 UTC m=+0.136710569 container remove 90ceb0d343071c884e99daae5c2d90e744f7ccd543266fb4543aa9ba4147d430 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true) Nov 28 05:05:24 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:24.163 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:05:24 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 28 05:05:24 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1834815268' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 28 05:05:24 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 28 05:05:24 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1834815268' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 28 05:05:24 localhost systemd[1]: var-lib-containers-storage-overlay-bb9a48c4685e36add903d5c9ab17afdcaa556ea1a03aa31fa67d0b96b3f2a184-merged.mount: Deactivated successfully. Nov 28 05:05:24 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-90ceb0d343071c884e99daae5c2d90e744f7ccd543266fb4543aa9ba4147d430-userdata-shm.mount: Deactivated successfully. Nov 28 05:05:24 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:24.436 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:ad:87 2001:db8:0:1:f816:3eff:fef8:ad87'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '30', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], 
tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ef9eb238-2b1e-49f7-8a0f-72efc8854e0f) old=Port_Binding(mac=['fa:16:3e:f8:ad:87 2001:db8::f816:3eff:fef8:ad87'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fef8:ad87/64', 'neutron:device_id': 'ovnmeta-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '28', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:05:24 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:24.440 158130 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ef9eb238-2b1e-49f7-8a0f-72efc8854e0f in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 updated#033[00m Nov 28 05:05:24 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:24.443 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 034720ca-968c-4c23-b6b3-fb448c9725c8 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 28 05:05:24 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:24.443 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:05:24 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:24.445 158233 DEBUG oslo.privsep.daemon [-] 
privsep: reply[bf7ce151-258d-4e89-8d48-62be3ed95d3b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:05:25 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:05:25 localhost podman[322089]: Nov 28 05:05:25 localhost podman[322089]: 2025-11-28 10:05:25.338539085 +0000 UTC m=+0.089140106 container create 91accf782ff3cf731a308ad6ffb0d110d5b9e240beea4be32f785cb2a09074d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true) Nov 28 05:05:25 localhost systemd[1]: Started libpod-conmon-91accf782ff3cf731a308ad6ffb0d110d5b9e240beea4be32f785cb2a09074d2.scope. Nov 28 05:05:25 localhost systemd[1]: tmp-crun.1aL7Bk.mount: Deactivated successfully. Nov 28 05:05:25 localhost podman[322089]: 2025-11-28 10:05:25.3010042 +0000 UTC m=+0.051605231 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:05:25 localhost systemd[1]: Started libcrun container. 
Nov 28 05:05:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d5138906d6fc1bdee0a9b155d7b868d3e377bc1927eaf0edf6a9cc67c712c01/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:05:25 localhost podman[322089]: 2025-11-28 10:05:25.438321244 +0000 UTC m=+0.188922235 container init 91accf782ff3cf731a308ad6ffb0d110d5b9e240beea4be32f785cb2a09074d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:05:25 localhost podman[322089]: 2025-11-28 10:05:25.447923069 +0000 UTC m=+0.198524060 container start 91accf782ff3cf731a308ad6ffb0d110d5b9e240beea4be32f785cb2a09074d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:05:25 localhost dnsmasq[322107]: started, version 2.85 cachesize 150 Nov 28 05:05:25 localhost dnsmasq[322107]: DNS service limited to local subnets Nov 28 05:05:25 localhost dnsmasq[322107]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:05:25 localhost dnsmasq[322107]: warning: no upstream servers 
configured Nov 28 05:05:25 localhost dnsmasq-dhcp[322107]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 28 05:05:25 localhost dnsmasq[322107]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:05:25 localhost dnsmasq-dhcp[322107]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:05:25 localhost dnsmasq-dhcp[322107]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:05:25 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:25.474 2 INFO neutron.agent.securitygroups_rpc [None req-1ee68db6-3f9b-4e96-8e85-23a114c9c60e 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:05:25 localhost nova_compute[279673]: 2025-11-28 10:05:25.653 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:25 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:25.713 261084 INFO neutron.agent.dhcp.agent [None req-0e9b5ee0-b490-441e-ab59-983aa56ee460 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', 'eb25319d-f07e-4bef-a7f6-ca024599d184'} is completed#033[00m Nov 28 05:05:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. 
Nov 28 05:05:25 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Nov 28 05:05:25 localhost nova_compute[279673]: 2025-11-28 10:05:25.850 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:25 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:05:25 localhost dnsmasq[322107]: exiting on receipt of SIGTERM Nov 28 05:05:25 localhost podman[322130]: 2025-11-28 10:05:25.897420678 +0000 UTC m=+0.108605722 container kill 91accf782ff3cf731a308ad6ffb0d110d5b9e240beea4be32f785cb2a09074d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2) Nov 28 05:05:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 05:05:25 localhost systemd[1]: libpod-91accf782ff3cf731a308ad6ffb0d110d5b9e240beea4be32f785cb2a09074d2.scope: Deactivated successfully. 
Nov 28 05:05:25 localhost podman[322123]: 2025-11-28 10:05:25.902188975 +0000 UTC m=+0.135642797 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Nov 28 05:05:25 localhost podman[322123]: 2025-11-28 10:05:25.908725673 +0000 UTC 
m=+0.142179455 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2) Nov 28 05:05:25 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 05:05:26 localhost podman[322150]: 2025-11-28 10:05:26.032622313 +0000 UTC m=+0.121381170 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 28 05:05:26 localhost podman[322152]: 2025-11-28 10:05:26.066907225 +0000 UTC m=+0.092126893 container died 91accf782ff3cf731a308ad6ffb0d110d5b9e240beea4be32f785cb2a09074d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:05:26 localhost podman[322152]: 2025-11-28 10:05:26.091380196 +0000 UTC m=+0.116599834 container cleanup 91accf782ff3cf731a308ad6ffb0d110d5b9e240beea4be32f785cb2a09074d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:05:26 localhost systemd[1]: libpod-conmon-91accf782ff3cf731a308ad6ffb0d110d5b9e240beea4be32f785cb2a09074d2.scope: Deactivated successfully. Nov 28 05:05:26 localhost podman[322150]: 2025-11-28 10:05:26.11490806 +0000 UTC m=+0.203666937 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller) Nov 28 05:05:26 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. Nov 28 05:05:26 localhost podman[322159]: 2025-11-28 10:05:26.197809566 +0000 UTC m=+0.214322454 container remove 91accf782ff3cf731a308ad6ffb0d110d5b9e240beea4be32f785cb2a09074d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:05:26 localhost systemd[1]: var-lib-containers-storage-overlay-8d5138906d6fc1bdee0a9b155d7b868d3e377bc1927eaf0edf6a9cc67c712c01-merged.mount: Deactivated successfully. Nov 28 05:05:26 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-91accf782ff3cf731a308ad6ffb0d110d5b9e240beea4be32f785cb2a09074d2-userdata-shm.mount: Deactivated successfully. Nov 28 05:05:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 28 05:05:26 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/2070944282' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 28 05:05:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 28 05:05:26 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2070944282' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 28 05:05:26 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:05:26 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:26.901 2 INFO neutron.agent.securitygroups_rpc [None req-b7b686b0-df55-421e-8492-2a2961409c1a 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:05:27 localhost podman[322254]: Nov 28 05:05:27 localhost podman[322254]: 2025-11-28 10:05:27.148426825 +0000 UTC m=+0.092809581 container create 8efd976cbd7b204b0083dc3d5cb1ff48a831e0757f90392943e5415cb0813b9e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125) Nov 28 05:05:27 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:27.190 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:05:27 localhost podman[322254]: 2025-11-28 10:05:27.104256399 +0000 UTC m=+0.048639185 image pull 
quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:05:27 localhost systemd[1]: Started libpod-conmon-8efd976cbd7b204b0083dc3d5cb1ff48a831e0757f90392943e5415cb0813b9e.scope. Nov 28 05:05:27 localhost systemd[1]: Started libcrun container. Nov 28 05:05:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46e3f66133a2e22df0c7d7cab94349954b58c472860335b1bc1ff6e5aba2628c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:05:27 localhost podman[322254]: 2025-11-28 10:05:27.237706683 +0000 UTC m=+0.182089449 container init 8efd976cbd7b204b0083dc3d5cb1ff48a831e0757f90392943e5415cb0813b9e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS) Nov 28 05:05:27 localhost podman[322254]: 2025-11-28 10:05:27.247634537 +0000 UTC m=+0.192017293 container start 8efd976cbd7b204b0083dc3d5cb1ff48a831e0757f90392943e5415cb0813b9e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:05:27 localhost dnsmasq[322273]: started, version 2.85 cachesize 150 Nov 28 05:05:27 localhost dnsmasq[322273]: DNS service limited to local subnets Nov 
28 05:05:27 localhost dnsmasq[322273]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:05:27 localhost dnsmasq[322273]: warning: no upstream servers configured Nov 28 05:05:27 localhost dnsmasq-dhcp[322273]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 28 05:05:27 localhost dnsmasq[322273]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 2 addresses Nov 28 05:05:27 localhost dnsmasq-dhcp[322273]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:05:27 localhost dnsmasq-dhcp[322273]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:05:27 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:27.310 261084 INFO neutron.agent.dhcp.agent [None req-27a81b29-092f-412d-b9e4-496af2536f6c - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:05:24Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=073bafeb-50ac-4313-8f33-745fd94b95e2, ip_allocation=immediate, mac_address=fa:16:3e:60:05:4c, name=tempest-NetworksTestDHCPv6-71055213, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:03:14Z, description=, dns_domain=, id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-192795517, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62651, qos_policy_id=None, revision_number=57, router:external=False, shared=False, standard_attr_id=1266, 
status=ACTIVE, subnets=['0d5a9a0b-5034-49a5-ab15-87280961b993', '17bf8ba1-a681-4323-80ea-a68195dff493'], tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:05:23Z, vlan_transparent=None, network_id=8642adde-54ae-4fc2-b997-bf1962c6c7f1, port_security_enabled=True, project_id=a8f8694ac11a4237ad168b64c39ca114, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4'], standard_attr_id=1934, status=DOWN, tags=[], tenant_id=a8f8694ac11a4237ad168b64c39ca114, updated_at=2025-11-28T10:05:25Z on network 8642adde-54ae-4fc2-b997-bf1962c6c7f1#033[00m Nov 28 05:05:27 localhost dnsmasq[322273]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 2 addresses Nov 28 05:05:27 localhost dnsmasq-dhcp[322273]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:05:27 localhost dnsmasq-dhcp[322273]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:05:27 localhost podman[322291]: 2025-11-28 10:05:27.530295126 +0000 UTC m=+0.071034166 container kill 8efd976cbd7b204b0083dc3d5cb1ff48a831e0757f90392943e5415cb0813b9e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:05:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. 
Nov 28 05:05:27 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:27.607 261084 INFO neutron.agent.dhcp.agent [None req-a198de0c-d6b6-4922-9dfc-1a9ad0102467 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', 'eb25319d-f07e-4bef-a7f6-ca024599d184', '073bafeb-50ac-4313-8f33-745fd94b95e2'} is completed#033[00m Nov 28 05:05:27 localhost podman[322306]: 2025-11-28 10:05:27.659809338 +0000 UTC m=+0.100320696 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:05:27 localhost podman[322306]: 2025-11-28 10:05:27.67731263 +0000 UTC m=+0.117823998 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125) Nov 28 05:05:27 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. Nov 28 05:05:27 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:27.821 261084 INFO neutron.agent.dhcp.agent [None req-84d785b6-9e1d-42bc-a4aa-eb6338dd664b - - - - - -] DHCP configuration for ports {'073bafeb-50ac-4313-8f33-745fd94b95e2'} is completed#033[00m Nov 28 05:05:28 localhost dnsmasq[322273]: exiting on receipt of SIGTERM Nov 28 05:05:28 localhost podman[322347]: 2025-11-28 10:05:28.035344329 +0000 UTC m=+0.061217786 container kill 8efd976cbd7b204b0083dc3d5cb1ff48a831e0757f90392943e5415cb0813b9e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 28 05:05:28 localhost systemd[1]: libpod-8efd976cbd7b204b0083dc3d5cb1ff48a831e0757f90392943e5415cb0813b9e.scope: Deactivated successfully. 
Nov 28 05:05:28 localhost podman[322359]: 2025-11-28 10:05:28.118310555 +0000 UTC m=+0.071450647 container died 8efd976cbd7b204b0083dc3d5cb1ff48a831e0757f90392943e5415cb0813b9e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 28 05:05:28 localhost podman[322359]: 2025-11-28 10:05:28.152956139 +0000 UTC m=+0.106096191 container cleanup 8efd976cbd7b204b0083dc3d5cb1ff48a831e0757f90392943e5415cb0813b9e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Nov 28 05:05:28 localhost systemd[1]: libpod-conmon-8efd976cbd7b204b0083dc3d5cb1ff48a831e0757f90392943e5415cb0813b9e.scope: Deactivated successfully. 
Nov 28 05:05:28 localhost podman[322366]: 2025-11-28 10:05:28.196174287 +0000 UTC m=+0.133118225 container remove 8efd976cbd7b204b0083dc3d5cb1ff48a831e0757f90392943e5415cb0813b9e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:05:28 localhost systemd[1]: tmp-crun.TBBEQ8.mount: Deactivated successfully. Nov 28 05:05:28 localhost systemd[1]: var-lib-containers-storage-overlay-46e3f66133a2e22df0c7d7cab94349954b58c472860335b1bc1ff6e5aba2628c-merged.mount: Deactivated successfully. Nov 28 05:05:28 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8efd976cbd7b204b0083dc3d5cb1ff48a831e0757f90392943e5415cb0813b9e-userdata-shm.mount: Deactivated successfully. Nov 28 05:05:28 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:28.615 261084 INFO neutron.agent.dhcp.agent [None req-f04b2642-6f28-4e46-8ac4-5c450b0aadb3 - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', 'eb25319d-f07e-4bef-a7f6-ca024599d184'} is completed#033[00m Nov 28 05:05:29 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 28 05:05:29 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1994004222' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 28 05:05:29 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 28 05:05:29 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1994004222' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 28 05:05:30 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:05:30 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:30.339 261084 INFO neutron.agent.linux.ip_lib [None req-b986e48b-5bb6-415c-96f7-ed26d4af6dae - - - - - -] Device tap59628025-67 cannot be used as it has no MAC address#033[00m Nov 28 05:05:30 localhost nova_compute[279673]: 2025-11-28 10:05:30.409 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:30 localhost kernel: device tap59628025-67 entered promiscuous mode Nov 28 05:05:30 localhost NetworkManager[5967]: [1764324330.4181] manager: (tap59628025-67): new Generic device (/org/freedesktop/NetworkManager/Devices/57) Nov 28 05:05:30 localhost nova_compute[279673]: 2025-11-28 10:05:30.418 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:30 localhost ovn_controller[152322]: 2025-11-28T10:05:30Z|00338|binding|INFO|Claiming lport 59628025-6745-4930-8b9e-1db836e05f1d for this chassis. 
Nov 28 05:05:30 localhost ovn_controller[152322]: 2025-11-28T10:05:30Z|00339|binding|INFO|59628025-6745-4930-8b9e-1db836e05f1d: Claiming unknown Nov 28 05:05:30 localhost systemd-udevd[322412]: Network interface NamePolicy= disabled on kernel command line. Nov 28 05:05:30 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:30.431 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-3008d273-ce4e-482f-9951-930717f7a6f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3008d273-ce4e-482f-9951-930717f7a6f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '45fdbe27569f45449de58f1d1899ceea', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e4f36a67-65f2-4a54-bf93-a2e1db5ebe54, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=59628025-6745-4930-8b9e-1db836e05f1d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:05:30 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:30.434 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 59628025-6745-4930-8b9e-1db836e05f1d in datapath 3008d273-ce4e-482f-9951-930717f7a6f1 bound to our chassis#033[00m Nov 28 05:05:30 localhost ovn_metadata_agent[158125]: 2025-11-28 
10:05:30.435 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 3008d273-ce4e-482f-9951-930717f7a6f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:05:30 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:30.438 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[4939faf7-0af9-403c-a3f3-a15c514ad0db]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:05:30 localhost journal[227875]: ethtool ioctl error on tap59628025-67: No such device Nov 28 05:05:30 localhost ovn_controller[152322]: 2025-11-28T10:05:30Z|00340|binding|INFO|Setting lport 59628025-6745-4930-8b9e-1db836e05f1d ovn-installed in OVS Nov 28 05:05:30 localhost ovn_controller[152322]: 2025-11-28T10:05:30Z|00341|binding|INFO|Setting lport 59628025-6745-4930-8b9e-1db836e05f1d up in Southbound Nov 28 05:05:30 localhost nova_compute[279673]: 2025-11-28 10:05:30.459 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:30 localhost journal[227875]: ethtool ioctl error on tap59628025-67: No such device Nov 28 05:05:30 localhost nova_compute[279673]: 2025-11-28 10:05:30.462 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:30 localhost journal[227875]: ethtool ioctl error on tap59628025-67: No such device Nov 28 05:05:30 localhost journal[227875]: ethtool ioctl error on tap59628025-67: No such device Nov 28 05:05:30 localhost journal[227875]: ethtool ioctl error on tap59628025-67: No such device Nov 28 05:05:30 localhost journal[227875]: ethtool ioctl error on tap59628025-67: No such device Nov 28 05:05:30 localhost journal[227875]: ethtool ioctl error on tap59628025-67: No such 
device Nov 28 05:05:30 localhost journal[227875]: ethtool ioctl error on tap59628025-67: No such device Nov 28 05:05:30 localhost nova_compute[279673]: 2025-11-28 10:05:30.521 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:30 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:30.543 2 INFO neutron.agent.securitygroups_rpc [None req-3bea7d0a-d883-4475-b5cf-e7689fbb3c41 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:05:30 localhost nova_compute[279673]: 2025-11-28 10:05:30.562 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:30 localhost nova_compute[279673]: 2025-11-28 10:05:30.655 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:30 localhost nova_compute[279673]: 2025-11-28 10:05:30.851 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:30 localhost ovn_controller[152322]: 2025-11-28T10:05:30Z|00342|binding|INFO|Removing iface tap59628025-67 ovn-installed in OVS Nov 28 05:05:30 localhost ovn_controller[152322]: 2025-11-28T10:05:30Z|00343|binding|INFO|Removing lport 59628025-6745-4930-8b9e-1db836e05f1d ovn-installed in OVS Nov 28 05:05:30 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:30.926 158130 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port c4634f90-1cfe-436f-9c5d-e163574d13dd with type ""#033[00m Nov 28 05:05:30 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:30.928 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), 
table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-3008d273-ce4e-482f-9951-930717f7a6f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3008d273-ce4e-482f-9951-930717f7a6f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '45fdbe27569f45449de58f1d1899ceea', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e4f36a67-65f2-4a54-bf93-a2e1db5ebe54, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=59628025-6745-4930-8b9e-1db836e05f1d) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:05:30 localhost nova_compute[279673]: 2025-11-28 10:05:30.930 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:30 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:30.932 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 59628025-6745-4930-8b9e-1db836e05f1d in datapath 3008d273-ce4e-482f-9951-930717f7a6f1 unbound from our chassis#033[00m Nov 28 05:05:30 localhost nova_compute[279673]: 2025-11-28 10:05:30.935 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:30 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:30.935 158130 DEBUG 
neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 3008d273-ce4e-482f-9951-930717f7a6f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:05:30 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:30.937 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[0e3a5c6f-18b8-479b-966e-cafffefdae87]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:05:31 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:31.191 2 INFO neutron.agent.securitygroups_rpc [None req-de2b1d93-259b-466a-9f3c-b35e66480eb7 1170e00b193540e48a34e21b33f08b7a a8f8694ac11a4237ad168b64c39ca114 - - default default] Security group member updated ['7f513fd2-7d14-47f7-892b-9f7b5ed2e3c4']#033[00m Nov 28 05:05:31 localhost ovn_controller[152322]: 2025-11-28T10:05:31Z|00344|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:05:31 localhost nova_compute[279673]: 2025-11-28 10:05:31.269 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:31 localhost podman[322499]: Nov 28 05:05:31 localhost podman[322499]: 2025-11-28 10:05:31.302061842 +0000 UTC m=+0.106869303 container create d8cce07217390891e329d7cc83cbf5c2d1400c381bb4068e46fceca2476aa476 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125) Nov 28 05:05:31 
localhost systemd[1]: Started libpod-conmon-d8cce07217390891e329d7cc83cbf5c2d1400c381bb4068e46fceca2476aa476.scope. Nov 28 05:05:31 localhost systemd[1]: tmp-crun.sCCq4W.mount: Deactivated successfully. Nov 28 05:05:31 localhost podman[322499]: 2025-11-28 10:05:31.262829968 +0000 UTC m=+0.067637479 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:05:31 localhost systemd[1]: Started libcrun container. Nov 28 05:05:31 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/954bd3ba3f117d5d300db4abb70c466bc9dc08030da6ad9d8da5e12e09bf209f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:05:31 localhost podman[322499]: 2025-11-28 10:05:31.397909338 +0000 UTC m=+0.202716829 container init d8cce07217390891e329d7cc83cbf5c2d1400c381bb4068e46fceca2476aa476 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3) Nov 28 05:05:31 localhost podman[322499]: 2025-11-28 10:05:31.407294677 +0000 UTC m=+0.212102178 container start d8cce07217390891e329d7cc83cbf5c2d1400c381bb4068e46fceca2476aa476 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, 
tcib_managed=true) Nov 28 05:05:31 localhost dnsmasq[322532]: started, version 2.85 cachesize 150 Nov 28 05:05:31 localhost dnsmasq[322532]: DNS service limited to local subnets Nov 28 05:05:31 localhost dnsmasq[322532]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:05:31 localhost dnsmasq[322532]: warning: no upstream servers configured Nov 28 05:05:31 localhost dnsmasq-dhcp[322532]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 28 05:05:31 localhost dnsmasq[322532]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:05:31 localhost dnsmasq-dhcp[322532]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:05:31 localhost dnsmasq-dhcp[322532]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:05:31 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 28 05:05:31 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3014863473' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 28 05:05:31 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 28 05:05:31 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3014863473' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 28 05:05:31 localhost dnsmasq[322532]: exiting on receipt of SIGTERM Nov 28 05:05:31 localhost systemd[1]: libpod-d8cce07217390891e329d7cc83cbf5c2d1400c381bb4068e46fceca2476aa476.scope: Deactivated successfully. 
Nov 28 05:05:31 localhost podman[322539]: 2025-11-28 10:05:31.520558553 +0000 UTC m=+0.077836452 container died d8cce07217390891e329d7cc83cbf5c2d1400c381bb4068e46fceca2476aa476 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true) Nov 28 05:05:31 localhost podman[322539]: 2025-11-28 10:05:31.552634411 +0000 UTC m=+0.109912310 container cleanup d8cce07217390891e329d7cc83cbf5c2d1400c381bb4068e46fceca2476aa476 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Nov 28 05:05:31 localhost podman[322551]: 2025-11-28 10:05:31.590718033 +0000 UTC m=+0.064814928 container cleanup d8cce07217390891e329d7cc83cbf5c2d1400c381bb4068e46fceca2476aa476 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125) Nov 28 
05:05:31 localhost systemd[1]: libpod-conmon-d8cce07217390891e329d7cc83cbf5c2d1400c381bb4068e46fceca2476aa476.scope: Deactivated successfully. Nov 28 05:05:31 localhost podman[322564]: 2025-11-28 10:05:31.651120453 +0000 UTC m=+0.081172597 container remove d8cce07217390891e329d7cc83cbf5c2d1400c381bb4068e46fceca2476aa476 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 28 05:05:31 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:31.699 261084 INFO neutron.agent.dhcp.agent [None req-9210b8f7-0116-4762-b89d-166ebab3df0d - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', 'eb25319d-f07e-4bef-a7f6-ca024599d184'} is completed#033[00m Nov 28 05:05:31 localhost podman[322584]: Nov 28 05:05:31 localhost podman[322584]: 2025-11-28 10:05:31.757282465 +0000 UTC m=+0.082337220 container create ed8267f6e5877157bd61357301b205935454612ff49680fa0a54a298d3cbc5bb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3008d273-ce4e-482f-9951-930717f7a6f1, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:05:31 localhost systemd[1]: Started libpod-conmon-ed8267f6e5877157bd61357301b205935454612ff49680fa0a54a298d3cbc5bb.scope. 
Nov 28 05:05:31 localhost systemd[1]: Started libcrun container. Nov 28 05:05:31 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5913fdca5af5a7fae83727499d970b246170438bd3f265b9b7404fabb91d105a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:05:31 localhost podman[322584]: 2025-11-28 10:05:31.716004643 +0000 UTC m=+0.041059428 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:05:31 localhost podman[322584]: 2025-11-28 10:05:31.819928651 +0000 UTC m=+0.144983386 container init ed8267f6e5877157bd61357301b205935454612ff49680fa0a54a298d3cbc5bb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3008d273-ce4e-482f-9951-930717f7a6f1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:05:31 localhost podman[322584]: 2025-11-28 10:05:31.829300899 +0000 UTC m=+0.154355634 container start ed8267f6e5877157bd61357301b205935454612ff49680fa0a54a298d3cbc5bb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3008d273-ce4e-482f-9951-930717f7a6f1, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:05:31 localhost dnsmasq[322610]: started, version 2.85 cachesize 150 Nov 28 05:05:31 localhost dnsmasq[322610]: DNS service limited to local subnets Nov 28 05:05:31 localhost 
dnsmasq[322610]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:05:31 localhost dnsmasq[322610]: warning: no upstream servers configured Nov 28 05:05:31 localhost dnsmasq-dhcp[322610]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Nov 28 05:05:31 localhost dnsmasq[322610]: read /var/lib/neutron/dhcp/3008d273-ce4e-482f-9951-930717f7a6f1/addn_hosts - 0 addresses Nov 28 05:05:31 localhost dnsmasq-dhcp[322610]: read /var/lib/neutron/dhcp/3008d273-ce4e-482f-9951-930717f7a6f1/host Nov 28 05:05:31 localhost dnsmasq-dhcp[322610]: read /var/lib/neutron/dhcp/3008d273-ce4e-482f-9951-930717f7a6f1/opts Nov 28 05:05:31 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:31.935 261084 INFO neutron.agent.dhcp.agent [None req-eaaef033-ae68-45f1-8bed-59bae5543439 - - - - - -] DHCP configuration for ports {'576302a1-e256-4653-ae72-a049f1fcfc76'} is completed#033[00m Nov 28 05:05:32 localhost dnsmasq[322610]: exiting on receipt of SIGTERM Nov 28 05:05:32 localhost systemd[1]: libpod-ed8267f6e5877157bd61357301b205935454612ff49680fa0a54a298d3cbc5bb.scope: Deactivated successfully. 
Nov 28 05:05:32 localhost podman[322637]: 2025-11-28 10:05:32.073506216 +0000 UTC m=+0.065307722 container kill ed8267f6e5877157bd61357301b205935454612ff49680fa0a54a298d3cbc5bb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3008d273-ce4e-482f-9951-930717f7a6f1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 28 05:05:32 localhost podman[322652]: 2025-11-28 10:05:32.153124998 +0000 UTC m=+0.061434331 container died ed8267f6e5877157bd61357301b205935454612ff49680fa0a54a298d3cbc5bb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3008d273-ce4e-482f-9951-930717f7a6f1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3) Nov 28 05:05:32 localhost podman[322652]: 2025-11-28 10:05:32.189431739 +0000 UTC m=+0.097741032 container cleanup ed8267f6e5877157bd61357301b205935454612ff49680fa0a54a298d3cbc5bb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3008d273-ce4e-482f-9951-930717f7a6f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 28 
05:05:32 localhost systemd[1]: libpod-conmon-ed8267f6e5877157bd61357301b205935454612ff49680fa0a54a298d3cbc5bb.scope: Deactivated successfully. Nov 28 05:05:32 localhost podman[322655]: 2025-11-28 10:05:32.246848223 +0000 UTC m=+0.139753845 container remove ed8267f6e5877157bd61357301b205935454612ff49680fa0a54a298d3cbc5bb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3008d273-ce4e-482f-9951-930717f7a6f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 28 05:05:32 localhost nova_compute[279673]: 2025-11-28 10:05:32.261 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:32 localhost kernel: device tap59628025-67 left promiscuous mode Nov 28 05:05:32 localhost nova_compute[279673]: 2025-11-28 10:05:32.273 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:32 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:32.304 261084 INFO neutron.agent.dhcp.agent [None req-1e0f9bb0-f757-475c-a870-1380354c0e5b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:05:32 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:32.305 261084 INFO neutron.agent.dhcp.agent [None req-1e0f9bb0-f757-475c-a870-1380354c0e5b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:05:32 localhost systemd[1]: var-lib-containers-storage-overlay-954bd3ba3f117d5d300db4abb70c466bc9dc08030da6ad9d8da5e12e09bf209f-merged.mount: Deactivated successfully. 
Nov 28 05:05:32 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d8cce07217390891e329d7cc83cbf5c2d1400c381bb4068e46fceca2476aa476-userdata-shm.mount: Deactivated successfully. Nov 28 05:05:32 localhost systemd[1]: run-netns-qdhcp\x2d3008d273\x2dce4e\x2d482f\x2d9951\x2d930717f7a6f1.mount: Deactivated successfully. Nov 28 05:05:32 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:32.453 261084 INFO neutron.agent.linux.ip_lib [None req-d9378e2c-0c7c-4a60-9b25-298141c4d17a - - - - - -] Device tapbbebc9e7-db cannot be used as it has no MAC address#033[00m Nov 28 05:05:32 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 28 05:05:32 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/317391599' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 28 05:05:32 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 28 05:05:32 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/317391599' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 28 05:05:32 localhost nova_compute[279673]: 2025-11-28 10:05:32.485 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:32 localhost kernel: device tapbbebc9e7-db entered promiscuous mode Nov 28 05:05:32 localhost NetworkManager[5967]: [1764324332.4940] manager: (tapbbebc9e7-db): new Generic device (/org/freedesktop/NetworkManager/Devices/58) Nov 28 05:05:32 localhost nova_compute[279673]: 2025-11-28 10:05:32.493 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:32 localhost ovn_controller[152322]: 2025-11-28T10:05:32Z|00345|binding|INFO|Claiming lport bbebc9e7-dbe0-47b5-b390-b5bcd4b40cc9 for this chassis. Nov 28 05:05:32 localhost ovn_controller[152322]: 2025-11-28T10:05:32Z|00346|binding|INFO|bbebc9e7-dbe0-47b5-b390-b5bcd4b40cc9: Claiming unknown Nov 28 05:05:32 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:32.506 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.255.242/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-fa28040d-639a-454c-9515-60af86f8624b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fa28040d-639a-454c-9515-60af86f8624b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8499932c523b4e26933fff84403e296e', 
'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dcca6890-1675-46ad-9260-7f267479c535, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=bbebc9e7-dbe0-47b5-b390-b5bcd4b40cc9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:05:32 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:32.511 158130 INFO neutron.agent.ovn.metadata.agent [-] Port bbebc9e7-dbe0-47b5-b390-b5bcd4b40cc9 in datapath fa28040d-639a-454c-9515-60af86f8624b bound to our chassis#033[00m Nov 28 05:05:32 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:32.512 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network fa28040d-639a-454c-9515-60af86f8624b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:05:32 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:32.514 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[fe50ceb5-8c08-4188-9a0b-a5346437d98b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:05:32 localhost ovn_controller[152322]: 2025-11-28T10:05:32Z|00347|binding|INFO|Setting lport bbebc9e7-dbe0-47b5-b390-b5bcd4b40cc9 ovn-installed in OVS Nov 28 05:05:32 localhost ovn_controller[152322]: 2025-11-28T10:05:32Z|00348|binding|INFO|Setting lport bbebc9e7-dbe0-47b5-b390-b5bcd4b40cc9 up in Southbound Nov 28 05:05:32 localhost nova_compute[279673]: 2025-11-28 10:05:32.532 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:32 localhost 
nova_compute[279673]: 2025-11-28 10:05:32.583 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:32 localhost nova_compute[279673]: 2025-11-28 10:05:32.629 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:32 localhost podman[322724]: Nov 28 05:05:32 localhost podman[322724]: 2025-11-28 10:05:32.703427776 +0000 UTC m=+0.104120674 container create f2bb80b42a65e58c8a0974c9018531bd1b0eec105f43d137f8056bf67ae4d4b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Nov 28 05:05:32 localhost podman[322724]: 2025-11-28 10:05:32.655252756 +0000 UTC m=+0.055945694 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:05:32 localhost systemd[1]: Started libpod-conmon-f2bb80b42a65e58c8a0974c9018531bd1b0eec105f43d137f8056bf67ae4d4b8.scope. Nov 28 05:05:32 localhost systemd[1]: Started libcrun container. 
Nov 28 05:05:32 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34f6008d0476cd6ca17b69b0fcbfd3916be3191c61a21acea58861d8aa11abd9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:05:32 localhost podman[322724]: 2025-11-28 10:05:32.793871457 +0000 UTC m=+0.194564345 container init f2bb80b42a65e58c8a0974c9018531bd1b0eec105f43d137f8056bf67ae4d4b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 28 05:05:32 localhost podman[322724]: 2025-11-28 10:05:32.801768394 +0000 UTC m=+0.202461282 container start f2bb80b42a65e58c8a0974c9018531bd1b0eec105f43d137f8056bf67ae4d4b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:05:32 localhost dnsmasq[322750]: started, version 2.85 cachesize 150 Nov 28 05:05:32 localhost dnsmasq[322750]: DNS service limited to local subnets Nov 28 05:05:32 localhost dnsmasq[322750]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:05:32 localhost dnsmasq[322750]: warning: no upstream servers 
configured Nov 28 05:05:32 localhost dnsmasq-dhcp[322750]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Nov 28 05:05:32 localhost dnsmasq-dhcp[322750]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 28 05:05:32 localhost dnsmasq[322750]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/addn_hosts - 0 addresses Nov 28 05:05:32 localhost dnsmasq-dhcp[322750]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/host Nov 28 05:05:32 localhost dnsmasq-dhcp[322750]: read /var/lib/neutron/dhcp/8642adde-54ae-4fc2-b997-bf1962c6c7f1/opts Nov 28 05:05:33 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 28 05:05:33 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1576133045' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 28 05:05:33 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 28 05:05:33 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1576133045' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 28 05:05:33 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:33.062 261084 INFO neutron.agent.dhcp.agent [None req-dce4de1e-8c7e-4fa3-8c8b-9cbb8cb1c36c - - - - - -] DHCP configuration for ports {'ef9eb238-2b1e-49f7-8a0f-72efc8854e0f', 'eb25319d-f07e-4bef-a7f6-ca024599d184'} is completed#033[00m Nov 28 05:05:33 localhost dnsmasq[322750]: exiting on receipt of SIGTERM Nov 28 05:05:33 localhost podman[322782]: 2025-11-28 10:05:33.198572124 +0000 UTC m=+0.072002834 container kill f2bb80b42a65e58c8a0974c9018531bd1b0eec105f43d137f8056bf67ae4d4b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:05:33 localhost systemd[1]: libpod-f2bb80b42a65e58c8a0974c9018531bd1b0eec105f43d137f8056bf67ae4d4b8.scope: Deactivated successfully. 
Nov 28 05:05:33 localhost podman[322797]: 2025-11-28 10:05:33.274366716 +0000 UTC m=+0.060067953 container died f2bb80b42a65e58c8a0974c9018531bd1b0eec105f43d137f8056bf67ae4d4b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:05:33 localhost systemd[1]: var-lib-containers-storage-overlay-34f6008d0476cd6ca17b69b0fcbfd3916be3191c61a21acea58861d8aa11abd9-merged.mount: Deactivated successfully. Nov 28 05:05:33 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f2bb80b42a65e58c8a0974c9018531bd1b0eec105f43d137f8056bf67ae4d4b8-userdata-shm.mount: Deactivated successfully. Nov 28 05:05:33 localhost podman[322797]: 2025-11-28 10:05:33.316798782 +0000 UTC m=+0.102499969 container cleanup f2bb80b42a65e58c8a0974c9018531bd1b0eec105f43d137f8056bf67ae4d4b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Nov 28 05:05:33 localhost systemd[1]: libpod-conmon-f2bb80b42a65e58c8a0974c9018531bd1b0eec105f43d137f8056bf67ae4d4b8.scope: Deactivated successfully. 
Nov 28 05:05:33 localhost podman[322799]: 2025-11-28 10:05:33.366834025 +0000 UTC m=+0.144287325 container remove f2bb80b42a65e58c8a0974c9018531bd1b0eec105f43d137f8056bf67ae4d4b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8642adde-54ae-4fc2-b997-bf1962c6c7f1, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:05:33 localhost nova_compute[279673]: 2025-11-28 10:05:33.420 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:33 localhost kernel: device tapeb25319d-f0 left promiscuous mode Nov 28 05:05:33 localhost ovn_controller[152322]: 2025-11-28T10:05:33Z|00349|binding|INFO|Releasing lport eb25319d-f07e-4bef-a7f6-ca024599d184 from this chassis (sb_readonly=0) Nov 28 05:05:33 localhost ovn_controller[152322]: 2025-11-28T10:05:33Z|00350|binding|INFO|Setting lport eb25319d-f07e-4bef-a7f6-ca024599d184 down in Southbound Nov 28 05:05:33 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:33.431 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:febd:20cb/64 2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:device_owner': 
'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8642adde-54ae-4fc2-b997-bf1962c6c7f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a8f8694ac11a4237ad168b64c39ca114', 'neutron:revision_number': '12', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e1a061c-89bc-4b63-8b60-f49fb95addda, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=eb25319d-f07e-4bef-a7f6-ca024599d184) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:05:33 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:33.433 158130 INFO neutron.agent.ovn.metadata.agent [-] Port eb25319d-f07e-4bef-a7f6-ca024599d184 in datapath 8642adde-54ae-4fc2-b997-bf1962c6c7f1 unbound from our chassis#033[00m Nov 28 05:05:33 localhost nova_compute[279673]: 2025-11-28 10:05:33.440 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:33 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:33.440 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8642adde-54ae-4fc2-b997-bf1962c6c7f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:05:33 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:33.444 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[bc5813df-241a-4625-ac4f-219bdefddc42]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:05:33 localhost podman[322862]: Nov 28 05:05:33 localhost podman[322862]: 2025-11-28 
10:05:33.653146939 +0000 UTC m=+0.129008517 container create d45c3118e0cb447a1f990580ae47ce23aabce3b8035e5e02e901e9a0f3bc1683 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fa28040d-639a-454c-9515-60af86f8624b, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Nov 28 05:05:33 localhost dnsmasq[319577]: read /var/lib/neutron/dhcp/553c7f35-d914-4af1-9846-a8cbe21f53f3/addn_hosts - 0 addresses Nov 28 05:05:33 localhost dnsmasq-dhcp[319577]: read /var/lib/neutron/dhcp/553c7f35-d914-4af1-9846-a8cbe21f53f3/host Nov 28 05:05:33 localhost dnsmasq-dhcp[319577]: read /var/lib/neutron/dhcp/553c7f35-d914-4af1-9846-a8cbe21f53f3/opts Nov 28 05:05:33 localhost podman[322877]: 2025-11-28 10:05:33.653758677 +0000 UTC m=+0.071958543 container kill 2be74bc48555f5ca9fda0c65974602d642dee270cf9293fdde68cc494d9336dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-553c7f35-d914-4af1-9846-a8cbe21f53f3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:05:33 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:33.656 261084 INFO neutron.agent.dhcp.agent [None req-b7f762bf-1f61-4981-b1c0-944291fec871 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:05:33 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:33.656 261084 INFO neutron.agent.dhcp.agent 
[None req-b7f762bf-1f61-4981-b1c0-944291fec871 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:05:33 localhost systemd[1]: run-netns-qdhcp\x2d8642adde\x2d54ae\x2d4fc2\x2db997\x2dbf1962c6c7f1.mount: Deactivated successfully. Nov 28 05:05:33 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:33.657 261084 INFO neutron.agent.dhcp.agent [None req-b7f762bf-1f61-4981-b1c0-944291fec871 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:05:33 localhost systemd[1]: Started libpod-conmon-d45c3118e0cb447a1f990580ae47ce23aabce3b8035e5e02e901e9a0f3bc1683.scope. Nov 28 05:05:33 localhost podman[322862]: 2025-11-28 10:05:33.610375764 +0000 UTC m=+0.086237382 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:05:33 localhost systemd[1]: Started libcrun container. Nov 28 05:05:33 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14c3eefe9cc85a0964db7e342e45ff322ab308fd13a7f19aa1847f356aa5bafb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:05:33 localhost podman[322862]: 2025-11-28 10:05:33.73764333 +0000 UTC m=+0.213504888 container init d45c3118e0cb447a1f990580ae47ce23aabce3b8035e5e02e901e9a0f3bc1683 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fa28040d-639a-454c-9515-60af86f8624b, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:05:33 localhost podman[322862]: 2025-11-28 10:05:33.7561256 +0000 UTC m=+0.231987158 container start d45c3118e0cb447a1f990580ae47ce23aabce3b8035e5e02e901e9a0f3bc1683 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fa28040d-639a-454c-9515-60af86f8624b, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 28 05:05:33 localhost dnsmasq[322901]: started, version 2.85 cachesize 150 Nov 28 05:05:33 localhost dnsmasq[322901]: DNS service limited to local subnets Nov 28 05:05:33 localhost dnsmasq[322901]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:05:33 localhost dnsmasq[322901]: warning: no upstream servers configured Nov 28 05:05:33 localhost dnsmasq-dhcp[322901]: DHCP, static leases only on 10.100.255.240, lease time 1d Nov 28 05:05:33 localhost dnsmasq[322901]: read /var/lib/neutron/dhcp/fa28040d-639a-454c-9515-60af86f8624b/addn_hosts - 0 addresses Nov 28 05:05:33 localhost dnsmasq-dhcp[322901]: read /var/lib/neutron/dhcp/fa28040d-639a-454c-9515-60af86f8624b/host Nov 28 05:05:33 localhost dnsmasq-dhcp[322901]: read /var/lib/neutron/dhcp/fa28040d-639a-454c-9515-60af86f8624b/opts Nov 28 05:05:33 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:33.769 2 INFO neutron.agent.securitygroups_rpc [None req-04197ece-4fb3-43df-90bb-b0e309825a8e cb58c56533984a129050414c9b160b63 de1aeac8abd545fcb83eb3ee06f16689 - - default default] Security group member updated ['805fa77a-da24-42d8-9154-db9402b01c3e']#033[00m Nov 28 05:05:33 localhost ovn_controller[152322]: 2025-11-28T10:05:33Z|00351|binding|INFO|Releasing lport 4929710e-eb4c-4144-9bca-64efc297e299 from this chassis (sb_readonly=0) Nov 28 05:05:33 localhost kernel: device tap4929710e-eb left promiscuous mode Nov 28 
05:05:33 localhost nova_compute[279673]: 2025-11-28 10:05:33.853 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:33 localhost ovn_controller[152322]: 2025-11-28T10:05:33Z|00352|binding|INFO|Setting lport 4929710e-eb4c-4144-9bca-64efc297e299 down in Southbound Nov 28 05:05:33 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:33.864 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-553c7f35-d914-4af1-9846-a8cbe21f53f3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-553c7f35-d914-4af1-9846-a8cbe21f53f3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'aa5be61eafca4d96976422f0e0103210', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=40693dd3-cde5-4c50-9ed5-4dc8ef3313af, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=4929710e-eb4c-4144-9bca-64efc297e299) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:05:33 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:33.866 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 
4929710e-eb4c-4144-9bca-64efc297e299 in datapath 553c7f35-d914-4af1-9846-a8cbe21f53f3 unbound from our chassis#033[00m Nov 28 05:05:33 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:33.869 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 553c7f35-d914-4af1-9846-a8cbe21f53f3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:05:33 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:33.870 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[3ec09597-b513-430f-af27-17ae94054241]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:05:33 localhost nova_compute[279673]: 2025-11-28 10:05:33.874 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:33 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:33.918 261084 INFO neutron.agent.dhcp.agent [None req-2815f33c-0284-42ca-bce0-2081b7192bc7 - - - - - -] DHCP configuration for ports {'e2bd862e-905b-4769-9404-fb8c9861c0f4'} is completed#033[00m Nov 28 05:05:35 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:05:35 localhost nova_compute[279673]: 2025-11-28 10:05:35.659 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:35 localhost nova_compute[279673]: 2025-11-28 10:05:35.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:05:35 localhost nova_compute[279673]: 2025-11-28 10:05:35.771 
279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Nov 28 05:05:35 localhost nova_compute[279673]: 2025-11-28 10:05:35.787 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Nov 28 05:05:35 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:35.877 2 INFO neutron.agent.securitygroups_rpc [None req-9c55fb28-c01b-40d6-8fec-099f9b722777 cb58c56533984a129050414c9b160b63 de1aeac8abd545fcb83eb3ee06f16689 - - default default] Security group member updated ['805fa77a-da24-42d8-9154-db9402b01c3e']#033[00m Nov 28 05:05:35 localhost nova_compute[279673]: 2025-11-28 10:05:35.894 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:35 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:35.907 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:05:36 localhost ovn_controller[152322]: 2025-11-28T10:05:36Z|00353|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:05:36 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e151 do_prune osdmap full prune enabled Nov 28 05:05:36 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e152 e152: 6 total, 6 up, 6 in Nov 28 05:05:36 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e152: 6 total, 6 up, 6 in Nov 28 05:05:36 localhost nova_compute[279673]: 2025-11-28 10:05:36.133 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:37 localhost 
ovn_controller[152322]: 2025-11-28T10:05:37Z|00354|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:05:37 localhost nova_compute[279673]: 2025-11-28 10:05:37.134 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:37 localhost podman[322924]: 2025-11-28 10:05:37.548094335 +0000 UTC m=+0.061486113 container kill 2be74bc48555f5ca9fda0c65974602d642dee270cf9293fdde68cc494d9336dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-553c7f35-d914-4af1-9846-a8cbe21f53f3, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 28 05:05:37 localhost dnsmasq[319577]: exiting on receipt of SIGTERM Nov 28 05:05:37 localhost systemd[1]: libpod-2be74bc48555f5ca9fda0c65974602d642dee270cf9293fdde68cc494d9336dd.scope: Deactivated successfully. 
Nov 28 05:05:37 localhost podman[322938]: 2025-11-28 10:05:37.627202732 +0000 UTC m=+0.060925777 container died 2be74bc48555f5ca9fda0c65974602d642dee270cf9293fdde68cc494d9336dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-553c7f35-d914-4af1-9846-a8cbe21f53f3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:05:37 localhost podman[322938]: 2025-11-28 10:05:37.665647383 +0000 UTC m=+0.099370388 container cleanup 2be74bc48555f5ca9fda0c65974602d642dee270cf9293fdde68cc494d9336dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-553c7f35-d914-4af1-9846-a8cbe21f53f3, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Nov 28 05:05:37 localhost systemd[1]: libpod-conmon-2be74bc48555f5ca9fda0c65974602d642dee270cf9293fdde68cc494d9336dd.scope: Deactivated successfully. 
Nov 28 05:05:37 localhost podman[322939]: 2025-11-28 10:05:37.701269284 +0000 UTC m=+0.128465752 container remove 2be74bc48555f5ca9fda0c65974602d642dee270cf9293fdde68cc494d9336dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-553c7f35-d914-4af1-9846-a8cbe21f53f3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Nov 28 05:05:37 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:37.810 261084 INFO neutron.agent.dhcp.agent [None req-4bb69635-28b9-4340-89b7-b560ae1ec3aa - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:05:37 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:37.810 261084 INFO neutron.agent.dhcp.agent [None req-4bb69635-28b9-4340-89b7-b560ae1ec3aa - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:05:38 localhost systemd[1]: var-lib-containers-storage-overlay-4bcc12744151ca7e94791368e387c8ab451571c56040070ed3ec7c54499fb7f3-merged.mount: Deactivated successfully. Nov 28 05:05:38 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2be74bc48555f5ca9fda0c65974602d642dee270cf9293fdde68cc494d9336dd-userdata-shm.mount: Deactivated successfully. Nov 28 05:05:38 localhost systemd[1]: run-netns-qdhcp\x2d553c7f35\x2dd914\x2d4af1\x2d9846\x2da8cbe21f53f3.mount: Deactivated successfully. Nov 28 05:05:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 05:05:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. 
Nov 28 05:05:38 localhost podman[322967]: 2025-11-28 10:05:38.671258078 +0000 UTC m=+0.097308249 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 05:05:38 localhost podman[322967]: 2025-11-28 10:05:38.68145276 +0000 UTC m=+0.107502911 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 
'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 05:05:38 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 05:05:38 localhost podman[322985]: 2025-11-28 10:05:38.777931054 +0000 UTC m=+0.095161617 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:05:38 localhost podman[322985]: 2025-11-28 10:05:38.791359829 +0000 UTC m=+0.108590382 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Nov 28 05:05:38 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 05:05:40 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:05:40 localhost podman[238687]: time="2025-11-28T10:05:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:05:40 localhost podman[238687]: @ - - [28/Nov/2025:10:05:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157516 "" "Go-http-client/1.1" Nov 28 05:05:40 localhost podman[238687]: @ - - [28/Nov/2025:10:05:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19745 "" "Go-http-client/1.1" Nov 28 05:05:40 localhost nova_compute[279673]: 2025-11-28 10:05:40.678 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:40 localhost nova_compute[279673]: 2025-11-28 10:05:40.787 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:05:40 localhost nova_compute[279673]: 2025-11-28 10:05:40.898 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:41 localhost ovn_controller[152322]: 2025-11-28T10:05:41Z|00355|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:05:41 localhost nova_compute[279673]: 2025-11-28 10:05:41.250 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:41 localhost nova_compute[279673]: 2025-11-28 10:05:41.771 279685 DEBUG 
oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:05:42 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:42.447 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:05:42 localhost nova_compute[279673]: 2025-11-28 10:05:42.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:05:42 localhost nova_compute[279673]: 2025-11-28 10:05:42.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:05:42 localhost nova_compute[279673]: 2025-11-28 10:05:42.772 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 28 05:05:43 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:43.755 2 INFO neutron.agent.securitygroups_rpc [None req-1640c1ae-d5d6-4903-a020-69a5c84dc198 e0f29aacf6a94315b178b4a16e3fd03d 79185418333d4a93b24c87e39a4a1847 - - default default] Security group member updated ['f7d47ffa-9780-427b-aaf2-f0de3a638f8a']#033[00m Nov 28 05:05:44 localhost nova_compute[279673]: 2025-11-28 10:05:44.767 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:05:44 localhost nova_compute[279673]: 2025-11-28 10:05:44.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:05:44 localhost nova_compute[279673]: 2025-11-28 10:05:44.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:05:44 localhost nova_compute[279673]: 2025-11-28 10:05:44.788 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:05:44 localhost nova_compute[279673]: 2025-11-28 10:05:44.789 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:05:44 localhost nova_compute[279673]: 2025-11-28 10:05:44.789 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:05:44 localhost nova_compute[279673]: 2025-11-28 10:05:44.789 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 05:05:44 localhost nova_compute[279673]: 2025-11-28 10:05:44.790 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:05:45 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:05:45 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:05:45 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/1626658574' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:05:45 localhost nova_compute[279673]: 2025-11-28 10:05:45.240 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:05:45 localhost nova_compute[279673]: 2025-11-28 10:05:45.316 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 05:05:45 localhost nova_compute[279673]: 2025-11-28 10:05:45.316 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 05:05:45 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:45.529 2 INFO neutron.agent.securitygroups_rpc [None req-6e211784-e61c-41e0-bb4d-96a8d1625a59 e0f29aacf6a94315b178b4a16e3fd03d 79185418333d4a93b24c87e39a4a1847 - - default default] Security group member updated ['f7d47ffa-9780-427b-aaf2-f0de3a638f8a']#033[00m Nov 28 05:05:45 localhost nova_compute[279673]: 2025-11-28 10:05:45.552 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 05:05:45 localhost nova_compute[279673]: 2025-11-28 10:05:45.554 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11156MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": 
"7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 05:05:45 localhost nova_compute[279673]: 2025-11-28 10:05:45.554 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:05:45 localhost nova_compute[279673]: 2025-11-28 10:05:45.555 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:05:45 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:45.692 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:05:45 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:45.694 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table 
for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 28 05:05:45 localhost nova_compute[279673]: 2025-11-28 10:05:45.720 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:45 localhost nova_compute[279673]: 2025-11-28 10:05:45.807 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 05:05:45 localhost nova_compute[279673]: 2025-11-28 10:05:45.808 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 05:05:45 localhost nova_compute[279673]: 2025-11-28 10:05:45.809 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 05:05:45 localhost nova_compute[279673]: 2025-11-28 10:05:45.874 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing inventories for resource provider 35fead26-0bad-4950-b646-987079d58a17 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Nov 28 05:05:45 localhost nova_compute[279673]: 2025-11-28 10:05:45.899 279685 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:45 localhost nova_compute[279673]: 2025-11-28 10:05:45.956 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Updating ProviderTree inventory for provider 35fead26-0bad-4950-b646-987079d58a17 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Nov 28 05:05:45 localhost nova_compute[279673]: 2025-11-28 10:05:45.956 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Updating inventory in ProviderTree for provider 35fead26-0bad-4950-b646-987079d58a17 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 28 05:05:45 localhost nova_compute[279673]: 2025-11-28 10:05:45.970 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing aggregate associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Nov 28 05:05:46 localhost 
nova_compute[279673]: 2025-11-28 10:05:46.060 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing trait associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, traits: COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_FMA3,HW_CPU_X86_BMI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_ABM,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AMD_SVM,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_CLMUL,HW_CPU_X86_BMI2,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Nov 28 05:05:46 localhost nova_compute[279673]: 2025-11-28 10:05:46.100 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id 
openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:05:46 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:05:46 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2969744802' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:05:46 localhost nova_compute[279673]: 2025-11-28 10:05:46.559 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:05:46 localhost nova_compute[279673]: 2025-11-28 10:05:46.567 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 05:05:46 localhost nova_compute[279673]: 2025-11-28 10:05:46.586 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 05:05:46 localhost nova_compute[279673]: 2025-11-28 10:05:46.589 279685 DEBUG 
nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 05:05:46 localhost nova_compute[279673]: 2025-11-28 10:05:46.589 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.034s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:05:46 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:46.678 2 INFO neutron.agent.securitygroups_rpc [None req-fda551bc-bdc3-478a-93b4-a620175c516e 8d30c732fa674cae8e1de9092f58edd9 3a67d7f32f5e49c3aed3e09278dd6c95 - - default default] Security group member updated ['371ca172-3d0a-4f94-811c-7c823124cef1']#033[00m Nov 28 05:05:47 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:47.118 2 INFO neutron.agent.securitygroups_rpc [None req-e42eed40-5a16-457a-8c1b-352e3dbcff3e e0f29aacf6a94315b178b4a16e3fd03d 79185418333d4a93b24c87e39a4a1847 - - default default] Security group member updated ['f7d47ffa-9780-427b-aaf2-f0de3a638f8a']#033[00m Nov 28 05:05:47 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:47.157 2 INFO neutron.agent.securitygroups_rpc [None req-fda551bc-bdc3-478a-93b4-a620175c516e 8d30c732fa674cae8e1de9092f58edd9 3a67d7f32f5e49c3aed3e09278dd6c95 - - default default] Security group member updated ['371ca172-3d0a-4f94-811c-7c823124cef1']#033[00m Nov 28 05:05:47 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:47.194 261084 INFO neutron.agent.linux.ip_lib [None req-9d118a07-77e2-46d6-a384-cc654cdf0e1b - - - - - -] Device tap87ef7272-14 cannot be used as it has no MAC address#033[00m Nov 28 05:05:47 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e152 do_prune 
osdmap full prune enabled Nov 28 05:05:47 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e153 e153: 6 total, 6 up, 6 in Nov 28 05:05:47 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e153: 6 total, 6 up, 6 in Nov 28 05:05:47 localhost nova_compute[279673]: 2025-11-28 10:05:47.257 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:47 localhost kernel: device tap87ef7272-14 entered promiscuous mode Nov 28 05:05:47 localhost NetworkManager[5967]: [1764324347.2700] manager: (tap87ef7272-14): new Generic device (/org/freedesktop/NetworkManager/Devices/59) Nov 28 05:05:47 localhost nova_compute[279673]: 2025-11-28 10:05:47.274 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:47 localhost systemd-udevd[323063]: Network interface NamePolicy= disabled on kernel command line. Nov 28 05:05:47 localhost ovn_controller[152322]: 2025-11-28T10:05:47Z|00356|binding|INFO|Claiming lport 87ef7272-14f7-4162-a8a9-b13090f8924f for this chassis. 
Nov 28 05:05:47 localhost ovn_controller[152322]: 2025-11-28T10:05:47Z|00357|binding|INFO|87ef7272-14f7-4162-a8a9-b13090f8924f: Claiming unknown Nov 28 05:05:47 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:47.284 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-3f532ea4-a0de-4113-8993-33f982144ec8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f532ea4-a0de-4113-8993-33f982144ec8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b2dec4622fe844c592a13e779612beaa', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4a58c096-9217-4d0b-a64c-715683dae905, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=87ef7272-14f7-4162-a8a9-b13090f8924f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:05:47 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:47.286 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 87ef7272-14f7-4162-a8a9-b13090f8924f in datapath 3f532ea4-a0de-4113-8993-33f982144ec8 bound to our chassis#033[00m Nov 28 05:05:47 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:47.287 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 
3f532ea4-a0de-4113-8993-33f982144ec8 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:05:47 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:47.288 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[dfefbd10-37c5-42e7-860d-84b0f137b2c5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:05:47 localhost journal[227875]: ethtool ioctl error on tap87ef7272-14: No such device Nov 28 05:05:47 localhost journal[227875]: ethtool ioctl error on tap87ef7272-14: No such device Nov 28 05:05:47 localhost journal[227875]: ethtool ioctl error on tap87ef7272-14: No such device Nov 28 05:05:47 localhost ovn_controller[152322]: 2025-11-28T10:05:47Z|00358|binding|INFO|Setting lport 87ef7272-14f7-4162-a8a9-b13090f8924f ovn-installed in OVS Nov 28 05:05:47 localhost ovn_controller[152322]: 2025-11-28T10:05:47Z|00359|binding|INFO|Setting lport 87ef7272-14f7-4162-a8a9-b13090f8924f up in Southbound Nov 28 05:05:47 localhost journal[227875]: ethtool ioctl error on tap87ef7272-14: No such device Nov 28 05:05:47 localhost nova_compute[279673]: 2025-11-28 10:05:47.320 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:47 localhost journal[227875]: ethtool ioctl error on tap87ef7272-14: No such device Nov 28 05:05:47 localhost journal[227875]: ethtool ioctl error on tap87ef7272-14: No such device Nov 28 05:05:47 localhost journal[227875]: ethtool ioctl error on tap87ef7272-14: No such device Nov 28 05:05:47 localhost journal[227875]: ethtool ioctl error on tap87ef7272-14: No such device Nov 28 05:05:47 localhost nova_compute[279673]: 2025-11-28 10:05:47.354 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m 
Nov 28 05:05:47 localhost nova_compute[279673]: 2025-11-28 10:05:47.390 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:47 localhost nova_compute[279673]: 2025-11-28 10:05:47.590 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:05:47 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:47.695 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:05:47 localhost nova_compute[279673]: 2025-11-28 10:05:47.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:05:47 localhost nova_compute[279673]: 2025-11-28 10:05:47.772 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 28 05:05:47 localhost nova_compute[279673]: 2025-11-28 10:05:47.772 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 28 05:05:47 localhost nova_compute[279673]: 2025-11-28 10:05:47.936 279685 DEBUG 
oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 05:05:47 localhost nova_compute[279673]: 2025-11-28 10:05:47.937 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 05:05:47 localhost nova_compute[279673]: 2025-11-28 10:05:47.937 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 28 05:05:47 localhost nova_compute[279673]: 2025-11-28 10:05:47.937 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 05:05:48 localhost openstack_network_exporter[240658]: ERROR 10:05:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 05:05:48 localhost openstack_network_exporter[240658]: ERROR 10:05:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:05:48 localhost openstack_network_exporter[240658]: ERROR 10:05:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:05:48 localhost openstack_network_exporter[240658]: ERROR 10:05:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 05:05:48 localhost openstack_network_exporter[240658]: Nov 
28 05:05:48 localhost openstack_network_exporter[240658]: ERROR 10:05:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 05:05:48 localhost openstack_network_exporter[240658]: Nov 28 05:05:48 localhost podman[323134]: Nov 28 05:05:48 localhost podman[323134]: 2025-11-28 10:05:48.291535747 +0000 UTC m=+0.090948616 container create c2f76ad321100bf3dd865ae468acc4ac2ff5793c21d4c81e01c36259a18458eb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f532ea4-a0de-4113-8993-33f982144ec8, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 28 05:05:48 localhost systemd[1]: Started libpod-conmon-c2f76ad321100bf3dd865ae468acc4ac2ff5793c21d4c81e01c36259a18458eb.scope. Nov 28 05:05:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 05:05:48 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:48.345 2 INFO neutron.agent.securitygroups_rpc [None req-7a28cb1b-8d30-40dd-8ac7-551ea3c1b87c e0f29aacf6a94315b178b4a16e3fd03d 79185418333d4a93b24c87e39a4a1847 - - default default] Security group member updated ['f7d47ffa-9780-427b-aaf2-f0de3a638f8a']#033[00m Nov 28 05:05:48 localhost podman[323134]: 2025-11-28 10:05:48.248251378 +0000 UTC m=+0.047664287 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:05:48 localhost systemd[1]: Started libcrun container. 
Nov 28 05:05:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0a09506b8e900e354d47418961d4d79668634f1edb0c1cb9bb0d739ec90a1c16/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:05:48 localhost podman[323134]: 2025-11-28 10:05:48.366379932 +0000 UTC m=+0.165792801 container init c2f76ad321100bf3dd865ae468acc4ac2ff5793c21d4c81e01c36259a18458eb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f532ea4-a0de-4113-8993-33f982144ec8, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125) Nov 28 05:05:48 localhost podman[323134]: 2025-11-28 10:05:48.379958552 +0000 UTC m=+0.179371411 container start c2f76ad321100bf3dd865ae468acc4ac2ff5793c21d4c81e01c36259a18458eb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f532ea4-a0de-4113-8993-33f982144ec8, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3) Nov 28 05:05:48 localhost dnsmasq[323161]: started, version 2.85 cachesize 150 Nov 28 05:05:48 localhost dnsmasq[323161]: DNS service limited to local subnets Nov 28 05:05:48 localhost dnsmasq[323161]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:05:48 localhost dnsmasq[323161]: warning: no upstream servers 
configured Nov 28 05:05:48 localhost dnsmasq-dhcp[323161]: DHCPv6, static leases only on 2001:db8:0:ffff::, lease time 1d Nov 28 05:05:48 localhost dnsmasq[323161]: read /var/lib/neutron/dhcp/3f532ea4-a0de-4113-8993-33f982144ec8/addn_hosts - 0 addresses Nov 28 05:05:48 localhost dnsmasq-dhcp[323161]: read /var/lib/neutron/dhcp/3f532ea4-a0de-4113-8993-33f982144ec8/host Nov 28 05:05:48 localhost dnsmasq-dhcp[323161]: read /var/lib/neutron/dhcp/3f532ea4-a0de-4113-8993-33f982144ec8/opts Nov 28 05:05:48 localhost podman[323150]: 2025-11-28 10:05:48.450103681 +0000 UTC m=+0.095879668 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.expose-services=, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, architecture=x86_64, release=1755695350, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Nov 28 05:05:48 localhost podman[323150]: 2025-11-28 10:05:48.487726019 +0000 UTC m=+0.133501996 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.buildah.version=1.33.7, vendor=Red Hat, Inc., managed_by=edpm_ansible, container_name=openstack_network_exporter, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc.) Nov 28 05:05:48 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 05:05:48 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:48.666 261084 INFO neutron.agent.dhcp.agent [None req-978e41c3-87f3-432c-afc5-db2f2950207a - - - - - -] DHCP configuration for ports {'a1556cab-1b8b-43ba-b3a0-dfbacf198240'} is completed#033[00m Nov 28 05:05:48 localhost nova_compute[279673]: 2025-11-28 10:05:48.921 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 05:05:48 localhost nova_compute[279673]: 2025-11-28 10:05:48.938 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 05:05:48 localhost nova_compute[279673]: 2025-11-28 10:05:48.938 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 28 05:05:48 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:48.994 2 INFO neutron.agent.securitygroups_rpc [None req-0fb2cb74-ebe1-4595-9145-49dcf60353ff 8d30c732fa674cae8e1de9092f58edd9 3a67d7f32f5e49c3aed3e09278dd6c95 - - default default] Security group member updated ['371ca172-3d0a-4f94-811c-7c823124cef1']#033[00m Nov 28 05:05:49 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e153 do_prune osdmap full prune enabled Nov 28 05:05:49 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e154 e154: 6 total, 6 up, 6 in Nov 28 05:05:49 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e154: 6 total, 6 up, 6 in Nov 28 05:05:49 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:49.395 2 INFO neutron.agent.securitygroups_rpc [None req-a6f5628a-dc81-40c9-a579-68393376dce2 8d30c732fa674cae8e1de9092f58edd9 3a67d7f32f5e49c3aed3e09278dd6c95 - - default default] Security group member updated ['371ca172-3d0a-4f94-811c-7c823124cef1']#033[00m Nov 28 05:05:49 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:49.485 261084 INFO neutron.agent.linux.ip_lib [None req-140c8cb8-d4ff-4fd0-803b-53ad9954a08e - - - - - -] Device tap3be78940-7b cannot be used as it has no MAC address#033[00m Nov 28 05:05:49 localhost nova_compute[279673]: 2025-11-28 10:05:49.514 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:49 localhost kernel: device tap3be78940-7b entered promiscuous mode Nov 28 05:05:49 localhost 
ovn_controller[152322]: 2025-11-28T10:05:49Z|00360|binding|INFO|Claiming lport 3be78940-7b85-4f58-98c9-0b59e055c9b7 for this chassis. Nov 28 05:05:49 localhost ovn_controller[152322]: 2025-11-28T10:05:49Z|00361|binding|INFO|3be78940-7b85-4f58-98c9-0b59e055c9b7: Claiming unknown Nov 28 05:05:49 localhost nova_compute[279673]: 2025-11-28 10:05:49.522 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:49 localhost NetworkManager[5967]: [1764324349.5237] manager: (tap3be78940-7b): new Generic device (/org/freedesktop/NetworkManager/Devices/60) Nov 28 05:05:49 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:49.536 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-7d1af60e-6636-42cc-a949-e5df247a624f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d1af60e-6636-42cc-a949-e5df247a624f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8499932c523b4e26933fff84403e296e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=21e31ec5-87c8-4c59-84a8-d6708f5a124b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3be78940-7b85-4f58-98c9-0b59e055c9b7) old=Port_Binding(chassis=[]) 
matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:05:49 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:49.538 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 3be78940-7b85-4f58-98c9-0b59e055c9b7 in datapath 7d1af60e-6636-42cc-a949-e5df247a624f bound to our chassis#033[00m Nov 28 05:05:49 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:49.540 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7d1af60e-6636-42cc-a949-e5df247a624f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:05:49 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:49.541 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[8d717c0f-5a38-491b-b01f-92ee930894c2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:05:49 localhost ovn_controller[152322]: 2025-11-28T10:05:49Z|00362|binding|INFO|Setting lport 3be78940-7b85-4f58-98c9-0b59e055c9b7 ovn-installed in OVS Nov 28 05:05:49 localhost ovn_controller[152322]: 2025-11-28T10:05:49Z|00363|binding|INFO|Setting lport 3be78940-7b85-4f58-98c9-0b59e055c9b7 up in Southbound Nov 28 05:05:49 localhost nova_compute[279673]: 2025-11-28 10:05:49.567 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:49 localhost nova_compute[279673]: 2025-11-28 10:05:49.610 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:49 localhost nova_compute[279673]: 2025-11-28 10:05:49.638 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:50 localhost ceph-mon[292954]: 
mon.np0005538513@0(leader).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:05:50 localhost podman[323234]: Nov 28 05:05:50 localhost podman[323234]: 2025-11-28 10:05:50.575681988 +0000 UTC m=+0.096597910 container create 6f04465a00f9d9e59ca3689fa115e60b0bb8549ce868c8c4e43fa90e9fc2f46b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d1af60e-6636-42cc-a949-e5df247a624f, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2) Nov 28 05:05:50 localhost podman[323234]: 2025-11-28 10:05:50.526626791 +0000 UTC m=+0.047542703 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:05:50 localhost systemd[1]: Started libpod-conmon-6f04465a00f9d9e59ca3689fa115e60b0bb8549ce868c8c4e43fa90e9fc2f46b.scope. Nov 28 05:05:50 localhost systemd[1]: Started libcrun container. 
Nov 28 05:05:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02a1d6e207201b1173b4f89a5144ddbc385ac151e3437841e7a59d19a4acb788/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:05:50 localhost podman[323234]: 2025-11-28 10:05:50.664462801 +0000 UTC m=+0.185378643 container init 6f04465a00f9d9e59ca3689fa115e60b0bb8549ce868c8c4e43fa90e9fc2f46b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d1af60e-6636-42cc-a949-e5df247a624f, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true) Nov 28 05:05:50 localhost podman[323234]: 2025-11-28 10:05:50.676245989 +0000 UTC m=+0.197161841 container start 6f04465a00f9d9e59ca3689fa115e60b0bb8549ce868c8c4e43fa90e9fc2f46b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d1af60e-6636-42cc-a949-e5df247a624f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125) Nov 28 05:05:50 localhost dnsmasq[323252]: started, version 2.85 cachesize 150 Nov 28 05:05:50 localhost dnsmasq[323252]: DNS service limited to local subnets Nov 28 05:05:50 localhost dnsmasq[323252]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:05:50 localhost dnsmasq[323252]: warning: no upstream servers 
configured Nov 28 05:05:50 localhost dnsmasq-dhcp[323252]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 28 05:05:50 localhost dnsmasq[323252]: read /var/lib/neutron/dhcp/7d1af60e-6636-42cc-a949-e5df247a624f/addn_hosts - 0 addresses Nov 28 05:05:50 localhost dnsmasq-dhcp[323252]: read /var/lib/neutron/dhcp/7d1af60e-6636-42cc-a949-e5df247a624f/host Nov 28 05:05:50 localhost dnsmasq-dhcp[323252]: read /var/lib/neutron/dhcp/7d1af60e-6636-42cc-a949-e5df247a624f/opts Nov 28 05:05:50 localhost nova_compute[279673]: 2025-11-28 10:05:50.767 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:50 localhost nova_compute[279673]: 2025-11-28 10:05:50.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:05:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:50.844 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:05:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:50.844 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:05:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:50.846 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:05:50 localhost nova_compute[279673]: 2025-11-28 10:05:50.903 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:50 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:50.908 261084 INFO neutron.agent.dhcp.agent [None req-abf16529-5afa-4f6f-8b54-d755a04cfef7 - - - - - -] DHCP configuration for ports {'6a3df323-6735-4de7-800b-c26cf8d05b74'} is completed#033[00m Nov 28 05:05:51 localhost nova_compute[279673]: 2025-11-28 10:05:51.792 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:05:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. 
Nov 28 05:05:52 localhost ovn_controller[152322]: 2025-11-28T10:05:52Z|00364|binding|INFO|Releasing lport 3be78940-7b85-4f58-98c9-0b59e055c9b7 from this chassis (sb_readonly=0) Nov 28 05:05:52 localhost ovn_controller[152322]: 2025-11-28T10:05:52Z|00365|binding|INFO|Setting lport 3be78940-7b85-4f58-98c9-0b59e055c9b7 down in Southbound Nov 28 05:05:52 localhost nova_compute[279673]: 2025-11-28 10:05:52.814 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:52 localhost kernel: device tap3be78940-7b left promiscuous mode Nov 28 05:05:52 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:52.824 158130 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 5c0b54aa-cfb1-4d71-9aea-bd9c3487eadc with type ""#033[00m Nov 28 05:05:52 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:52.826 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-7d1af60e-6636-42cc-a949-e5df247a624f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d1af60e-6636-42cc-a949-e5df247a624f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8499932c523b4e26933fff84403e296e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], 
additional_encap=[], encap=[], mirror_rules=[], datapath=21e31ec5-87c8-4c59-84a8-d6708f5a124b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3be78940-7b85-4f58-98c9-0b59e055c9b7) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:05:52 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:52.827 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 3be78940-7b85-4f58-98c9-0b59e055c9b7 in datapath 7d1af60e-6636-42cc-a949-e5df247a624f unbound from our chassis#033[00m Nov 28 05:05:52 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:52.830 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d1af60e-6636-42cc-a949-e5df247a624f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:05:52 localhost systemd[1]: tmp-crun.3XmyZs.mount: Deactivated successfully. Nov 28 05:05:52 localhost nova_compute[279673]: 2025-11-28 10:05:52.836 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:52 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:52.838 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[23d8df19-03ee-4b63-adfd-baa4ec6885fc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:05:52 localhost podman[323253]: 2025-11-28 10:05:52.839810483 +0000 UTC m=+0.068335268 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 
'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 28 05:05:52 localhost podman[323253]: 2025-11-28 10:05:52.850067318 +0000 UTC m=+0.078592113 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 28 05:05:52 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. 
Nov 28 05:05:53 localhost dnsmasq[323252]: read /var/lib/neutron/dhcp/7d1af60e-6636-42cc-a949-e5df247a624f/addn_hosts - 0 addresses Nov 28 05:05:53 localhost dnsmasq-dhcp[323252]: read /var/lib/neutron/dhcp/7d1af60e-6636-42cc-a949-e5df247a624f/host Nov 28 05:05:53 localhost dnsmasq-dhcp[323252]: read /var/lib/neutron/dhcp/7d1af60e-6636-42cc-a949-e5df247a624f/opts Nov 28 05:05:53 localhost podman[323296]: 2025-11-28 10:05:53.365479766 +0000 UTC m=+0.074573788 container kill 6f04465a00f9d9e59ca3689fa115e60b0bb8549ce868c8c4e43fa90e9fc2f46b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d1af60e-6636-42cc-a949-e5df247a624f, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent [None req-382043f7-0b96-4c26-ae87-5eeb4cb6913b - - - - - -] Unable to reload_allocations dhcp for 7d1af60e-6636-42cc-a949-e5df247a624f.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap3be78940-7b not found in namespace qdhcp-7d1af60e-6636-42cc-a949-e5df247a624f. 
Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR 
neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw) Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Nov 28 05:05:53 
localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent return fut.result() Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent return self.__get_result() Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent raise self._exception Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__ Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs, Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 
ERROR neutron.agent.dhcp.agent raise exc_type(*result[2]) Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap3be78940-7b not found in namespace qdhcp-7d1af60e-6636-42cc-a949-e5df247a624f. Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.391 261084 ERROR neutron.agent.dhcp.agent #033[00m Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.394 261084 INFO neutron.agent.dhcp.agent [None req-6b0cd380-e608-4081-8967-4a3f74b64491 - - - - - -] Synchronizing state#033[00m Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.657 261084 INFO neutron.agent.dhcp.agent [None req-3892baa5-d454-46db-a6f6-03a5e0466c91 - - - - - -] All active networks have been fetched through RPC.#033[00m Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.658 261084 INFO neutron.agent.dhcp.agent [-] Starting network 7a711d7c-ff53-41f7-b3e7-fa55f4315988 dhcp configuration#033[00m Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.662 261084 INFO neutron.agent.dhcp.agent [-] Starting network 7d1af60e-6636-42cc-a949-e5df247a624f dhcp configuration#033[00m Nov 28 05:05:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:53.663 261084 INFO neutron.agent.dhcp.agent [-] Finished network 7d1af60e-6636-42cc-a949-e5df247a624f dhcp configuration#033[00m Nov 28 05:05:53 localhost ovn_controller[152322]: 2025-11-28T10:05:53Z|00366|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:05:53 localhost nova_compute[279673]: 2025-11-28 10:05:53.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:05:53 localhost nova_compute[279673]: 2025-11-28 10:05:53.772 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Nov 28 05:05:53 localhost nova_compute[279673]: 2025-11-28 10:05:53.829 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:53 localhost systemd[1]: tmp-crun.iFIAak.mount: Deactivated successfully. Nov 28 05:05:54 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:54.820 261084 INFO neutron.agent.linux.ip_lib [None req-ff7760ed-79c9-4fef-bbfc-d4d324a1e55d - - - - - -] Device tap66bb996f-a9 cannot be used as it has no MAC address#033[00m Nov 28 05:05:54 localhost nova_compute[279673]: 2025-11-28 10:05:54.879 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:54 localhost kernel: device tap66bb996f-a9 entered promiscuous mode Nov 28 05:05:54 localhost NetworkManager[5967]: [1764324354.8864] manager: (tap66bb996f-a9): new Generic device (/org/freedesktop/NetworkManager/Devices/61) Nov 28 05:05:54 localhost ovn_controller[152322]: 2025-11-28T10:05:54Z|00367|binding|INFO|Claiming lport 66bb996f-a921-4ae6-b26d-1be8fa01c3bb for this chassis. 
Nov 28 05:05:54 localhost ovn_controller[152322]: 2025-11-28T10:05:54Z|00368|binding|INFO|66bb996f-a921-4ae6-b26d-1be8fa01c3bb: Claiming unknown Nov 28 05:05:54 localhost nova_compute[279673]: 2025-11-28 10:05:54.889 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:54 localhost systemd-udevd[323319]: Network interface NamePolicy= disabled on kernel command line. Nov 28 05:05:54 localhost journal[227875]: ethtool ioctl error on tap66bb996f-a9: No such device Nov 28 05:05:54 localhost journal[227875]: ethtool ioctl error on tap66bb996f-a9: No such device Nov 28 05:05:54 localhost nova_compute[279673]: 2025-11-28 10:05:54.925 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:54 localhost nova_compute[279673]: 2025-11-28 10:05:54.927 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:54 localhost ovn_controller[152322]: 2025-11-28T10:05:54Z|00369|binding|INFO|Setting lport 66bb996f-a921-4ae6-b26d-1be8fa01c3bb ovn-installed in OVS Nov 28 05:05:54 localhost journal[227875]: ethtool ioctl error on tap66bb996f-a9: No such device Nov 28 05:05:54 localhost journal[227875]: ethtool ioctl error on tap66bb996f-a9: No such device Nov 28 05:05:54 localhost journal[227875]: ethtool ioctl error on tap66bb996f-a9: No such device Nov 28 05:05:54 localhost journal[227875]: ethtool ioctl error on tap66bb996f-a9: No such device Nov 28 05:05:54 localhost journal[227875]: ethtool ioctl error on tap66bb996f-a9: No such device Nov 28 05:05:54 localhost journal[227875]: ethtool ioctl error on tap66bb996f-a9: No such device Nov 28 05:05:54 localhost ovn_controller[152322]: 2025-11-28T10:05:54Z|00370|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis 
(sb_readonly=0) Nov 28 05:05:54 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:54.957 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-7a711d7c-ff53-41f7-b3e7-fa55f4315988', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7a711d7c-ff53-41f7-b3e7-fa55f4315988', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a67d7f32f5e49c3aed3e09278dd6c95', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3a8ed49a-5d05-48e5-b507-755a90a6ebc7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=66bb996f-a921-4ae6-b26d-1be8fa01c3bb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:05:54 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:54.959 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 66bb996f-a921-4ae6-b26d-1be8fa01c3bb in datapath 7a711d7c-ff53-41f7-b3e7-fa55f4315988 bound to our chassis#033[00m Nov 28 05:05:54 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:54.960 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7a711d7c-ff53-41f7-b3e7-fa55f4315988 or it has no MAC or IP addresses configured, tearing the namespace down if needed 
_get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:05:54 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:54.961 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[1125f8f2-855d-4654-a223-aaace0c59bde]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:05:54 localhost ovn_controller[152322]: 2025-11-28T10:05:54Z|00371|binding|INFO|Setting lport 66bb996f-a921-4ae6-b26d-1be8fa01c3bb up in Southbound Nov 28 05:05:54 localhost nova_compute[279673]: 2025-11-28 10:05:54.981 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:55 localhost nova_compute[279673]: 2025-11-28 10:05:55.014 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:55 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:05:55 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e154 do_prune osdmap full prune enabled Nov 28 05:05:55 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e155 e155: 6 total, 6 up, 6 in Nov 28 05:05:55 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e155: 6 total, 6 up, 6 in Nov 28 05:05:55 localhost nova_compute[279673]: 2025-11-28 10:05:55.769 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:55 localhost nova_compute[279673]: 2025-11-28 10:05:55.935 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:55 localhost podman[323390]: Nov 28 05:05:55 localhost podman[323390]: 2025-11-28 10:05:55.956854038 
+0000 UTC m=+0.106853112 container create a555ad669ff628a216c2a0be74a9f22f715f17eea3843cb31f7d4cf37535cf33 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7a711d7c-ff53-41f7-b3e7-fa55f4315988, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:05:55 localhost podman[323390]: 2025-11-28 10:05:55.898501996 +0000 UTC m=+0.048501110 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:05:56 localhost systemd[1]: Started libpod-conmon-a555ad669ff628a216c2a0be74a9f22f715f17eea3843cb31f7d4cf37535cf33.scope. Nov 28 05:05:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 05:05:56 localhost systemd[1]: Started libcrun container. 
Nov 28 05:05:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60013bec6e27ad1923a5b5d469c77907300db4f1db841f191084d98ab2e16893/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:05:56 localhost podman[323390]: 2025-11-28 10:05:56.05150597 +0000 UTC m=+0.201505054 container init a555ad669ff628a216c2a0be74a9f22f715f17eea3843cb31f7d4cf37535cf33 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7a711d7c-ff53-41f7-b3e7-fa55f4315988, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:05:56 localhost podman[323390]: 2025-11-28 10:05:56.061191438 +0000 UTC m=+0.211190532 container start a555ad669ff628a216c2a0be74a9f22f715f17eea3843cb31f7d4cf37535cf33 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7a711d7c-ff53-41f7-b3e7-fa55f4315988, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 28 05:05:56 localhost dnsmasq[323421]: started, version 2.85 cachesize 150 Nov 28 05:05:56 localhost dnsmasq[323421]: DNS service limited to local subnets Nov 28 05:05:56 localhost dnsmasq[323421]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:05:56 localhost dnsmasq[323421]: warning: no upstream servers 
configured Nov 28 05:05:56 localhost dnsmasq-dhcp[323421]: DHCP, static leases only on 10.100.0.16, lease time 1d Nov 28 05:05:56 localhost dnsmasq[323421]: read /var/lib/neutron/dhcp/7a711d7c-ff53-41f7-b3e7-fa55f4315988/addn_hosts - 0 addresses Nov 28 05:05:56 localhost dnsmasq-dhcp[323421]: read /var/lib/neutron/dhcp/7a711d7c-ff53-41f7-b3e7-fa55f4315988/host Nov 28 05:05:56 localhost dnsmasq-dhcp[323421]: read /var/lib/neutron/dhcp/7a711d7c-ff53-41f7-b3e7-fa55f4315988/opts Nov 28 05:05:56 localhost podman[323407]: 2025-11-28 10:05:56.106894227 +0000 UTC m=+0.084119401 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:05:56 localhost podman[323407]: 2025-11-28 10:05:56.114577597 +0000 UTC m=+0.091802881 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125) Nov 28 05:05:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:56.125 261084 INFO neutron.agent.dhcp.agent [None req-08eec445-ad07-4d1d-afd9-ae6effb40357 - - - - - -] Finished network 7a711d7c-ff53-41f7-b3e7-fa55f4315988 dhcp configuration#033[00m Nov 28 05:05:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:56.125 261084 INFO neutron.agent.dhcp.agent [None req-3892baa5-d454-46db-a6f6-03a5e0466c91 - - - - - -] Synchronizing state complete#033[00m Nov 28 05:05:56 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 05:05:56 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:56.580 261084 INFO neutron.agent.dhcp.agent [None req-b7526cda-febb-4dfc-a15f-2e223030fd8b - - - - - -] DHCP configuration for ports {'cbfe47f5-af35-45dd-ae4e-d52ae43817f1'} is completed#033[00m Nov 28 05:05:56 localhost dnsmasq[323252]: exiting on receipt of SIGTERM Nov 28 05:05:56 localhost podman[323442]: 2025-11-28 10:05:56.596131396 +0000 UTC m=+0.060237988 container kill 6f04465a00f9d9e59ca3689fa115e60b0bb8549ce868c8c4e43fa90e9fc2f46b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d1af60e-6636-42cc-a949-e5df247a624f, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0) Nov 28 05:05:56 localhost systemd[1]: libpod-6f04465a00f9d9e59ca3689fa115e60b0bb8549ce868c8c4e43fa90e9fc2f46b.scope: Deactivated successfully. Nov 28 05:05:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. 
Nov 28 05:05:56 localhost podman[323456]: 2025-11-28 10:05:56.68074898 +0000 UTC m=+0.063745127 container died 6f04465a00f9d9e59ca3689fa115e60b0bb8549ce868c8c4e43fa90e9fc2f46b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d1af60e-6636-42cc-a949-e5df247a624f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Nov 28 05:05:56 localhost podman[323456]: 2025-11-28 10:05:56.737315621 +0000 UTC m=+0.120311728 container remove 6f04465a00f9d9e59ca3689fa115e60b0bb8549ce868c8c4e43fa90e9fc2f46b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d1af60e-6636-42cc-a949-e5df247a624f, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2) Nov 28 05:05:56 localhost systemd[1]: libpod-conmon-6f04465a00f9d9e59ca3689fa115e60b0bb8549ce868c8c4e43fa90e9fc2f46b.scope: Deactivated successfully. 
Nov 28 05:05:56 localhost podman[323460]: 2025-11-28 10:05:56.779489529 +0000 UTC m=+0.154235090 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 28 05:05:56 localhost podman[323460]: 2025-11-28 10:05:56.850610388 +0000 UTC m=+0.225355929 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': 
'/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 28 05:05:56 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 05:05:56 localhost dnsmasq[323421]: read /var/lib/neutron/dhcp/7a711d7c-ff53-41f7-b3e7-fa55f4315988/addn_hosts - 0 addresses Nov 28 05:05:56 localhost dnsmasq-dhcp[323421]: read /var/lib/neutron/dhcp/7a711d7c-ff53-41f7-b3e7-fa55f4315988/host Nov 28 05:05:56 localhost dnsmasq-dhcp[323421]: read /var/lib/neutron/dhcp/7a711d7c-ff53-41f7-b3e7-fa55f4315988/opts Nov 28 05:05:56 localhost podman[323517]: 2025-11-28 10:05:56.872647949 +0000 UTC m=+0.045300359 container kill a555ad669ff628a216c2a0be74a9f22f715f17eea3843cb31f7d4cf37535cf33 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7a711d7c-ff53-41f7-b3e7-fa55f4315988, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0) Nov 28 05:05:56 localhost systemd[1]: var-lib-containers-storage-overlay-02a1d6e207201b1173b4f89a5144ddbc385ac151e3437841e7a59d19a4acb788-merged.mount: Deactivated successfully. Nov 28 05:05:56 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6f04465a00f9d9e59ca3689fa115e60b0bb8549ce868c8c4e43fa90e9fc2f46b-userdata-shm.mount: Deactivated successfully. Nov 28 05:05:56 localhost systemd[1]: run-netns-qdhcp\x2d7d1af60e\x2d6636\x2d42cc\x2da949\x2de5df247a624f.mount: Deactivated successfully. Nov 28 05:05:57 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:57.601 261084 INFO neutron.agent.dhcp.agent [None req-a64e4494-7ff3-4570-a9a3-5f3560fa9b29 - - - - - -] DHCP configuration for ports {'cbfe47f5-af35-45dd-ae4e-d52ae43817f1', '66bb996f-a921-4ae6-b26d-1be8fa01c3bb'} is completed#033[00m Nov 28 05:05:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. 
Nov 28 05:05:57 localhost systemd[1]: tmp-crun.yTagFs.mount: Deactivated successfully. Nov 28 05:05:57 localhost podman[323539]: 2025-11-28 10:05:57.88297835 +0000 UTC m=+0.110606530 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0) Nov 28 05:05:57 localhost podman[323539]: 2025-11-28 
10:05:57.894478819 +0000 UTC m=+0.122107019 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:05:57 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 05:05:59 localhost dnsmasq[323421]: exiting on receipt of SIGTERM Nov 28 05:05:59 localhost podman[323576]: 2025-11-28 10:05:59.052127842 +0000 UTC m=+0.067736332 container kill a555ad669ff628a216c2a0be74a9f22f715f17eea3843cb31f7d4cf37535cf33 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7a711d7c-ff53-41f7-b3e7-fa55f4315988, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 28 05:05:59 localhost systemd[1]: libpod-a555ad669ff628a216c2a0be74a9f22f715f17eea3843cb31f7d4cf37535cf33.scope: Deactivated successfully. Nov 28 05:05:59 localhost podman[323588]: 2025-11-28 10:05:59.113670116 +0000 UTC m=+0.051157948 container died a555ad669ff628a216c2a0be74a9f22f715f17eea3843cb31f7d4cf37535cf33 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7a711d7c-ff53-41f7-b3e7-fa55f4315988, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3) Nov 28 05:05:59 localhost systemd[1]: tmp-crun.fLLzXW.mount: Deactivated successfully. 
Nov 28 05:05:59 localhost podman[323588]: 2025-11-28 10:05:59.163323428 +0000 UTC m=+0.100811200 container cleanup a555ad669ff628a216c2a0be74a9f22f715f17eea3843cb31f7d4cf37535cf33 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7a711d7c-ff53-41f7-b3e7-fa55f4315988, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0) Nov 28 05:05:59 localhost systemd[1]: libpod-conmon-a555ad669ff628a216c2a0be74a9f22f715f17eea3843cb31f7d4cf37535cf33.scope: Deactivated successfully. Nov 28 05:05:59 localhost podman[323595]: 2025-11-28 10:05:59.189505979 +0000 UTC m=+0.113949097 container remove a555ad669ff628a216c2a0be74a9f22f715f17eea3843cb31f7d4cf37535cf33 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7a711d7c-ff53-41f7-b3e7-fa55f4315988, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Nov 28 05:05:59 localhost kernel: device tap66bb996f-a9 left promiscuous mode Nov 28 05:05:59 localhost ovn_controller[152322]: 2025-11-28T10:05:59Z|00372|binding|INFO|Releasing lport 66bb996f-a921-4ae6-b26d-1be8fa01c3bb from this chassis (sb_readonly=0) Nov 28 05:05:59 localhost ovn_controller[152322]: 2025-11-28T10:05:59Z|00373|binding|INFO|Setting lport 66bb996f-a921-4ae6-b26d-1be8fa01c3bb down in Southbound Nov 28 05:05:59 localhost nova_compute[279673]: 2025-11-28 10:05:59.248 279685 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:59 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:59.263 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-7a711d7c-ff53-41f7-b3e7-fa55f4315988', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7a711d7c-ff53-41f7-b3e7-fa55f4315988', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a67d7f32f5e49c3aed3e09278dd6c95', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3a8ed49a-5d05-48e5-b507-755a90a6ebc7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=66bb996f-a921-4ae6-b26d-1be8fa01c3bb) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:05:59 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:59.265 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 66bb996f-a921-4ae6-b26d-1be8fa01c3bb in datapath 7a711d7c-ff53-41f7-b3e7-fa55f4315988 unbound from our chassis#033[00m Nov 28 05:05:59 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:59.267 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No 
valid VIF ports were found for network 7a711d7c-ff53-41f7-b3e7-fa55f4315988, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:05:59 localhost ovn_metadata_agent[158125]: 2025-11-28 10:05:59.269 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[acc81291-13c4-4112-9f07-0dedd6bb64e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:05:59 localhost nova_compute[279673]: 2025-11-28 10:05:59.269 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:05:59 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:59.513 261084 INFO neutron.agent.dhcp.agent [None req-fb132a90-9367-460c-abde-fcbdc48eefd9 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:05:59 localhost neutron_sriov_agent[254147]: 2025-11-28 10:05:59.711 2 INFO neutron.agent.securitygroups_rpc [None req-eb7cbd19-2bf0-4a29-9cd5-8cfb0b13af7b 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']#033[00m Nov 28 05:05:59 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 28 05:05:59 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/753870557' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 28 05:05:59 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:05:59.767 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:06:00 localhost systemd[1]: var-lib-containers-storage-overlay-60013bec6e27ad1923a5b5d469c77907300db4f1db841f191084d98ab2e16893-merged.mount: Deactivated successfully. 
Nov 28 05:06:00 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a555ad669ff628a216c2a0be74a9f22f715f17eea3843cb31f7d4cf37535cf33-userdata-shm.mount: Deactivated successfully. Nov 28 05:06:00 localhost systemd[1]: run-netns-qdhcp\x2d7a711d7c\x2dff53\x2d41f7\x2db3e7\x2dfa55f4315988.mount: Deactivated successfully. Nov 28 05:06:00 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:06:00 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:00.253 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:06:00 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e155 do_prune osdmap full prune enabled Nov 28 05:06:00 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e156 e156: 6 total, 6 up, 6 in Nov 28 05:06:00 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e156: 6 total, 6 up, 6 in Nov 28 05:06:00 localhost ovn_controller[152322]: 2025-11-28T10:06:00Z|00374|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.675 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 
'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.676 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 28 05:06:00 localhost nova_compute[279673]: 2025-11-28 10:06:00.737 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.740 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 17020000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'fd845317-f579-404d-96e0-6d703e41e21d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17020000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:06:00.676394', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'd7690280-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.912291188, 'message_signature': '1844171b68b2ae6855ac910be045638da2f2754db42f228ebb574fdebf01bbdc'}]}, 'timestamp': '2025-11-28 10:06:00.741242', '_unique_id': 'f3479e4248ba43f2869d91279067b75e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 
05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.742 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.744 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 28 05:06:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.747 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '879335fb-2cfc-4e52-9033-4339135b3d7c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:06:00.744211', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'd76a0ef0-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.916120989, 
'message_signature': '4fecac80c08f67fb6277b0811d4dadb1217cc5804b80e7f0427d67c353ebed38'}]}, 'timestamp': '2025-11-28 10:06:00.748076', '_unique_id': '66fb038e4b634e939f7af736678db402'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.749 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.750 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 28 05:06:00 localhost nova_compute[279673]: 2025-11-28 10:06:00.771 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.781 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.781 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2a6c8efe-e1f2-40cc-bc24-50ed781294e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:06:00.750642', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd76f2ebc-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.922538142, 'message_signature': '81cc3aca4720f5d3541a2d5da9bfd1e7bdebb1cb7918e026d3299ff2ddc00513'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:06:00.750642', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd76f4078-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.922538142, 'message_signature': '2b6c56439d5e2f3135258ee0072d4a4124468f0e35bcb08b44bdc3910e5e2880'}]}, 'timestamp': '2025-11-28 10:06:00.782121', '_unique_id': '186081e5f5fa490db5b4cc07c4d3cb81'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.783 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.784 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.784 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 1124743927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.785 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 22218411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:06:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4124fe0d-c212-4775-9be8-c4a7329fa4a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1124743927, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:06:00.784585', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd76fb580-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.922538142, 'message_signature': '2c54d53b0decc992e22f523c114c8e0860f32a2664e110d3485c123ac924e057'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22218411, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:06:00.784585', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd76fc778-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.922538142, 'message_signature': '37968bb3d9e549caaf0510195022c1368e06b96c1e6322546bddf3302475a5b6'}]}, 'timestamp': '2025-11-28 10:06:00.785504', '_unique_id': 'a95b5535056240bb961c7cc9a658ebc5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 
ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.786 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.787 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.787 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to 
notifications. Payload={'message_id': 'a3917536-1da7-4cc2-990f-94742f2fcd29', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:06:00.787811', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'd7703500-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.916120989, 'message_signature': '7cd043d6ffce00bdd548c43bcbf66ae1208774d2820273da2b2f8758fefee3f1'}]}, 'timestamp': '2025-11-28 10:06:00.788335', '_unique_id': '9cee908666bc45fd92f70e87f919339e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:06:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.789 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.790 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.790 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '02a17a44-ea63-4357-9fdc-20ec71a48895', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:06:00.790530', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'd7709dd8-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.916120989, 'message_signature': 'fcb4c4ce24fe48d9d23c21a3b2dad1bda7ee8b9db960c2c4a9f79bbcefb957f5'}]}, 'timestamp': '2025-11-28 10:06:00.791048', '_unique_id': 'f3a347efd73c4552ae9a2ab3c27cd371'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:06:00.791 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:06:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.791 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.793 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.793 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.793 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.794 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '34cbd06f-8184-4e5e-868f-b305dd3bbc1b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:06:00.793455', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd77112e0-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.922538142, 'message_signature': 'bb493ea1c13d67268098da1445e732b8253e9ef8e0d83636ea8ad0dbb0fe888e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:06:00.793455', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd77126a4-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.922538142, 'message_signature': 'eb5f79782ff529fda20c477ad9da771f6c950152e3d91cfe47d49aaf320ee386'}]}, 'timestamp': '2025-11-28 10:06:00.794496', '_unique_id': '9f59f84c3eba4fe8a1280b472db9c17b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.795 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.796 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.796 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3667fa21-f424-434b-a928-d9c206dba987', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:06:00.796929', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'd7719bc0-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.916120989, 'message_signature': '9a00c65c12601944aab5e12d471a78931066d9a9054ce4b8e92f7593a4d90776'}]}, 'timestamp': '2025-11-28 10:06:00.797519', '_unique_id': '10b32e8f57b6426aa1b8347047b276da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:06:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.798 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.799 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.799 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.800 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9339392-0219-474a-b3c3-e6a4c4b7ff2a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:06:00.799702', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd77203b2-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.922538142, 'message_signature': '79021b3188ca7d4bffb13b98fca39f2bbd4d0de2a0ea9566fab8e6c2d0f6e8b0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:06:00.799702', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd77215a0-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.922538142, 'message_signature': '10ce026b81590cacab029d55c2933fa0afcc8a115e108aa36ba24deb3722ea5c'}]}, 'timestamp': '2025-11-28 10:06:00.800608', '_unique_id': '506a3e2be93c43aaa87557d1d3061f9e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.801 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.802 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.802 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.803 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'db16ea7b-b393-4168-a16c-d5a2d9b22b4f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:06:00.802819', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd7727ebe-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.922538142, 'message_signature': 'ca09f32a0e63b6ef48143297756aef36c636f2d05b3e40127d3dc02d6c479a0c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:06:00.802819', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd7728f58-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.922538142, 'message_signature': 'f314a57bb2db18c69b5adb2d8f4ce13e7c4e14fbf5bc03622674a45c1bbed950'}]}, 'timestamp': '2025-11-28 10:06:00.803722', '_unique_id': 'fd012c33038242d5bf66d338b09a3e3e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.804 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.805 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.805 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 51.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a27944d1-8f9f-4f7a-81b4-bd49d28ce295', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.671875, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:06:00.805928', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'd772f880-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.912291188, 'message_signature': '1b9b2ccabe1ee71474fbfed0814a4af03caccc3a3a638a8e44ff7932bbc520a2'}]}, 'timestamp': '2025-11-28 10:06:00.806429', '_unique_id': '51aaa6b58a1a4c48871fbc276671d9af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 
05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.807 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.808 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.821 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.821 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '679f0caf-dc09-4356-8178-655023860816', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:06:00.808609', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 
'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd7755008-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.980521094, 'message_signature': '91f9b9bc58a7044ff39de526d589933619f2db5afad0fd47ade18099e8b54590'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:06:00.808609', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd775639a-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.980521094, 'message_signature': '372aad769ccde2111fb794dffb932292136b322662b3b64ebedc0e74dc58cb08'}]}, 'timestamp': '2025-11-28 10:06:00.822265', '_unique_id': 'f29a5b98d4dd49dfb45a7be981ec5433'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:06:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.823 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.824 12 INFO ceilometer.polling.manager [-] Polling pollster 
disk.device.allocation in the context of pollsters Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.824 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.825 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb767f03-c0aa-402f-b296-6646cc6cfb4a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:06:00.824687', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd775d4b0-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.980521094, 'message_signature': '6678bab04d9988a803c86f6dace1dcfd4f3b48cb9bec625a81e79335722fad96'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:06:00.824687', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd775e798-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.980521094, 'message_signature': '2a216f0d2707095498965e1c00c8b5c2d5596b20da5dbd6189e5be37658be4e7'}]}, 'timestamp': '2025-11-28 10:06:00.825646', '_unique_id': '3e8e5f7ace014151b692c42be2e9c0f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:06:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.826 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.828 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.828 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.828 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a1d45ca-54ed-4ea9-b4c3-663d49e8229c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:06:00.828381', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'd7766434-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.916120989, 'message_signature': '642771d5b258c5d8000e553764658be0ea14e0d4102a84ebc22bf87bb87f85d0'}]}, 'timestamp': '2025-11-28 10:06:00.828952', '_unique_id': 'a16cdb7b60ae482dbe17a43b9526f7aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:06:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.829 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.831 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.831 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.831 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c4d6ba9f-2c45-4cc5-b67e-fc1e7702d4cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:06:00.831302', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd776d61c-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.980521094, 'message_signature': '141e6e9a77e9f23b4a77f0436858162518aaa0e13eec7c6c98340611a6d7b2f6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:06:00.831302', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 
'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd776eae4-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.980521094, 'message_signature': '21c6222ea854d17feed872af83386dd4c114557b3f1aff5239c8199a6b7c298b'}]}, 'timestamp': '2025-11-28 10:06:00.832298', '_unique_id': 'a400443d65df406c8e15ef1f012bb4a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:06:00.833 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:06:00.833 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.833 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.834 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.834 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1439344424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.835 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 63315481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b39f1238-cf13-4112-a0e9-fe4eab580e56', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1439344424, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:06:00.834730', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd7775c22-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.922538142, 'message_signature': 'b47d928d11de7813fa671747d80f464be49bc0ff691eafd299166be088bdf9ae'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63315481, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:06:00.834730', 
'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd77772de-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.922538142, 'message_signature': '133cd7f471866d255a6398dfdf6960f12af71f08d0046c3c6f32b64bdff02ecb'}]}, 'timestamp': '2025-11-28 10:06:00.835766', '_unique_id': '0996e0723681437d81ad40e78d60ef16'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 
134, in _send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.836 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.838 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.838 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '246dadc9-cd9d-4bec-bb6e-b498b59141c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:06:00.838577', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'd777f286-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.916120989, 'message_signature': '44d2fc2ec72d85890369e61684cf34e4b58d2d0ede1c10f65ddd3e873d3c2cb3'}]}, 'timestamp': '2025-11-28 10:06:00.839093', '_unique_id': 'f52ab51d20164396a1cb36daad93516c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:06:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.840 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.841 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.841 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '94520f3b-d4e1-443e-9a7a-61e5d4b39dbd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:06:00.841715', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'd7786d9c-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.916120989, 'message_signature': '3ea228459018d04d3883cd2b3d7ed38d3cbdab2dde345b11221bda67c96c3c33'}]}, 'timestamp': '2025-11-28 10:06:00.842249', '_unique_id': 'cff874e6bffc4be9824f4c50760e0ea3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 
05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.842 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.843 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.844 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.844 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '84fa6121-79d8-44da-a042-bd1ee67ae122', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:06:00.844109', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'd778c6fc-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.916120989, 'message_signature': 'e074c583572d1b447682ccc3dfa7afa5b0e218e855e70577e97aa103bc57fd50'}]}, 'timestamp': '2025-11-28 10:06:00.844410', '_unique_id': '71ac992c4d7a422e87498b35f95f4403'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging Nov 28 05:06:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.845 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba20dd06-eaf6-427d-b8c5-0db2475cc954', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:06:00.845891', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'd7790f04-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.916120989, 'message_signature': 'a2a416539b94a62b0c9519f1bd6decec056f666f60a5ac0f054e87695987aa1e'}]}, 'timestamp': '2025-11-28 10:06:00.846257', '_unique_id': 'a5e4255df49d4dd8a8599a5129bf263f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.846 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.847 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.847 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8fc0bdf-fc7b-4f98-a66a-c5ddd9b14ce2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:06:00.847636', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'd779536a-cc41-11f0-a370-fa163eb02593', 'monotonic_time': 12194.916120989, 'message_signature': 'c2de92a97b06ef51055fca4425cea2c31f720475b6fdb5453814c128803565c2'}]}, 'timestamp': '2025-11-28 10:06:00.848006', '_unique_id': '6116905a2f814d6093545617bda6776e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:06:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:06:00.848 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:06:00 localhost nova_compute[279673]: 2025-11-28 10:06:00.936 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:06:01 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e156 do_prune osdmap full prune enabled
Nov 28 05:06:01 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e157 e157: 6 total, 6 up, 6 in
Nov 28 05:06:01 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e157: 6 total, 6 up, 6 in
Nov 28 05:06:02 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e157 do_prune osdmap full prune enabled
Nov 28 05:06:02 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e158 e158: 6 total, 6 up, 6 in
Nov 28 05:06:02 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e158: 6 total, 6 up, 6 in
Nov 28 05:06:02 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:02.857 2 INFO neutron.agent.securitygroups_rpc [None req-e67102ba-fe9d-44cc-8c94-469bc136f71c 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 05:06:03 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:03.015 2 INFO neutron.agent.securitygroups_rpc [None req-e67102ba-fe9d-44cc-8c94-469bc136f71c 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 05:06:03 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e158 do_prune osdmap full prune enabled
Nov 28 05:06:03 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e159 e159: 6 total, 6 up, 6 in
Nov 28 05:06:03 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e159: 6 total, 6 up, 6 in
Nov 28 05:06:03 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:03.622 2 INFO neutron.agent.securitygroups_rpc [None req-06fb4107-5237-41cb-aedf-7ec837b7d0c6 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 05:06:04 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:04.288 2 INFO neutron.agent.securitygroups_rpc [None req-f2f37d61-bf24-49e1-9d4d-20b98269f559 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 05:06:05 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 05:06:05 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e159 do_prune osdmap full prune enabled
Nov 28 05:06:05 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e160 e160: 6 total, 6 up, 6 in
Nov 28 05:06:05 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e160: 6 total, 6 up, 6 in
Nov 28 05:06:05 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:05.147 261084 INFO neutron.agent.linux.ip_lib [None req-50215d9c-a361-42fb-8188-be02379e7e7d - - - - - -] Device tap6b9af304-88 cannot be used as it has no MAC address
Nov 28 05:06:05 localhost nova_compute[279673]: 2025-11-28 10:06:05.175 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:06:05 localhost kernel: device tap6b9af304-88 entered promiscuous mode
Nov 28 05:06:05 localhost nova_compute[279673]: 2025-11-28 10:06:05.185 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:06:05 localhost ovn_controller[152322]: 2025-11-28T10:06:05Z|00375|binding|INFO|Claiming lport 6b9af304-88e0-4384-8907-09d0654dd558 for this chassis.
Nov 28 05:06:05 localhost NetworkManager[5967]: [1764324365.1861] manager: (tap6b9af304-88): new Generic device (/org/freedesktop/NetworkManager/Devices/62)
Nov 28 05:06:05 localhost ovn_controller[152322]: 2025-11-28T10:06:05Z|00376|binding|INFO|6b9af304-88e0-4384-8907-09d0654dd558: Claiming unknown
Nov 28 05:06:05 localhost systemd-udevd[323626]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 05:06:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:05.199 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-7aa47953-45b5-4e9e-a0ad-6ce1121b65d3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7aa47953-45b5-4e9e-a0ad-6ce1121b65d3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8499932c523b4e26933fff84403e296e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2c47060-f3f3-4904-844c-d551d6391359, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=6b9af304-88e0-4384-8907-09d0654dd558) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 05:06:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:05.200 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 6b9af304-88e0-4384-8907-09d0654dd558 in datapath 7aa47953-45b5-4e9e-a0ad-6ce1121b65d3 bound to our chassis
Nov 28 05:06:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:05.202 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7aa47953-45b5-4e9e-a0ad-6ce1121b65d3 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 05:06:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:05.203 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[63718553-f4b8-440b-baf0-fc1e9f6155bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 05:06:05 localhost ovn_controller[152322]: 2025-11-28T10:06:05Z|00377|binding|INFO|Setting lport 6b9af304-88e0-4384-8907-09d0654dd558 ovn-installed in OVS
Nov 28 05:06:05 localhost ovn_controller[152322]: 2025-11-28T10:06:05Z|00378|binding|INFO|Setting lport 6b9af304-88e0-4384-8907-09d0654dd558 up in Southbound
Nov 28 05:06:05 localhost nova_compute[279673]: 2025-11-28 10:06:05.233 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:06:05 localhost nova_compute[279673]: 2025-11-28 10:06:05.237 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:06:05 localhost nova_compute[279673]: 2025-11-28 10:06:05.290 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:06:05 localhost nova_compute[279673]: 2025-11-28 10:06:05.376 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:06:05 localhost nova_compute[279673]: 2025-11-28 10:06:05.774 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:06:05 localhost nova_compute[279673]: 2025-11-28 10:06:05.939 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:06:06 localhost podman[323681]:
Nov 28 05:06:06 localhost podman[323681]: 2025-11-28 10:06:06.306576 +0000 UTC m=+0.092891902 container create e1b8c98f5cd735ffc0cdccca2a12012cef919d6f5c249aa963f6344ad96606c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7aa47953-45b5-4e9e-a0ad-6ce1121b65d3, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 05:06:06 localhost systemd[1]: Started libpod-conmon-e1b8c98f5cd735ffc0cdccca2a12012cef919d6f5c249aa963f6344ad96606c6.scope.
Nov 28 05:06:06 localhost podman[323681]: 2025-11-28 10:06:06.262814947 +0000 UTC m=+0.049130889 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 05:06:06 localhost systemd[1]: Started libcrun container.
Nov 28 05:06:06 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7af7ba26fc5497d5193fe60b439f0ab812c8b19ba53610d2ee58bf5d35d015e0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:06:06 localhost podman[323681]: 2025-11-28 10:06:06.378908393 +0000 UTC m=+0.165224305 container init e1b8c98f5cd735ffc0cdccca2a12012cef919d6f5c249aa963f6344ad96606c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7aa47953-45b5-4e9e-a0ad-6ce1121b65d3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 28 05:06:06 localhost podman[323681]: 2025-11-28 10:06:06.389390904 +0000 UTC m=+0.175706806 container start e1b8c98f5cd735ffc0cdccca2a12012cef919d6f5c249aa963f6344ad96606c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7aa47953-45b5-4e9e-a0ad-6ce1121b65d3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:06:06 localhost dnsmasq[323699]: started, version 2.85 cachesize 150 Nov 28 05:06:06 localhost dnsmasq[323699]: DNS service limited to local subnets Nov 28 05:06:06 localhost dnsmasq[323699]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:06:06 localhost dnsmasq[323699]: warning: no upstream servers 
configured Nov 28 05:06:06 localhost dnsmasq-dhcp[323699]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 28 05:06:06 localhost dnsmasq[323699]: read /var/lib/neutron/dhcp/7aa47953-45b5-4e9e-a0ad-6ce1121b65d3/addn_hosts - 0 addresses Nov 28 05:06:06 localhost dnsmasq-dhcp[323699]: read /var/lib/neutron/dhcp/7aa47953-45b5-4e9e-a0ad-6ce1121b65d3/host Nov 28 05:06:06 localhost dnsmasq-dhcp[323699]: read /var/lib/neutron/dhcp/7aa47953-45b5-4e9e-a0ad-6ce1121b65d3/opts Nov 28 05:06:06 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:06.531 261084 INFO neutron.agent.dhcp.agent [None req-25f77893-cfe2-4169-b8c6-84d3c462c28e - - - - - -] DHCP configuration for ports {'b362d99b-cd7e-4279-a644-ce03e6e8ad3e'} is completed#033[00m Nov 28 05:06:07 localhost ovn_controller[152322]: 2025-11-28T10:06:07Z|00379|binding|INFO|Removing iface tap6b9af304-88 ovn-installed in OVS Nov 28 05:06:07 localhost ovn_controller[152322]: 2025-11-28T10:06:07Z|00380|binding|INFO|Removing lport 6b9af304-88e0-4384-8907-09d0654dd558 ovn-installed in OVS Nov 28 05:06:07 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:07.591 158130 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 8be6fc43-4d37-4bea-ad0e-c8c9afe7f73d with type ""#033[00m Nov 28 05:06:07 localhost nova_compute[279673]: 2025-11-28 10:06:07.592 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:07 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:07.593 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], 
external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-7aa47953-45b5-4e9e-a0ad-6ce1121b65d3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7aa47953-45b5-4e9e-a0ad-6ce1121b65d3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8499932c523b4e26933fff84403e296e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2c47060-f3f3-4904-844c-d551d6391359, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=6b9af304-88e0-4384-8907-09d0654dd558) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:06:07 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:07.596 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 6b9af304-88e0-4384-8907-09d0654dd558 in datapath 7aa47953-45b5-4e9e-a0ad-6ce1121b65d3 unbound from our chassis#033[00m Nov 28 05:06:07 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:07.600 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7aa47953-45b5-4e9e-a0ad-6ce1121b65d3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:06:07 localhost nova_compute[279673]: 2025-11-28 10:06:07.601 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:07 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:07.601 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[4378f71d-7c4c-4010-891e-045354d9f477]: (4, False) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:06:07 localhost nova_compute[279673]: 2025-11-28 10:06:07.605 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:07 localhost kernel: device tap6b9af304-88 left promiscuous mode Nov 28 05:06:07 localhost nova_compute[279673]: 2025-11-28 10:06:07.625 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:08 localhost dnsmasq[323699]: read /var/lib/neutron/dhcp/7aa47953-45b5-4e9e-a0ad-6ce1121b65d3/addn_hosts - 0 addresses Nov 28 05:06:08 localhost dnsmasq-dhcp[323699]: read /var/lib/neutron/dhcp/7aa47953-45b5-4e9e-a0ad-6ce1121b65d3/host Nov 28 05:06:08 localhost dnsmasq-dhcp[323699]: read /var/lib/neutron/dhcp/7aa47953-45b5-4e9e-a0ad-6ce1121b65d3/opts Nov 28 05:06:08 localhost podman[323719]: 2025-11-28 10:06:08.314114615 +0000 UTC m=+0.060852805 container kill e1b8c98f5cd735ffc0cdccca2a12012cef919d6f5c249aa963f6344ad96606c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7aa47953-45b5-4e9e-a0ad-6ce1121b65d3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent [None req-d45746ab-e98a-471b-b415-935bfb9a74da - - - - - -] Unable to reload_allocations dhcp for 7aa47953-45b5-4e9e-a0ad-6ce1121b65d3.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap6b9af304-88 not found in namespace 
qdhcp-7aa47953-45b5-4e9e-a0ad-6ce1121b65d3. Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Nov 28 05:06:08 localhost 
neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw) Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR 
neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent return fut.result() Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent return self.__get_result() Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent raise self._exception Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__ Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs, Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call Nov 28 
05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent raise exc_type(*result[2]) Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap6b9af304-88 not found in namespace qdhcp-7aa47953-45b5-4e9e-a0ad-6ce1121b65d3. Nov 28 05:06:08 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:08.337 261084 ERROR neutron.agent.dhcp.agent #033[00m Nov 28 05:06:08 localhost ovn_controller[152322]: 2025-11-28T10:06:08Z|00381|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:06:08 localhost nova_compute[279673]: 2025-11-28 10:06:08.643 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. 
Nov 28 05:06:08 localhost podman[323732]: 2025-11-28 10:06:08.853283764 +0000 UTC m=+0.084858582 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 05:06:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. 
Nov 28 05:06:08 localhost podman[323732]: 2025-11-28 10:06:08.871651521 +0000 UTC m=+0.103226319 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 05:06:08 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 05:06:08 localhost podman[323755]: 2025-11-28 10:06:08.976421923 +0000 UTC m=+0.099046449 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0) Nov 28 05:06:08 localhost podman[323755]: 2025-11-28 10:06:08.991498145 +0000 UTC m=+0.114122711 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 28 05:06:09 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 05:06:09 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e160 do_prune osdmap full prune enabled Nov 28 05:06:09 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e161 e161: 6 total, 6 up, 6 in Nov 28 05:06:09 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e161: 6 total, 6 up, 6 in Nov 28 05:06:09 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:09.390 2 INFO neutron.agent.securitygroups_rpc [None req-88638c5c-0267-4c55-b76f-e8bedf4b6799 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']#033[00m Nov 28 05:06:09 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:09.411 261084 INFO neutron.agent.linux.ip_lib [None req-df36a7d6-c202-4395-8eec-27213b0f90c7 - - - - - -] Device tap9f28414a-bc cannot be used as it has no MAC address#033[00m Nov 28 05:06:09 localhost nova_compute[279673]: 2025-11-28 10:06:09.446 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:09 localhost kernel: device tap9f28414a-bc entered promiscuous mode Nov 28 05:06:09 localhost nova_compute[279673]: 2025-11-28 10:06:09.456 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:09 localhost NetworkManager[5967]: [1764324369.4565] manager: (tap9f28414a-bc): new Generic device (/org/freedesktop/NetworkManager/Devices/63) Nov 28 05:06:09 localhost ovn_controller[152322]: 2025-11-28T10:06:09Z|00382|binding|INFO|Claiming lport 9f28414a-bc83-4ccc-ac84-3585c39e468a for this chassis. Nov 28 05:06:09 localhost ovn_controller[152322]: 2025-11-28T10:06:09Z|00383|binding|INFO|9f28414a-bc83-4ccc-ac84-3585c39e468a: Claiming unknown Nov 28 05:06:09 localhost systemd-udevd[323784]: Network interface NamePolicy= disabled on kernel command line. 
Nov 28 05:06:09 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:09.468 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-a31c6261-6aec-4e5b-8552-7f0b3ff5946f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a31c6261-6aec-4e5b-8552-7f0b3ff5946f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=05816226-956c-45ae-8b67-7c74d141697e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9f28414a-bc83-4ccc-ac84-3585c39e468a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:06:09 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:09.471 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 9f28414a-bc83-4ccc-ac84-3585c39e468a in datapath a31c6261-6aec-4e5b-8552-7f0b3ff5946f bound to our chassis#033[00m Nov 28 05:06:09 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:09.475 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port f3273cf3-8b00-4e72-8ef6-318774bfd7b2 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 28 05:06:09 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:09.475 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a31c6261-6aec-4e5b-8552-7f0b3ff5946f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:06:09 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:09.477 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[a62d72fe-26bf-4ffc-8943-c6a1b24cc38e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:06:09 localhost journal[227875]: ethtool ioctl error on tap9f28414a-bc: No such device Nov 28 05:06:09 localhost ovn_controller[152322]: 2025-11-28T10:06:09Z|00384|binding|INFO|Setting lport 9f28414a-bc83-4ccc-ac84-3585c39e468a ovn-installed in OVS Nov 28 05:06:09 localhost ovn_controller[152322]: 2025-11-28T10:06:09Z|00385|binding|INFO|Setting lport 9f28414a-bc83-4ccc-ac84-3585c39e468a up in Southbound Nov 28 05:06:09 localhost nova_compute[279673]: 2025-11-28 10:06:09.510 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:09 localhost journal[227875]: ethtool ioctl error on tap9f28414a-bc: No such device Nov 28 05:06:09 localhost journal[227875]: ethtool ioctl error on tap9f28414a-bc: No such device Nov 28 05:06:09 localhost journal[227875]: ethtool ioctl error on tap9f28414a-bc: No such device Nov 28 05:06:09 localhost journal[227875]: ethtool ioctl error on tap9f28414a-bc: No such device Nov 28 05:06:09 localhost journal[227875]: ethtool ioctl error on tap9f28414a-bc: No such device Nov 28 05:06:09 localhost journal[227875]: ethtool ioctl error on tap9f28414a-bc: No such device Nov 28 05:06:09 localhost journal[227875]: ethtool ioctl error on tap9f28414a-bc: No such device Nov 
28 05:06:09 localhost nova_compute[279673]: 2025-11-28 10:06:09.564 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:09 localhost nova_compute[279673]: 2025-11-28 10:06:09.601 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:10 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:10.073 2 INFO neutron.agent.securitygroups_rpc [None req-58c4368b-992c-4171-8380-96e11b260575 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']#033[00m Nov 28 05:06:10 localhost podman[238687]: time="2025-11-28T10:06:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:06:10 localhost podman[238687]: @ - - [28/Nov/2025:10:06:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 161163 "" "Go-http-client/1.1" Nov 28 05:06:10 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:06:10 localhost podman[238687]: @ - - [28/Nov/2025:10:06:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20695 "" "Go-http-client/1.1" Nov 28 05:06:10 localhost podman[323855]: Nov 28 05:06:10 localhost podman[323855]: 2025-11-28 10:06:10.560642181 +0000 UTC m=+0.095779582 container create 97dc091afbfb3d896ac4141ccfc99fc0fb34966e3631b0de0ec2bba64af16354 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a31c6261-6aec-4e5b-8552-7f0b3ff5946f, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, 
org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0) Nov 28 05:06:10 localhost podman[323855]: 2025-11-28 10:06:10.514083526 +0000 UTC m=+0.049220947 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:06:10 localhost systemd[1]: Started libpod-conmon-97dc091afbfb3d896ac4141ccfc99fc0fb34966e3631b0de0ec2bba64af16354.scope. Nov 28 05:06:10 localhost systemd[1]: Started libcrun container. Nov 28 05:06:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05b7276827f072b306555cf6cbc24a30c5bb95b18f1dc24627a8bd098febac87/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:06:10 localhost podman[323855]: 2025-11-28 10:06:10.676462404 +0000 UTC m=+0.211599795 container init 97dc091afbfb3d896ac4141ccfc99fc0fb34966e3631b0de0ec2bba64af16354 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a31c6261-6aec-4e5b-8552-7f0b3ff5946f, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125) Nov 28 05:06:10 localhost podman[323855]: 2025-11-28 10:06:10.686255838 +0000 UTC m=+0.221393229 container start 97dc091afbfb3d896ac4141ccfc99fc0fb34966e3631b0de0ec2bba64af16354 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a31c6261-6aec-4e5b-8552-7f0b3ff5946f, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, 
org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:06:10 localhost dnsmasq[323873]: started, version 2.85 cachesize 150 Nov 28 05:06:10 localhost dnsmasq[323873]: DNS service limited to local subnets Nov 28 05:06:10 localhost dnsmasq[323873]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:06:10 localhost dnsmasq[323873]: warning: no upstream servers configured Nov 28 05:06:10 localhost dnsmasq-dhcp[323873]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 28 05:06:10 localhost dnsmasq[323873]: read /var/lib/neutron/dhcp/a31c6261-6aec-4e5b-8552-7f0b3ff5946f/addn_hosts - 0 addresses Nov 28 05:06:10 localhost dnsmasq-dhcp[323873]: read /var/lib/neutron/dhcp/a31c6261-6aec-4e5b-8552-7f0b3ff5946f/host Nov 28 05:06:10 localhost dnsmasq-dhcp[323873]: read /var/lib/neutron/dhcp/a31c6261-6aec-4e5b-8552-7f0b3ff5946f/opts Nov 28 05:06:10 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:10.753 261084 INFO neutron.agent.dhcp.agent [None req-3892baa5-d454-46db-a6f6-03a5e0466c91 - - - - - -] Synchronizing state#033[00m Nov 28 05:06:10 localhost nova_compute[279673]: 2025-11-28 10:06:10.778 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:10 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:10.811 261084 INFO neutron.agent.dhcp.agent [None req-8e0eeb55-b1dd-4683-a456-f2bbe435ed76 - - - - - -] DHCP configuration for ports {'806f1b66-c281-4ff2-a156-2f6b3a5062cd'} is completed#033[00m Nov 28 05:06:10 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:10.908 2 INFO neutron.agent.securitygroups_rpc [None req-dbbfa92a-6f48-4219-96e5-450182918d3d e32848e36ae94f66ae634ff4d7716d6f 
8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']#033[00m Nov 28 05:06:10 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:10.932 261084 INFO neutron.agent.dhcp.agent [None req-f0ac88e7-c8b1-41b1-80b8-08b0837b50f0 - - - - - -] All active networks have been fetched through RPC.#033[00m Nov 28 05:06:10 localhost nova_compute[279673]: 2025-11-28 10:06:10.982 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:10 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:10.997 2 INFO neutron.agent.securitygroups_rpc [None req-dbbfa92a-6f48-4219-96e5-450182918d3d e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']#033[00m Nov 28 05:06:11 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e161 do_prune osdmap full prune enabled Nov 28 05:06:11 localhost dnsmasq[323699]: exiting on receipt of SIGTERM Nov 28 05:06:11 localhost podman[323889]: 2025-11-28 10:06:11.126642211 +0000 UTC m=+0.067399372 container kill e1b8c98f5cd735ffc0cdccca2a12012cef919d6f5c249aa963f6344ad96606c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7aa47953-45b5-4e9e-a0ad-6ce1121b65d3, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 28 05:06:11 localhost systemd[1]: libpod-e1b8c98f5cd735ffc0cdccca2a12012cef919d6f5c249aa963f6344ad96606c6.scope: Deactivated successfully. 
Nov 28 05:06:11 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e162 e162: 6 total, 6 up, 6 in Nov 28 05:06:11 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e162: 6 total, 6 up, 6 in Nov 28 05:06:11 localhost podman[323901]: 2025-11-28 10:06:11.21495442 +0000 UTC m=+0.072165909 container died e1b8c98f5cd735ffc0cdccca2a12012cef919d6f5c249aa963f6344ad96606c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7aa47953-45b5-4e9e-a0ad-6ce1121b65d3, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true) Nov 28 05:06:11 localhost podman[323901]: 2025-11-28 10:06:11.25458968 +0000 UTC m=+0.111801129 container cleanup e1b8c98f5cd735ffc0cdccca2a12012cef919d6f5c249aa963f6344ad96606c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7aa47953-45b5-4e9e-a0ad-6ce1121b65d3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Nov 28 05:06:11 localhost systemd[1]: libpod-conmon-e1b8c98f5cd735ffc0cdccca2a12012cef919d6f5c249aa963f6344ad96606c6.scope: Deactivated successfully. 
Nov 28 05:06:11 localhost podman[323903]: 2025-11-28 10:06:11.30488493 +0000 UTC m=+0.152446780 container remove e1b8c98f5cd735ffc0cdccca2a12012cef919d6f5c249aa963f6344ad96606c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7aa47953-45b5-4e9e-a0ad-6ce1121b65d3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 28 05:06:11 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:11.316 2 INFO neutron.agent.securitygroups_rpc [None req-a2df05ee-0a35-442d-aa18-eb9de4dda01c e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']#033[00m Nov 28 05:06:11 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:11.338 261084 INFO neutron.agent.dhcp.agent [-] Starting network 4dc5b71e-287e-4ec6-b6b7-4d131e85d551 dhcp configuration#033[00m Nov 28 05:06:11 localhost systemd[1]: var-lib-containers-storage-overlay-7af7ba26fc5497d5193fe60b439f0ab812c8b19ba53610d2ee58bf5d35d015e0-merged.mount: Deactivated successfully. Nov 28 05:06:11 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e1b8c98f5cd735ffc0cdccca2a12012cef919d6f5c249aa963f6344ad96606c6-userdata-shm.mount: Deactivated successfully. Nov 28 05:06:11 localhost systemd[1]: run-netns-qdhcp\x2d7aa47953\x2d45b5\x2d4e9e\x2da0ad\x2d6ce1121b65d3.mount: Deactivated successfully. 
Nov 28 05:06:11 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:11.779 2 INFO neutron.agent.securitygroups_rpc [None req-cbfd0cad-e26e-4d58-9458-36c5ead2083c 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']#033[00m Nov 28 05:06:11 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:11.840 2 INFO neutron.agent.securitygroups_rpc [None req-841a5df4-1f75-4611-8939-235283ca6a97 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']#033[00m Nov 28 05:06:11 localhost nova_compute[279673]: 2025-11-28 10:06:11.853 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:12 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:12.122 261084 INFO neutron.agent.linux.ip_lib [None req-146628cc-f53a-48e0-97bd-a4c2ecadc8fb - - - - - -] Device tap1bff36cc-f5 cannot be used as it has no MAC address#033[00m Nov 28 05:06:12 localhost nova_compute[279673]: 2025-11-28 10:06:12.191 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:12 localhost kernel: device tap1bff36cc-f5 entered promiscuous mode Nov 28 05:06:12 localhost NetworkManager[5967]: [1764324372.2012] manager: (tap1bff36cc-f5): new Generic device (/org/freedesktop/NetworkManager/Devices/64) Nov 28 05:06:12 localhost systemd-udevd[323786]: Network interface NamePolicy= disabled on kernel command line. Nov 28 05:06:12 localhost ovn_controller[152322]: 2025-11-28T10:06:12Z|00386|binding|INFO|Claiming lport 1bff36cc-f508-4066-a5d7-c55bc5baf4a9 for this chassis. 
Nov 28 05:06:12 localhost ovn_controller[152322]: 2025-11-28T10:06:12Z|00387|binding|INFO|1bff36cc-f508-4066-a5d7-c55bc5baf4a9: Claiming unknown Nov 28 05:06:12 localhost nova_compute[279673]: 2025-11-28 10:06:12.203 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:12 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:12.215 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-4dc5b71e-287e-4ec6-b6b7-4d131e85d551', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4dc5b71e-287e-4ec6-b6b7-4d131e85d551', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '29d0d5b3ba0745d58aee3845ea704b73', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c0295c06-e7c1-42d0-9d25-c6c6ebd15e16, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1bff36cc-f508-4066-a5d7-c55bc5baf4a9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:06:12 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:12.217 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 1bff36cc-f508-4066-a5d7-c55bc5baf4a9 in datapath 
4dc5b71e-287e-4ec6-b6b7-4d131e85d551 bound to our chassis#033[00m Nov 28 05:06:12 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:12.219 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4dc5b71e-287e-4ec6-b6b7-4d131e85d551 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:06:12 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:12.220 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[5a97589f-1ab5-4f9c-b8b0-207ced6eb030]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:06:12 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e162 do_prune osdmap full prune enabled Nov 28 05:06:12 localhost ovn_controller[152322]: 2025-11-28T10:06:12Z|00388|binding|INFO|Setting lport 1bff36cc-f508-4066-a5d7-c55bc5baf4a9 ovn-installed in OVS Nov 28 05:06:12 localhost ovn_controller[152322]: 2025-11-28T10:06:12Z|00389|binding|INFO|Setting lport 1bff36cc-f508-4066-a5d7-c55bc5baf4a9 up in Southbound Nov 28 05:06:12 localhost nova_compute[279673]: 2025-11-28 10:06:12.240 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:12 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e163 e163: 6 total, 6 up, 6 in Nov 28 05:06:12 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e163: 6 total, 6 up, 6 in Nov 28 05:06:12 localhost nova_compute[279673]: 2025-11-28 10:06:12.299 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:12 localhost nova_compute[279673]: 2025-11-28 10:06:12.346 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m 
Nov 28 05:06:13 localhost podman[323995]: Nov 28 05:06:13 localhost podman[323995]: 2025-11-28 10:06:13.315766668 +0000 UTC m=+0.087240427 container create a69ad56203ce2d58010d1a3d9560651f5f89bafa2fcdaede5f1718e219c2c9ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 28 05:06:13 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 28 05:06:13 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1677773907' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 28 05:06:13 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 28 05:06:13 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1677773907' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 28 05:06:13 localhost systemd[1]: Started libpod-conmon-a69ad56203ce2d58010d1a3d9560651f5f89bafa2fcdaede5f1718e219c2c9ba.scope. Nov 28 05:06:13 localhost systemd[1]: Started libcrun container. 
Nov 28 05:06:13 localhost podman[323995]: 2025-11-28 10:06:13.272913139 +0000 UTC m=+0.044386908 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:06:13 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fbbc0b5dc4455006807612af40ea366df63f7068109c1199eb3107e33ae30da1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:06:13 localhost podman[323995]: 2025-11-28 10:06:13.383008714 +0000 UTC m=+0.154482483 container init a69ad56203ce2d58010d1a3d9560651f5f89bafa2fcdaede5f1718e219c2c9ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:06:13 localhost podman[323995]: 2025-11-28 10:06:13.394034986 +0000 UTC m=+0.165508705 container start a69ad56203ce2d58010d1a3d9560651f5f89bafa2fcdaede5f1718e219c2c9ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:06:13 localhost dnsmasq[324012]: started, version 2.85 cachesize 150 Nov 28 05:06:13 localhost dnsmasq[324012]: DNS service limited to local subnets Nov 28 05:06:13 localhost dnsmasq[324012]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:06:13 localhost dnsmasq[324012]: warning: no upstream servers configured Nov 28 05:06:13 localhost dnsmasq-dhcp[324012]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 28 05:06:13 localhost dnsmasq[324012]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/addn_hosts - 0 addresses Nov 28 05:06:13 localhost dnsmasq-dhcp[324012]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/host Nov 28 05:06:13 localhost dnsmasq-dhcp[324012]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/opts Nov 28 05:06:13 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:13.468 261084 INFO neutron.agent.dhcp.agent [None req-259ccea0-5b7c-48f2-bda3-af2d63ecdc14 - - - - - -] Finished network 4dc5b71e-287e-4ec6-b6b7-4d131e85d551 dhcp configuration#033[00m Nov 28 05:06:13 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:13.470 261084 INFO neutron.agent.dhcp.agent [None req-a54afd89-63c5-4b53-bc59-976538151a8c - - - - - -] Synchronizing state complete#033[00m Nov 28 05:06:13 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:13.473 261084 INFO neutron.agent.dhcp.agent [None req-df8bf941-a81b-43f3-a5d2-ac851bb9d287 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:06:09Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=8971d2b3-e3e8-4058-a520-863fde2aaa63, ip_allocation=immediate, mac_address=fa:16:3e:ea:c0:b6, name=tempest-PortsTestJSON-538865126, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:06:05Z, description=, dns_domain=, id=a31c6261-6aec-4e5b-8552-7f0b3ff5946f, ipv4_address_scope=None, 
ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-2085773191, port_security_enabled=True, project_id=5e7a07c97c664076bc825e05137c574c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9902, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2204, status=ACTIVE, subnets=['3b5c7335-5ab3-4264-870c-37328703e1d1'], tags=[], tenant_id=5e7a07c97c664076bc825e05137c574c, updated_at=2025-11-28T10:06:07Z, vlan_transparent=None, network_id=a31c6261-6aec-4e5b-8552-7f0b3ff5946f, port_security_enabled=True, project_id=5e7a07c97c664076bc825e05137c574c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b'], standard_attr_id=2247, status=DOWN, tags=[], tenant_id=5e7a07c97c664076bc825e05137c574c, updated_at=2025-11-28T10:06:09Z on network a31c6261-6aec-4e5b-8552-7f0b3ff5946f#033[00m Nov 28 05:06:13 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:13.701 261084 INFO neutron.agent.dhcp.agent [None req-61afb93c-70a2-4d65-8044-2ca1b5bda457 - - - - - -] DHCP configuration for ports {'2f0cd637-5351-4b69-86b0-00f4d51eccc9'} is completed#033[00m Nov 28 05:06:13 localhost dnsmasq[323873]: read /var/lib/neutron/dhcp/a31c6261-6aec-4e5b-8552-7f0b3ff5946f/addn_hosts - 1 addresses Nov 28 05:06:13 localhost dnsmasq-dhcp[323873]: read /var/lib/neutron/dhcp/a31c6261-6aec-4e5b-8552-7f0b3ff5946f/host Nov 28 05:06:13 localhost dnsmasq-dhcp[323873]: read /var/lib/neutron/dhcp/a31c6261-6aec-4e5b-8552-7f0b3ff5946f/opts Nov 28 05:06:13 localhost podman[324030]: 2025-11-28 10:06:13.733186408 +0000 UTC m=+0.070660133 container kill 97dc091afbfb3d896ac4141ccfc99fc0fb34966e3631b0de0ec2bba64af16354 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a31c6261-6aec-4e5b-8552-7f0b3ff5946f, org.label-schema.vendor=CentOS, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true) Nov 28 05:06:13 localhost dnsmasq[324012]: exiting on receipt of SIGTERM Nov 28 05:06:13 localhost systemd[1]: libpod-a69ad56203ce2d58010d1a3d9560651f5f89bafa2fcdaede5f1718e219c2c9ba.scope: Deactivated successfully. Nov 28 05:06:13 localhost podman[324064]: 2025-11-28 10:06:13.911130129 +0000 UTC m=+0.068351142 container kill a69ad56203ce2d58010d1a3d9560651f5f89bafa2fcdaede5f1718e219c2c9ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:06:13 localhost podman[324081]: 2025-11-28 10:06:13.98366493 +0000 UTC m=+0.058324871 container died a69ad56203ce2d58010d1a3d9560651f5f89bafa2fcdaede5f1718e219c2c9ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3) Nov 28 05:06:14 localhost podman[324081]: 2025-11-28 10:06:14.021345418 +0000 UTC m=+0.096005320 container cleanup 
a69ad56203ce2d58010d1a3d9560651f5f89bafa2fcdaede5f1718e219c2c9ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2) Nov 28 05:06:14 localhost systemd[1]: libpod-conmon-a69ad56203ce2d58010d1a3d9560651f5f89bafa2fcdaede5f1718e219c2c9ba.scope: Deactivated successfully. Nov 28 05:06:14 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:14.033 261084 INFO neutron.agent.dhcp.agent [None req-b8fb8199-99b4-4b6d-bf42-b356b5d98cf2 - - - - - -] DHCP configuration for ports {'8971d2b3-e3e8-4058-a520-863fde2aaa63'} is completed#033[00m Nov 28 05:06:14 localhost podman[324083]: 2025-11-28 10:06:14.074425885 +0000 UTC m=+0.137980271 container remove a69ad56203ce2d58010d1a3d9560651f5f89bafa2fcdaede5f1718e219c2c9ba (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:06:14 localhost ovn_controller[152322]: 2025-11-28T10:06:14Z|00390|binding|INFO|Removing iface tap9f28414a-bc ovn-installed in OVS Nov 28 05:06:14 localhost ovn_controller[152322]: 2025-11-28T10:06:14Z|00391|binding|INFO|Removing lport 9f28414a-bc83-4ccc-ac84-3585c39e468a ovn-installed in OVS Nov 28 05:06:14 localhost ovn_metadata_agent[158125]: 2025-11-28 
10:06:14.187 158130 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port f3273cf3-8b00-4e72-8ef6-318774bfd7b2 with type ""#033[00m Nov 28 05:06:14 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:14.189 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-a31c6261-6aec-4e5b-8552-7f0b3ff5946f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a31c6261-6aec-4e5b-8552-7f0b3ff5946f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=05816226-956c-45ae-8b67-7c74d141697e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9f28414a-bc83-4ccc-ac84-3585c39e468a) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:06:14 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:14.190 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 9f28414a-bc83-4ccc-ac84-3585c39e468a in datapath a31c6261-6aec-4e5b-8552-7f0b3ff5946f unbound from our chassis#033[00m Nov 28 05:06:14 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:14.194 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid 
VIF ports were found for network a31c6261-6aec-4e5b-8552-7f0b3ff5946f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:06:14 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:14.195 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[1037481d-251e-4d9c-bcd6-4b4c2bfcff7b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:06:14 localhost nova_compute[279673]: 2025-11-28 10:06:14.243 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:14 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e163 do_prune osdmap full prune enabled Nov 28 05:06:14 localhost dnsmasq[323873]: exiting on receipt of SIGTERM Nov 28 05:06:14 localhost podman[324131]: 2025-11-28 10:06:14.280125237 +0000 UTC m=+0.113843943 container kill 97dc091afbfb3d896ac4141ccfc99fc0fb34966e3631b0de0ec2bba64af16354 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a31c6261-6aec-4e5b-8552-7f0b3ff5946f, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 28 05:06:14 localhost systemd[1]: libpod-97dc091afbfb3d896ac4141ccfc99fc0fb34966e3631b0de0ec2bba64af16354.scope: Deactivated successfully. 
Nov 28 05:06:14 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e164 e164: 6 total, 6 up, 6 in Nov 28 05:06:14 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e164: 6 total, 6 up, 6 in Nov 28 05:06:14 localhost systemd[1]: var-lib-containers-storage-overlay-fbbc0b5dc4455006807612af40ea366df63f7068109c1199eb3107e33ae30da1-merged.mount: Deactivated successfully. Nov 28 05:06:14 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a69ad56203ce2d58010d1a3d9560651f5f89bafa2fcdaede5f1718e219c2c9ba-userdata-shm.mount: Deactivated successfully. Nov 28 05:06:14 localhost podman[324153]: 2025-11-28 10:06:14.373484154 +0000 UTC m=+0.064872594 container died 97dc091afbfb3d896ac4141ccfc99fc0fb34966e3631b0de0ec2bba64af16354 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a31c6261-6aec-4e5b-8552-7f0b3ff5946f, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:06:14 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-97dc091afbfb3d896ac4141ccfc99fc0fb34966e3631b0de0ec2bba64af16354-userdata-shm.mount: Deactivated successfully. Nov 28 05:06:14 localhost systemd[1]: var-lib-containers-storage-overlay-05b7276827f072b306555cf6cbc24a30c5bb95b18f1dc24627a8bd098febac87-merged.mount: Deactivated successfully. 
Nov 28 05:06:14 localhost podman[324153]: 2025-11-28 10:06:14.413768233 +0000 UTC m=+0.105156643 container remove 97dc091afbfb3d896ac4141ccfc99fc0fb34966e3631b0de0ec2bba64af16354 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a31c6261-6aec-4e5b-8552-7f0b3ff5946f, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Nov 28 05:06:14 localhost nova_compute[279673]: 2025-11-28 10:06:14.432 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:14 localhost kernel: device tap9f28414a-bc left promiscuous mode Nov 28 05:06:14 localhost nova_compute[279673]: 2025-11-28 10:06:14.447 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:14 localhost systemd[1]: libpod-conmon-97dc091afbfb3d896ac4141ccfc99fc0fb34966e3631b0de0ec2bba64af16354.scope: Deactivated successfully. Nov 28 05:06:14 localhost systemd[1]: run-netns-qdhcp\x2da31c6261\x2d6aec\x2d4e5b\x2d8552\x2d7f0b3ff5946f.mount: Deactivated successfully. 
Nov 28 05:06:14 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:14.493 261084 INFO neutron.agent.dhcp.agent [None req-8fb76a28-58bf-4f2e-8c38-5de0ca321bfa - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Nov 28 05:06:14 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:14.494 261084 INFO neutron.agent.dhcp.agent [None req-8fb76a28-58bf-4f2e-8c38-5de0ca321bfa - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Nov 28 05:06:14 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:14.835 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Nov 28 05:06:15 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:15.011 2 INFO neutron.agent.securitygroups_rpc [None req-e90d3b0d-d250-425f-9007-eccb585011b0 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']#033[00m
Nov 28 05:06:15 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e164 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 05:06:15 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e164 do_prune osdmap full prune enabled
Nov 28 05:06:15 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e165 e165: 6 total, 6 up, 6 in
Nov 28 05:06:15 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e165: 6 total, 6 up, 6 in
Nov 28 05:06:15 localhost podman[324216]:
Nov 28 05:06:15 localhost podman[324216]: 2025-11-28 10:06:15.161263424 +0000 UTC m=+0.107355151 container create c34b0864521ca6e0485c3e746c68d7ef4b44888e5e0551f73bd31f34fa84e952 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 28 05:06:15 localhost systemd[1]: Started libpod-conmon-c34b0864521ca6e0485c3e746c68d7ef4b44888e5e0551f73bd31f34fa84e952.scope.
Nov 28 05:06:15 localhost podman[324216]: 2025-11-28 10:06:15.106138474 +0000 UTC m=+0.052230241 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 05:06:15 localhost systemd[1]: Started libcrun container.
Nov 28 05:06:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/518cfa22c4e318530a6e5aefab2c4e20bcb90abb837fcbd18b8f75b2b31294f3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 05:06:15 localhost podman[324216]: 2025-11-28 10:06:15.230509553 +0000 UTC m=+0.176601290 container init c34b0864521ca6e0485c3e746c68d7ef4b44888e5e0551f73bd31f34fa84e952 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 28 05:06:15 localhost podman[324216]: 2025-11-28 10:06:15.240614916 +0000 UTC m=+0.186706633 container start c34b0864521ca6e0485c3e746c68d7ef4b44888e5e0551f73bd31f34fa84e952 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 05:06:15 localhost dnsmasq[324234]: started, version 2.85 cachesize 150
Nov 28 05:06:15 localhost dnsmasq[324234]: DNS service limited to local subnets
Nov 28 05:06:15 localhost dnsmasq[324234]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 05:06:15 localhost dnsmasq[324234]: warning: no upstream servers configured
Nov 28 05:06:15 localhost dnsmasq-dhcp[324234]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 05:06:15 localhost dnsmasq[324234]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/addn_hosts - 0 addresses
Nov 28 05:06:15 localhost dnsmasq-dhcp[324234]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/host
Nov 28 05:06:15 localhost dnsmasq-dhcp[324234]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/opts
Nov 28 05:06:15 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:15.558 261084 INFO neutron.agent.dhcp.agent [None req-d655e655-3f16-4703-a66f-e36878eb1591 - - - - - -] DHCP configuration for ports {'1bff36cc-f508-4066-a5d7-c55bc5baf4a9', '2f0cd637-5351-4b69-86b0-00f4d51eccc9'} is completed#033[00m
Nov 28 05:06:15 localhost dnsmasq[324234]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/addn_hosts - 0 addresses
Nov 28 05:06:15 localhost dnsmasq-dhcp[324234]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/host
Nov 28 05:06:15 localhost dnsmasq-dhcp[324234]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/opts
Nov 28 05:06:15 localhost podman[324252]: 2025-11-28 10:06:15.741853556 +0000 UTC m=+0.064315496 container kill c34b0864521ca6e0485c3e746c68d7ef4b44888e5e0551f73bd31f34fa84e952 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 05:06:15 localhost nova_compute[279673]: 2025-11-28 10:06:15.802 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:06:15 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:15.899 261084 INFO neutron.agent.dhcp.agent [None req-a54afd89-63c5-4b53-bc59-976538151a8c - - - - - -] Synchronizing state#033[00m
Nov 28 05:06:15 localhost ovn_controller[152322]: 2025-11-28T10:06:15Z|00392|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 05:06:15 localhost nova_compute[279673]: 2025-11-28 10:06:15.946 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:06:15 localhost nova_compute[279673]: 2025-11-28 10:06:15.985 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:06:16 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:16.022 2 INFO neutron.agent.securitygroups_rpc [None req-3771b2dc-83c7-4322-8cb3-68ef5f3840bb e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']#033[00m
Nov 28 05:06:16 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:16.158 261084 INFO neutron.agent.dhcp.agent [None req-6a72d357-0d95-4adc-8d0a-cc258cb0edd9 - - - - - -] DHCP configuration for ports {'1bff36cc-f508-4066-a5d7-c55bc5baf4a9', '2f0cd637-5351-4b69-86b0-00f4d51eccc9'} is completed#033[00m
Nov 28 05:06:16 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:16.188 261084 INFO neutron.agent.dhcp.agent [None req-48712969-2fd3-4809-8736-106860b7dd0e - - - - - -] All active networks have been fetched through RPC.#033[00m
Nov 28 05:06:16 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:16.189 261084 INFO neutron.agent.dhcp.agent [-] Starting network 66c5dde3-dd95-4799-b7fb-daebf3806263 dhcp configuration#033[00m
Nov 28 05:06:16 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:16.189 261084 INFO neutron.agent.dhcp.agent [-] Finished network 66c5dde3-dd95-4799-b7fb-daebf3806263 dhcp configuration#033[00m
Nov 28 05:06:16 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:16.190 261084 INFO neutron.agent.dhcp.agent [None req-48712969-2fd3-4809-8736-106860b7dd0e - - - - - -] Synchronizing state complete#033[00m
Nov 28 05:06:16 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:16.191 261084 INFO neutron.agent.dhcp.agent [None req-d2bc7583-b8ae-435a-8225-f6a351893a7d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Nov 28 05:06:16 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:16.191 261084 INFO neutron.agent.dhcp.agent [None req-d2bc7583-b8ae-435a-8225-f6a351893a7d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Nov 28 05:06:17 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 28 05:06:17 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3947693664' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Nov 28 05:06:18 localhost openstack_network_exporter[240658]: ERROR 10:06:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 05:06:18 localhost openstack_network_exporter[240658]: ERROR 10:06:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 05:06:18 localhost openstack_network_exporter[240658]: ERROR 10:06:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 05:06:18 localhost openstack_network_exporter[240658]: ERROR 10:06:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 05:06:18 localhost openstack_network_exporter[240658]:
Nov 28 05:06:18 localhost openstack_network_exporter[240658]: ERROR 10:06:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 05:06:18 localhost openstack_network_exporter[240658]:
Nov 28 05:06:18 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e165 do_prune osdmap full prune enabled
Nov 28 05:06:18 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e166 e166: 6 total, 6 up, 6 in
Nov 28 05:06:18 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e166: 6 total, 6 up, 6 in
Nov 28 05:06:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 05:06:18 localhost nova_compute[279673]: 2025-11-28 10:06:18.857 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:06:18 localhost podman[324274]: 2025-11-28 10:06:18.901893956 +0000 UTC m=+0.125643700 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9-minimal, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, architecture=x86_64)
Nov 28 05:06:18 localhost podman[324274]: 2025-11-28 10:06:18.91881249 +0000 UTC m=+0.142562224 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, managed_by=edpm_ansible, config_id=edpm, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-type=git, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vendor=Red Hat, Inc.)
Nov 28 05:06:18 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 05:06:19 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e166 do_prune osdmap full prune enabled
Nov 28 05:06:19 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e167 e167: 6 total, 6 up, 6 in
Nov 28 05:06:19 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e167: 6 total, 6 up, 6 in
Nov 28 05:06:20 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e167 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 28 05:06:20 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e167 do_prune osdmap full prune enabled
Nov 28 05:06:20 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e168 e168: 6 total, 6 up, 6 in
Nov 28 05:06:20 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e168: 6 total, 6 up, 6 in
Nov 28 05:06:20 localhost nova_compute[279673]: 2025-11-28 10:06:20.181 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:06:20 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:20.471 261084 INFO neutron.agent.linux.ip_lib [None req-7734d0e4-46d2-4c3f-a26c-3a43df1cc1ff - - - - - -] Device tapdc62470e-a4 cannot be used as it has no MAC address#033[00m
Nov 28 05:06:20 localhost nova_compute[279673]: 2025-11-28 10:06:20.498 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:06:20 localhost kernel: device tapdc62470e-a4 entered promiscuous mode
Nov 28 05:06:20 localhost NetworkManager[5967]: [1764324380.5055] manager: (tapdc62470e-a4): new Generic device (/org/freedesktop/NetworkManager/Devices/65)
Nov 28 05:06:20 localhost ovn_controller[152322]: 2025-11-28T10:06:20Z|00393|binding|INFO|Claiming lport dc62470e-a40f-4105-848f-1f2b879c4aae for this chassis.
Nov 28 05:06:20 localhost ovn_controller[152322]: 2025-11-28T10:06:20Z|00394|binding|INFO|dc62470e-a40f-4105-848f-1f2b879c4aae: Claiming unknown
Nov 28 05:06:20 localhost nova_compute[279673]: 2025-11-28 10:06:20.508 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:06:20 localhost systemd-udevd[324302]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 05:06:20 localhost nova_compute[279673]: 2025-11-28 10:06:20.535 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:06:20 localhost journal[227875]: ethtool ioctl error on tapdc62470e-a4: No such device
Nov 28 05:06:20 localhost ovn_controller[152322]: 2025-11-28T10:06:20Z|00395|binding|INFO|Setting lport dc62470e-a40f-4105-848f-1f2b879c4aae ovn-installed in OVS
Nov 28 05:06:20 localhost nova_compute[279673]: 2025-11-28 10:06:20.541 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:06:20 localhost nova_compute[279673]: 2025-11-28 10:06:20.543 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:06:20 localhost journal[227875]: ethtool ioctl error on tapdc62470e-a4: No such device
Nov 28 05:06:20 localhost journal[227875]: ethtool ioctl error on tapdc62470e-a4: No such device
Nov 28 05:06:20 localhost ovn_controller[152322]: 2025-11-28T10:06:20Z|00396|binding|INFO|Setting lport dc62470e-a40f-4105-848f-1f2b879c4aae up in Southbound
Nov 28 05:06:20 localhost journal[227875]: ethtool ioctl error on tapdc62470e-a4: No such device
Nov 28 05:06:20 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:20.553 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-10c0858a-69b4-4de1-aea8-8d780005bf13', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-10c0858a-69b4-4de1-aea8-8d780005bf13', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b2dec4622fe844c592a13e779612beaa', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64753474-6cc2-4012-bab6-4f0449c46fab, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=dc62470e-a40f-4105-848f-1f2b879c4aae) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 05:06:20 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:20.555 158130 INFO neutron.agent.ovn.metadata.agent [-] Port dc62470e-a40f-4105-848f-1f2b879c4aae in datapath 10c0858a-69b4-4de1-aea8-8d780005bf13 bound to our chassis#033[00m
Nov 28 05:06:20 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:20.557 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 10c0858a-69b4-4de1-aea8-8d780005bf13 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 28 05:06:20 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:20.558 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[0972ce58-f1f3-4460-8c99-6513129d7229]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 05:06:20 localhost journal[227875]: ethtool ioctl error on tapdc62470e-a4: No such device
Nov 28 05:06:20 localhost journal[227875]: ethtool ioctl error on tapdc62470e-a4: No such device
Nov 28 05:06:20 localhost journal[227875]: ethtool ioctl error on tapdc62470e-a4: No such device
Nov 28 05:06:20 localhost journal[227875]: ethtool ioctl error on tapdc62470e-a4: No such device
Nov 28 05:06:20 localhost nova_compute[279673]: 2025-11-28 10:06:20.591 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:06:20 localhost nova_compute[279673]: 2025-11-28 10:06:20.618 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:06:20 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:20.620 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:06:20Z, description=, device_id=5b45f823-0eca-4648-936e-96781a85013b, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=d39baeac-faf7-4050-b1c2-2f8c9573c064, ip_allocation=immediate, mac_address=fa:16:3e:45:9b:f5, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:06:08Z, description=, dns_domain=, id=4dc5b71e-287e-4ec6-b6b7-4d131e85d551, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-346806519-network, port_security_enabled=True, project_id=29d0d5b3ba0745d58aee3845ea704b73, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=16781, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2238, status=ACTIVE, subnets=['1d51cccc-0a3c-4da7-88f2-d129e18efd59'], tags=[], tenant_id=29d0d5b3ba0745d58aee3845ea704b73, updated_at=2025-11-28T10:06:09Z, vlan_transparent=None, network_id=4dc5b71e-287e-4ec6-b6b7-4d131e85d551, port_security_enabled=False, project_id=29d0d5b3ba0745d58aee3845ea704b73, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2306, status=DOWN, tags=[], tenant_id=29d0d5b3ba0745d58aee3845ea704b73, updated_at=2025-11-28T10:06:20Z on network 4dc5b71e-287e-4ec6-b6b7-4d131e85d551#033[00m
Nov 28 05:06:20 localhost nova_compute[279673]: 2025-11-28 10:06:20.806 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:06:20 localhost podman[324350]: 2025-11-28 10:06:20.852464102 +0000 UTC m=+0.059763675 container kill c34b0864521ca6e0485c3e746c68d7ef4b44888e5e0551f73bd31f34fa84e952 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 05:06:20 localhost dnsmasq[324234]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/addn_hosts - 1 addresses
Nov 28 05:06:20 localhost dnsmasq-dhcp[324234]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/host
Nov 28 05:06:20 localhost dnsmasq-dhcp[324234]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/opts
Nov 28 05:06:20 localhost nova_compute[279673]: 2025-11-28 10:06:20.988 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:06:21 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e168 do_prune osdmap full prune enabled
Nov 28 05:06:21 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:21.153 261084 INFO neutron.agent.dhcp.agent [None req-2764cf61-d9b3-474b-8e9f-6056c6b8e402 - - - - - -] DHCP configuration for ports {'d39baeac-faf7-4050-b1c2-2f8c9573c064'} is completed#033[00m
Nov 28 05:06:21 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e169 e169: 6 total, 6 up, 6 in
Nov 28 05:06:21 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e169: 6 total, 6 up, 6 in
Nov 28 05:06:21 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:21.264 158130 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port ff390443-5640-48d9-91fb-f1efebde638f with type ""#033[00m
Nov 28 05:06:21 localhost ovn_controller[152322]: 2025-11-28T10:06:21Z|00397|binding|INFO|Removing iface tapdc62470e-a4 ovn-installed in OVS
Nov 28 05:06:21 localhost ovn_controller[152322]: 2025-11-28T10:06:21Z|00398|binding|INFO|Removing lport dc62470e-a40f-4105-848f-1f2b879c4aae ovn-installed in OVS
Nov 28 05:06:21 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:21.266 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-10c0858a-69b4-4de1-aea8-8d780005bf13', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-10c0858a-69b4-4de1-aea8-8d780005bf13', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b2dec4622fe844c592a13e779612beaa', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64753474-6cc2-4012-bab6-4f0449c46fab, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=dc62470e-a40f-4105-848f-1f2b879c4aae) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 05:06:21 localhost nova_compute[279673]: 2025-11-28 10:06:21.267 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:06:21 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:21.270 158130 INFO neutron.agent.ovn.metadata.agent [-] Port dc62470e-a40f-4105-848f-1f2b879c4aae in datapath 10c0858a-69b4-4de1-aea8-8d780005bf13 unbound from our chassis#033[00m
Nov 28 05:06:21 localhost nova_compute[279673]: 2025-11-28 10:06:21.270 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:06:21 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:21.271 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 10c0858a-69b4-4de1-aea8-8d780005bf13 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 28 05:06:21 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:21.273 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[7176f691-d30f-48e6-b509-1b7fccc23695]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 05:06:21 localhost ovn_controller[152322]: 2025-11-28T10:06:21Z|00399|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 05:06:21 localhost nova_compute[279673]: 2025-11-28 10:06:21.543 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:06:21 localhost podman[324409]:
Nov 28 05:06:21 localhost podman[324409]: 2025-11-28 10:06:21.633856954 +0000 UTC m=+0.098100775 container create 08f2491e5443093d105a1e930be3d3c856a41ce1ee3b9888ac54ff837a0f5839 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10c0858a-69b4-4de1-aea8-8d780005bf13, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 28 05:06:21 localhost systemd[1]: Started libpod-conmon-08f2491e5443093d105a1e930be3d3c856a41ce1ee3b9888ac54ff837a0f5839.scope.
Nov 28 05:06:21 localhost podman[324409]: 2025-11-28 10:06:21.585794283 +0000 UTC m=+0.050038124 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 05:06:21 localhost systemd[1]: Started libcrun container.
Nov 28 05:06:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bcb6162e51a81069b70e3cfbeb695cb5cd524dd6654c7ed4ff41275202a34476/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:06:21 localhost podman[324409]: 2025-11-28 10:06:21.725152627 +0000 UTC m=+0.189396408 container init 08f2491e5443093d105a1e930be3d3c856a41ce1ee3b9888ac54ff837a0f5839 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10c0858a-69b4-4de1-aea8-8d780005bf13, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:06:21 localhost podman[324409]: 2025-11-28 10:06:21.741154454 +0000 UTC m=+0.205398255 container start 08f2491e5443093d105a1e930be3d3c856a41ce1ee3b9888ac54ff837a0f5839 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10c0858a-69b4-4de1-aea8-8d780005bf13, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Nov 28 05:06:21 localhost dnsmasq[324427]: started, version 2.85 cachesize 150 Nov 28 05:06:21 localhost dnsmasq[324427]: DNS service limited to local subnets Nov 28 05:06:21 localhost dnsmasq[324427]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:06:21 localhost dnsmasq[324427]: warning: no upstream servers 
configured Nov 28 05:06:21 localhost dnsmasq-dhcp[324427]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 28 05:06:21 localhost dnsmasq[324427]: read /var/lib/neutron/dhcp/10c0858a-69b4-4de1-aea8-8d780005bf13/addn_hosts - 0 addresses Nov 28 05:06:21 localhost dnsmasq-dhcp[324427]: read /var/lib/neutron/dhcp/10c0858a-69b4-4de1-aea8-8d780005bf13/host Nov 28 05:06:21 localhost dnsmasq-dhcp[324427]: read /var/lib/neutron/dhcp/10c0858a-69b4-4de1-aea8-8d780005bf13/opts Nov 28 05:06:21 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:21.795 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:06:20Z, description=, device_id=5b45f823-0eca-4648-936e-96781a85013b, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=d39baeac-faf7-4050-b1c2-2f8c9573c064, ip_allocation=immediate, mac_address=fa:16:3e:45:9b:f5, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:06:08Z, description=, dns_domain=, id=4dc5b71e-287e-4ec6-b6b7-4d131e85d551, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-346806519-network, port_security_enabled=True, project_id=29d0d5b3ba0745d58aee3845ea704b73, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=16781, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2238, status=ACTIVE, subnets=['1d51cccc-0a3c-4da7-88f2-d129e18efd59'], tags=[], tenant_id=29d0d5b3ba0745d58aee3845ea704b73, updated_at=2025-11-28T10:06:09Z, vlan_transparent=None, network_id=4dc5b71e-287e-4ec6-b6b7-4d131e85d551, port_security_enabled=False, project_id=29d0d5b3ba0745d58aee3845ea704b73, 
qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2306, status=DOWN, tags=[], tenant_id=29d0d5b3ba0745d58aee3845ea704b73, updated_at=2025-11-28T10:06:20Z on network 4dc5b71e-287e-4ec6-b6b7-4d131e85d551#033[00m Nov 28 05:06:21 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:21.874 261084 INFO neutron.agent.dhcp.agent [None req-a1e6e3d9-c842-444a-be15-1415125307fa - - - - - -] DHCP configuration for ports {'d8b529de-dee0-4172-a457-4cf0aa46cb55'} is completed#033[00m Nov 28 05:06:21 localhost dnsmasq[324427]: exiting on receipt of SIGTERM Nov 28 05:06:21 localhost podman[324448]: 2025-11-28 10:06:21.998123905 +0000 UTC m=+0.070396245 container kill 08f2491e5443093d105a1e930be3d3c856a41ce1ee3b9888ac54ff837a0f5839 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10c0858a-69b4-4de1-aea8-8d780005bf13, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:06:22 localhost systemd[1]: libpod-08f2491e5443093d105a1e930be3d3c856a41ce1ee3b9888ac54ff837a0f5839.scope: Deactivated successfully. 
Nov 28 05:06:22 localhost podman[324473]: 2025-11-28 10:06:22.070134 +0000 UTC m=+0.056886466 container died 08f2491e5443093d105a1e930be3d3c856a41ce1ee3b9888ac54ff837a0f5839 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10c0858a-69b4-4de1-aea8-8d780005bf13, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Nov 28 05:06:22 localhost podman[324473]: 2025-11-28 10:06:22.108333565 +0000 UTC m=+0.095085991 container cleanup 08f2491e5443093d105a1e930be3d3c856a41ce1ee3b9888ac54ff837a0f5839 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10c0858a-69b4-4de1-aea8-8d780005bf13, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 28 05:06:22 localhost systemd[1]: libpod-conmon-08f2491e5443093d105a1e930be3d3c856a41ce1ee3b9888ac54ff837a0f5839.scope: Deactivated successfully. 
Nov 28 05:06:22 localhost podman[324475]: 2025-11-28 10:06:22.164206929 +0000 UTC m=+0.141010947 container remove 08f2491e5443093d105a1e930be3d3c856a41ce1ee3b9888ac54ff837a0f5839 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10c0858a-69b4-4de1-aea8-8d780005bf13, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true) Nov 28 05:06:22 localhost nova_compute[279673]: 2025-11-28 10:06:22.217 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:22 localhost dnsmasq[324234]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/addn_hosts - 1 addresses Nov 28 05:06:22 localhost kernel: device tapdc62470e-a4 left promiscuous mode Nov 28 05:06:22 localhost podman[324498]: 2025-11-28 10:06:22.218868964 +0000 UTC m=+0.148790317 container kill c34b0864521ca6e0485c3e746c68d7ef4b44888e5e0551f73bd31f34fa84e952 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:06:22 localhost dnsmasq-dhcp[324234]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/host Nov 28 05:06:22 localhost dnsmasq-dhcp[324234]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/opts Nov 28 05:06:22 
localhost nova_compute[279673]: 2025-11-28 10:06:22.230 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:22 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:22.267 261084 INFO neutron.agent.dhcp.agent [None req-815c770b-1db7-4e8b-922f-aafbd2b45173 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:06:22 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:22.269 261084 INFO neutron.agent.dhcp.agent [None req-815c770b-1db7-4e8b-922f-aafbd2b45173 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:06:22 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:22.548 261084 INFO neutron.agent.dhcp.agent [None req-7cb2b08c-bbd3-4595-a4a3-bf79857633c2 - - - - - -] DHCP configuration for ports {'d39baeac-faf7-4050-b1c2-2f8c9573c064'} is completed#033[00m Nov 28 05:06:22 localhost systemd[1]: tmp-crun.4dOuKN.mount: Deactivated successfully. Nov 28 05:06:22 localhost systemd[1]: var-lib-containers-storage-overlay-bcb6162e51a81069b70e3cfbeb695cb5cd524dd6654c7ed4ff41275202a34476-merged.mount: Deactivated successfully. Nov 28 05:06:22 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-08f2491e5443093d105a1e930be3d3c856a41ce1ee3b9888ac54ff837a0f5839-userdata-shm.mount: Deactivated successfully. Nov 28 05:06:22 localhost systemd[1]: run-netns-qdhcp\x2d10c0858a\x2d69b4\x2d4de1\x2daea8\x2d8d780005bf13.mount: Deactivated successfully. 
Nov 28 05:06:23 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e169 do_prune osdmap full prune enabled Nov 28 05:06:23 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e170 e170: 6 total, 6 up, 6 in Nov 28 05:06:23 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e170: 6 total, 6 up, 6 in Nov 28 05:06:23 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Nov 28 05:06:23 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:06:23 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:23.633 2 INFO neutron.agent.securitygroups_rpc [None req-5b881bab-382f-42c7-b1b6-bde09ef38c32 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']#033[00m Nov 28 05:06:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. 
Nov 28 05:06:23 localhost podman[324612]: 2025-11-28 10:06:23.76329246 +0000 UTC m=+0.099612142 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 05:06:23 localhost podman[324612]: 2025-11-28 10:06:23.806548362 +0000 UTC m=+0.142868074 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 05:06:23 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 05:06:24 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 28 05:06:24 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1033241070' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 28 05:06:24 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 28 05:06:24 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1033241070' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 28 05:06:24 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:24.468 261084 INFO neutron.agent.linux.ip_lib [None req-52d0945c-7c9f-4996-ad7b-72d0dbba49ac - - - - - -] Device tapac54af53-f9 cannot be used as it has no MAC address#033[00m Nov 28 05:06:24 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 05:06:24 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:06:24 localhost nova_compute[279673]: 2025-11-28 10:06:24.498 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:24 localhost kernel: device tapac54af53-f9 entered promiscuous mode Nov 28 05:06:24 localhost NetworkManager[5967]: [1764324384.5090] manager: (tapac54af53-f9): new Generic device 
(/org/freedesktop/NetworkManager/Devices/66) Nov 28 05:06:24 localhost nova_compute[279673]: 2025-11-28 10:06:24.509 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:24 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:24.512 2 INFO neutron.agent.securitygroups_rpc [None req-043214bd-8f49-4013-b76d-a6a4f382dbad e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']#033[00m Nov 28 05:06:24 localhost ovn_controller[152322]: 2025-11-28T10:06:24Z|00400|binding|INFO|Claiming lport ac54af53-f927-47a3-a012-007eb09610ba for this chassis. Nov 28 05:06:24 localhost ovn_controller[152322]: 2025-11-28T10:06:24Z|00401|binding|INFO|ac54af53-f927-47a3-a012-007eb09610ba: Claiming unknown Nov 28 05:06:24 localhost systemd-udevd[324647]: Network interface NamePolicy= disabled on kernel command line. 
Nov 28 05:06:24 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e170 do_prune osdmap full prune enabled Nov 28 05:06:24 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e171 e171: 6 total, 6 up, 6 in Nov 28 05:06:24 localhost journal[227875]: ethtool ioctl error on tapac54af53-f9: No such device Nov 28 05:06:24 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:24.537 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-c2ece010-6b32-45eb-a0e7-54a94c6d37c8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c2ece010-6b32-45eb-a0e7-54a94c6d37c8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b2dec4622fe844c592a13e779612beaa', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5d67334e-0468-412d-8122-94888f52f93e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ac54af53-f927-47a3-a012-007eb09610ba) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:06:24 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:24.539 158130 INFO neutron.agent.ovn.metadata.agent [-] Port ac54af53-f927-47a3-a012-007eb09610ba in datapath c2ece010-6b32-45eb-a0e7-54a94c6d37c8 bound to our chassis#033[00m 
Nov 28 05:06:24 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e171: 6 total, 6 up, 6 in Nov 28 05:06:24 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:24.541 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c2ece010-6b32-45eb-a0e7-54a94c6d37c8 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:06:24 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:24.542 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[1f272300-3832-4aa8-b8c9-daa2457dd068]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:06:24 localhost journal[227875]: ethtool ioctl error on tapac54af53-f9: No such device Nov 28 05:06:24 localhost ovn_controller[152322]: 2025-11-28T10:06:24Z|00402|binding|INFO|Setting lport ac54af53-f927-47a3-a012-007eb09610ba ovn-installed in OVS Nov 28 05:06:24 localhost ovn_controller[152322]: 2025-11-28T10:06:24Z|00403|binding|INFO|Setting lport ac54af53-f927-47a3-a012-007eb09610ba up in Southbound Nov 28 05:06:24 localhost nova_compute[279673]: 2025-11-28 10:06:24.550 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:24 localhost journal[227875]: ethtool ioctl error on tapac54af53-f9: No such device Nov 28 05:06:24 localhost journal[227875]: ethtool ioctl error on tapac54af53-f9: No such device Nov 28 05:06:24 localhost journal[227875]: ethtool ioctl error on tapac54af53-f9: No such device Nov 28 05:06:24 localhost journal[227875]: ethtool ioctl error on tapac54af53-f9: No such device Nov 28 05:06:24 localhost journal[227875]: ethtool ioctl error on tapac54af53-f9: No such device Nov 28 05:06:24 localhost journal[227875]: ethtool ioctl error on tapac54af53-f9: No such device Nov 28 05:06:24 localhost 
nova_compute[279673]: 2025-11-28 10:06:24.593 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:24 localhost nova_compute[279673]: 2025-11-28 10:06:24.623 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:25 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e171 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:06:25 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e171 do_prune osdmap full prune enabled Nov 28 05:06:25 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e172 e172: 6 total, 6 up, 6 in Nov 28 05:06:25 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e172: 6 total, 6 up, 6 in Nov 28 05:06:25 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:25.416 2 INFO neutron.agent.securitygroups_rpc [None req-dd385026-e816-420f-a351-7652f9735a8b e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']#033[00m Nov 28 05:06:25 localhost podman[324718]: Nov 28 05:06:25 localhost podman[324718]: 2025-11-28 10:06:25.630264802 +0000 UTC m=+0.103133171 container create c50b6e12f7635965fc8766068ad19b1eea4c436af11e8c7810c10881b6721190 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS) Nov 28 05:06:25 localhost podman[324718]: 
2025-11-28 10:06:25.583816021 +0000 UTC m=+0.056684450 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:06:25 localhost systemd[1]: Started libpod-conmon-c50b6e12f7635965fc8766068ad19b1eea4c436af11e8c7810c10881b6721190.scope. Nov 28 05:06:25 localhost systemd[1]: Started libcrun container. Nov 28 05:06:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb8c60485550cb70649c48654a78f2ae0852e1cfaa2dfda9defc695ca31eee49/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:06:25 localhost podman[324718]: 2025-11-28 10:06:25.723294158 +0000 UTC m=+0.196162537 container init c50b6e12f7635965fc8766068ad19b1eea4c436af11e8c7810c10881b6721190 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true) Nov 28 05:06:25 localhost podman[324718]: 2025-11-28 10:06:25.732910927 +0000 UTC m=+0.205779296 container start c50b6e12f7635965fc8766068ad19b1eea4c436af11e8c7810c10881b6721190 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Nov 28 05:06:25 localhost dnsmasq[324736]: started, version 2.85 cachesize 150 Nov 28 05:06:25 
localhost dnsmasq[324736]: DNS service limited to local subnets Nov 28 05:06:25 localhost dnsmasq[324736]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:06:25 localhost dnsmasq[324736]: warning: no upstream servers configured Nov 28 05:06:25 localhost dnsmasq-dhcp[324736]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 28 05:06:25 localhost dnsmasq[324736]: read /var/lib/neutron/dhcp/c2ece010-6b32-45eb-a0e7-54a94c6d37c8/addn_hosts - 0 addresses Nov 28 05:06:25 localhost dnsmasq-dhcp[324736]: read /var/lib/neutron/dhcp/c2ece010-6b32-45eb-a0e7-54a94c6d37c8/host Nov 28 05:06:25 localhost dnsmasq-dhcp[324736]: read /var/lib/neutron/dhcp/c2ece010-6b32-45eb-a0e7-54a94c6d37c8/opts Nov 28 05:06:25 localhost nova_compute[279673]: 2025-11-28 10:06:25.808 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:25 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Nov 28 05:06:25 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:06:25 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:25.958 261084 INFO neutron.agent.dhcp.agent [None req-3b9216c1-d5b5-46ec-8219-0fbc48816222 - - - - - -] DHCP configuration for ports {'9de81172-b84e-48ed-a6ac-538e374abb7b'} is completed#033[00m Nov 28 05:06:25 localhost nova_compute[279673]: 2025-11-28 10:06:25.993 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:26 localhost dnsmasq[324736]: exiting on receipt of SIGTERM Nov 28 05:06:26 localhost podman[324754]: 2025-11-28 10:06:26.135337282 +0000 UTC m=+0.066528175 container kill 
c50b6e12f7635965fc8766068ad19b1eea4c436af11e8c7810c10881b6721190 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0) Nov 28 05:06:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 05:06:26 localhost systemd[1]: libpod-c50b6e12f7635965fc8766068ad19b1eea4c436af11e8c7810c10881b6721190.scope: Deactivated successfully. Nov 28 05:06:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e172 do_prune osdmap full prune enabled Nov 28 05:06:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e173 e173: 6 total, 6 up, 6 in Nov 28 05:06:26 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e173: 6 total, 6 up, 6 in Nov 28 05:06:26 localhost podman[324766]: 2025-11-28 10:06:26.21618232 +0000 UTC m=+0.060884500 container died c50b6e12f7635965fc8766068ad19b1eea4c436af11e8c7810c10881b6721190 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true) Nov 28 05:06:26 localhost podman[324766]: 2025-11-28 10:06:26.24548491 +0000 UTC m=+0.090187050 container cleanup 
c50b6e12f7635965fc8766068ad19b1eea4c436af11e8c7810c10881b6721190 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 28 05:06:26 localhost systemd[1]: libpod-conmon-c50b6e12f7635965fc8766068ad19b1eea4c436af11e8c7810c10881b6721190.scope: Deactivated successfully. Nov 28 05:06:26 localhost podman[324769]: 2025-11-28 10:06:26.292112326 +0000 UTC m=+0.126022671 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:06:26 localhost podman[324768]: 2025-11-28 10:06:26.342057005 +0000 UTC m=+0.181905664 container remove c50b6e12f7635965fc8766068ad19b1eea4c436af11e8c7810c10881b6721190 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true) Nov 28 05:06:26 localhost podman[324769]: 2025-11-28 10:06:26.370868339 +0000 UTC m=+0.204778664 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125) Nov 28 05:06:26 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 05:06:26 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:26.470 2 INFO neutron.agent.securitygroups_rpc [None req-1c752cf9-f4dc-4e37-b8ac-f785450dc01f 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']#033[00m Nov 28 05:06:26 localhost systemd[1]: var-lib-containers-storage-overlay-eb8c60485550cb70649c48654a78f2ae0852e1cfaa2dfda9defc695ca31eee49-merged.mount: Deactivated successfully. 
Nov 28 05:06:26 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c50b6e12f7635965fc8766068ad19b1eea4c436af11e8c7810c10881b6721190-userdata-shm.mount: Deactivated successfully. Nov 28 05:06:26 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:26.651 2 INFO neutron.agent.securitygroups_rpc [None req-0d9bcbd8-d3ad-4d76-88b4-dcdc400ccf8b e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']#033[00m Nov 28 05:06:26 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:26.734 2 INFO neutron.agent.securitygroups_rpc [None req-2fa3b339-652f-411d-b715-041869496ad1 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']#033[00m Nov 28 05:06:26 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:06:27 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:27.055 2 INFO neutron.agent.securitygroups_rpc [None req-c1a7ab1c-728b-412f-8d01-654c720e39af 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']#033[00m Nov 28 05:06:27 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:27.252 2 INFO neutron.agent.securitygroups_rpc [None req-35d7c929-00df-4969-aae8-f80246e7ea1b e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']#033[00m Nov 28 05:06:27 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:27.567 2 INFO neutron.agent.securitygroups_rpc [None req-c6801d78-6b56-496a-b37f-67c7bc7e4554 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']#033[00m Nov 28 05:06:27 localhost systemd[1]: Started 
/usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 05:06:27 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:27.830 2 INFO neutron.agent.securitygroups_rpc [None req-70e4eaf3-c859-433d-b609-e53f73e65383 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']#033[00m Nov 28 05:06:27 localhost podman[324835]: 2025-11-28 10:06:27.862411937 +0000 UTC m=+0.091023358 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller) Nov 28 05:06:27 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e173 do_prune 
osdmap full prune enabled Nov 28 05:06:27 localhost podman[324835]: 2025-11-28 10:06:27.920665103 +0000 UTC m=+0.149276584 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true) Nov 28 05:06:27 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e174 e174: 6 total, 6 up, 6 in Nov 28 05:06:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 05:06:27 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e174: 6 total, 6 up, 6 in Nov 28 05:06:27 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 05:06:28 localhost podman[324863]: 2025-11-28 10:06:28.043051433 +0000 UTC m=+0.093403209 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0) Nov 28 05:06:28 localhost podman[324863]: 2025-11-28 10:06:28.059439008 +0000 UTC m=+0.109790784 container exec_died 
1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:06:28 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 05:06:28 localhost podman[324903]: Nov 28 05:06:28 localhost podman[324903]: 2025-11-28 10:06:28.294448049 +0000 UTC m=+0.101965252 container create cccea3c566abb3eb2f1c1273d5967910e2ccce3eebec24f7603a8fc6ef2a2e42 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2) Nov 28 05:06:28 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:28.318 2 INFO neutron.agent.securitygroups_rpc [None req-8150f43d-22f5-4d4c-88f9-cca24e00dc84 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']#033[00m Nov 28 05:06:28 localhost systemd[1]: Started libpod-conmon-cccea3c566abb3eb2f1c1273d5967910e2ccce3eebec24f7603a8fc6ef2a2e42.scope. Nov 28 05:06:28 localhost podman[324903]: 2025-11-28 10:06:28.251205397 +0000 UTC m=+0.058722590 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:06:28 localhost systemd[1]: Started libcrun container. 
Nov 28 05:06:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee9e293b8742e167b01ef0934def034c313f2eb915f944093d63d89ce9a34521/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:06:28 localhost podman[324903]: 2025-11-28 10:06:28.36876627 +0000 UTC m=+0.176283483 container init cccea3c566abb3eb2f1c1273d5967910e2ccce3eebec24f7603a8fc6ef2a2e42 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 28 05:06:28 localhost podman[324903]: 2025-11-28 10:06:28.377981813 +0000 UTC m=+0.185499026 container start cccea3c566abb3eb2f1c1273d5967910e2ccce3eebec24f7603a8fc6ef2a2e42 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 28 05:06:28 localhost dnsmasq[324922]: started, version 2.85 cachesize 150 Nov 28 05:06:28 localhost dnsmasq[324922]: DNS service limited to local subnets Nov 28 05:06:28 localhost dnsmasq[324922]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:06:28 localhost dnsmasq[324922]: warning: no upstream servers 
configured Nov 28 05:06:28 localhost dnsmasq-dhcp[324922]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 28 05:06:28 localhost dnsmasq-dhcp[324922]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 28 05:06:28 localhost dnsmasq[324922]: read /var/lib/neutron/dhcp/c2ece010-6b32-45eb-a0e7-54a94c6d37c8/addn_hosts - 0 addresses Nov 28 05:06:28 localhost dnsmasq-dhcp[324922]: read /var/lib/neutron/dhcp/c2ece010-6b32-45eb-a0e7-54a94c6d37c8/host Nov 28 05:06:28 localhost dnsmasq-dhcp[324922]: read /var/lib/neutron/dhcp/c2ece010-6b32-45eb-a0e7-54a94c6d37c8/opts Nov 28 05:06:28 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:28.614 261084 INFO neutron.agent.dhcp.agent [None req-ec7a9a4e-be2e-4d97-a715-30cb9146efd8 - - - - - -] DHCP configuration for ports {'ac54af53-f927-47a3-a012-007eb09610ba', '9de81172-b84e-48ed-a6ac-538e374abb7b'} is completed#033[00m Nov 28 05:06:28 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e174 do_prune osdmap full prune enabled Nov 28 05:06:28 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e175 e175: 6 total, 6 up, 6 in Nov 28 05:06:29 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e175: 6 total, 6 up, 6 in Nov 28 05:06:29 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e175 do_prune osdmap full prune enabled Nov 28 05:06:30 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e176 e176: 6 total, 6 up, 6 in Nov 28 05:06:30 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e176: 6 total, 6 up, 6 in Nov 28 05:06:30 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e176 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Nov 28 05:06:30 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e176 do_prune osdmap full prune enabled Nov 28 05:06:30 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e177 e177: 6 total, 6 up, 6 in Nov 28 05:06:30 localhost 
ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e177: 6 total, 6 up, 6 in Nov 28 05:06:30 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:30.313 2 INFO neutron.agent.securitygroups_rpc [None req-61635fc4-7cf2-4d88-91e1-3ec9d744288e 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']#033[00m Nov 28 05:06:30 localhost nova_compute[279673]: 2025-11-28 10:06:30.821 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:30 localhost nova_compute[279673]: 2025-11-28 10:06:30.996 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:31 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:31.237 261084 INFO neutron.agent.linux.ip_lib [None req-39937e8a-9887-42fc-9240-e17ad0de1399 - - - - - -] Device tapa7bebe57-e6 cannot be used as it has no MAC address#033[00m Nov 28 05:06:31 localhost nova_compute[279673]: 2025-11-28 10:06:31.263 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:31 localhost kernel: device tapa7bebe57-e6 entered promiscuous mode Nov 28 05:06:31 localhost nova_compute[279673]: 2025-11-28 10:06:31.271 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:31 localhost NetworkManager[5967]: [1764324391.2723] manager: (tapa7bebe57-e6): new Generic device (/org/freedesktop/NetworkManager/Devices/67) Nov 28 05:06:31 localhost ovn_controller[152322]: 2025-11-28T10:06:31Z|00404|binding|INFO|Claiming lport a7bebe57-e6af-47fd-a208-2b421ce68fb2 for this chassis. 
Nov 28 05:06:31 localhost ovn_controller[152322]: 2025-11-28T10:06:31Z|00405|binding|INFO|a7bebe57-e6af-47fd-a208-2b421ce68fb2: Claiming unknown Nov 28 05:06:31 localhost systemd-udevd[324933]: Network interface NamePolicy= disabled on kernel command line. Nov 28 05:06:31 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:31.287 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8462a4a9a313405e8fd212f9ec4a0c92', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=337e4655-c093-49ec-8f8f-37f4f2ac3a09, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a7bebe57-e6af-47fd-a208-2b421ce68fb2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:06:31 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:31.289 158130 INFO neutron.agent.ovn.metadata.agent [-] Port a7bebe57-e6af-47fd-a208-2b421ce68fb2 in datapath ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec bound to our chassis#033[00m Nov 28 05:06:31 localhost ovn_metadata_agent[158125]: 
2025-11-28 10:06:31.291 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:06:31 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:31.295 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[2e029e96-50cd-4888-8ee5-d69628bfe77c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:06:31 localhost journal[227875]: ethtool ioctl error on tapa7bebe57-e6: No such device Nov 28 05:06:31 localhost nova_compute[279673]: 2025-11-28 10:06:31.306 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:31 localhost journal[227875]: ethtool ioctl error on tapa7bebe57-e6: No such device Nov 28 05:06:31 localhost ovn_controller[152322]: 2025-11-28T10:06:31Z|00406|binding|INFO|Setting lport a7bebe57-e6af-47fd-a208-2b421ce68fb2 ovn-installed in OVS Nov 28 05:06:31 localhost ovn_controller[152322]: 2025-11-28T10:06:31Z|00407|binding|INFO|Setting lport a7bebe57-e6af-47fd-a208-2b421ce68fb2 up in Southbound Nov 28 05:06:31 localhost nova_compute[279673]: 2025-11-28 10:06:31.313 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:31 localhost journal[227875]: ethtool ioctl error on tapa7bebe57-e6: No such device Nov 28 05:06:31 localhost nova_compute[279673]: 2025-11-28 10:06:31.315 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:31 localhost journal[227875]: ethtool ioctl error on tapa7bebe57-e6: No such device Nov 28 05:06:31 localhost journal[227875]: ethtool ioctl error on 
tapa7bebe57-e6: No such device Nov 28 05:06:31 localhost journal[227875]: ethtool ioctl error on tapa7bebe57-e6: No such device Nov 28 05:06:31 localhost journal[227875]: ethtool ioctl error on tapa7bebe57-e6: No such device Nov 28 05:06:31 localhost journal[227875]: ethtool ioctl error on tapa7bebe57-e6: No such device Nov 28 05:06:31 localhost nova_compute[279673]: 2025-11-28 10:06:31.356 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:31 localhost nova_compute[279673]: 2025-11-28 10:06:31.388 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:32 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e177 do_prune osdmap full prune enabled Nov 28 05:06:32 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e178 e178: 6 total, 6 up, 6 in Nov 28 05:06:32 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e178: 6 total, 6 up, 6 in Nov 28 05:06:32 localhost podman[325004]: Nov 28 05:06:32 localhost podman[325004]: 2025-11-28 10:06:32.234163512 +0000 UTC m=+0.093468221 container create 3b5a924724bbfd3f876266e615b1440da44bfeb8e3e3830b03306159687c33b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:06:32 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:32.265 2 INFO neutron.agent.securitygroups_rpc [None req-b832b7b5-4ef2-4bcf-bb1b-eacd2f3a21fc e32848e36ae94f66ae634ff4d7716d6f 
8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']#033[00m Nov 28 05:06:32 localhost systemd[1]: Started libpod-conmon-3b5a924724bbfd3f876266e615b1440da44bfeb8e3e3830b03306159687c33b6.scope. Nov 28 05:06:32 localhost podman[325004]: 2025-11-28 10:06:32.189699693 +0000 UTC m=+0.049004432 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:06:32 localhost systemd[1]: Started libcrun container. Nov 28 05:06:32 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6442dac4c49b82e1b453723658ecb4ea232eb6ab4ad6772df820c1942a4ca81f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:06:32 localhost podman[325004]: 2025-11-28 10:06:32.324089933 +0000 UTC m=+0.183394652 container init 3b5a924724bbfd3f876266e615b1440da44bfeb8e3e3830b03306159687c33b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true) Nov 28 05:06:32 localhost podman[325004]: 2025-11-28 10:06:32.335806424 +0000 UTC m=+0.195111133 container start 3b5a924724bbfd3f876266e615b1440da44bfeb8e3e3830b03306159687c33b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0) Nov 28 05:06:32 localhost dnsmasq[325023]: started, version 2.85 cachesize 150 Nov 28 05:06:32 localhost dnsmasq[325023]: DNS service limited to local subnets Nov 28 05:06:32 localhost dnsmasq[325023]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:06:32 localhost dnsmasq[325023]: warning: no upstream servers configured Nov 28 05:06:32 localhost dnsmasq-dhcp[325023]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 28 05:06:32 localhost dnsmasq[325023]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/addn_hosts - 0 addresses Nov 28 05:06:32 localhost dnsmasq-dhcp[325023]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/host Nov 28 05:06:32 localhost dnsmasq-dhcp[325023]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/opts Nov 28 05:06:32 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:32.563 261084 INFO neutron.agent.dhcp.agent [None req-51490f9d-9eb5-4300-b409-e06914b0e646 - - - - - -] DHCP configuration for ports {'bd93ee1b-7d5f-4009-bc66-cf9872b0e906'} is completed#033[00m Nov 28 05:06:32 localhost dnsmasq[325023]: exiting on receipt of SIGTERM Nov 28 05:06:32 localhost podman[325039]: 2025-11-28 10:06:32.752890376 +0000 UTC m=+0.049470956 container kill 3b5a924724bbfd3f876266e615b1440da44bfeb8e3e3830b03306159687c33b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator 
team) Nov 28 05:06:32 localhost systemd[1]: libpod-3b5a924724bbfd3f876266e615b1440da44bfeb8e3e3830b03306159687c33b6.scope: Deactivated successfully. Nov 28 05:06:32 localhost podman[325051]: 2025-11-28 10:06:32.831932081 +0000 UTC m=+0.063386644 container died 3b5a924724bbfd3f876266e615b1440da44bfeb8e3e3830b03306159687c33b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:06:32 localhost podman[325051]: 2025-11-28 10:06:32.866888638 +0000 UTC m=+0.098343171 container cleanup 3b5a924724bbfd3f876266e615b1440da44bfeb8e3e3830b03306159687c33b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Nov 28 05:06:32 localhost systemd[1]: libpod-conmon-3b5a924724bbfd3f876266e615b1440da44bfeb8e3e3830b03306159687c33b6.scope: Deactivated successfully. 
Nov 28 05:06:32 localhost podman[325053]: 2025-11-28 10:06:32.938274278 +0000 UTC m=+0.161749235 container remove 3b5a924724bbfd3f876266e615b1440da44bfeb8e3e3830b03306159687c33b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true) Nov 28 05:06:33 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e178 do_prune osdmap full prune enabled Nov 28 05:06:33 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e179 e179: 6 total, 6 up, 6 in Nov 28 05:06:33 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e179: 6 total, 6 up, 6 in Nov 28 05:06:33 localhost systemd[1]: var-lib-containers-storage-overlay-6442dac4c49b82e1b453723658ecb4ea232eb6ab4ad6772df820c1942a4ca81f-merged.mount: Deactivated successfully. Nov 28 05:06:33 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3b5a924724bbfd3f876266e615b1440da44bfeb8e3e3830b03306159687c33b6-userdata-shm.mount: Deactivated successfully. 
Nov 28 05:06:33 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:33.310 2 INFO neutron.agent.securitygroups_rpc [None req-995edc3e-e8fd-43bb-892b-18b0775677c3 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']#033[00m Nov 28 05:06:33 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:33.825 2 INFO neutron.agent.securitygroups_rpc [None req-ca5aaf48-8c6f-4efd-a0a9-4566338fc9f9 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']#033[00m Nov 28 05:06:34 localhost podman[325131]: Nov 28 05:06:34 localhost podman[325131]: 2025-11-28 10:06:34.418439805 +0000 UTC m=+0.066827751 container create f6c553e6ab42cf49764412ba1b72e5b5c30b69552575ab39008d9ba21770e924 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:06:34 localhost systemd[1]: Started libpod-conmon-f6c553e6ab42cf49764412ba1b72e5b5c30b69552575ab39008d9ba21770e924.scope. Nov 28 05:06:34 localhost systemd[1]: Started libcrun container. 
Nov 28 05:06:34 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49b9eae002394005d1393e4090115e2e8233d8d4759be7627d4b8be05351a27e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:06:34 localhost podman[325131]: 2025-11-28 10:06:34.384390556 +0000 UTC m=+0.032778522 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:06:34 localhost podman[325131]: 2025-11-28 10:06:34.530069244 +0000 UTC m=+0.178457200 container init f6c553e6ab42cf49764412ba1b72e5b5c30b69552575ab39008d9ba21770e924 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) Nov 28 05:06:34 localhost podman[325131]: 2025-11-28 10:06:34.544789018 +0000 UTC m=+0.193176964 container start f6c553e6ab42cf49764412ba1b72e5b5c30b69552575ab39008d9ba21770e924 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125) Nov 28 05:06:34 localhost dnsmasq[325150]: started, version 2.85 cachesize 150 Nov 28 05:06:34 localhost dnsmasq[325150]: DNS service limited to local subnets Nov 28 05:06:34 localhost dnsmasq[325150]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:06:34 localhost dnsmasq[325150]: warning: no upstream servers configured Nov 28 05:06:34 localhost dnsmasq-dhcp[325150]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Nov 28 05:06:34 localhost dnsmasq-dhcp[325150]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 28 05:06:34 localhost dnsmasq[325150]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/addn_hosts - 2 addresses Nov 28 05:06:34 localhost dnsmasq-dhcp[325150]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/host Nov 28 05:06:34 localhost dnsmasq-dhcp[325150]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/opts Nov 28 05:06:34 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:34.581 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:db:09 10.100.0.18 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-d1c7e9a2-1241-45c0-8a7d-a563a8d4e9f3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1c7e9a2-1241-45c0-8a7d-a563a8d4e9f3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b879ef3c-9a06-48a8-9e87-0eac0ec86fcf, chassis=[], tunnel_key=1, 
gateway_chassis=[], requested_chassis=[], logical_port=6db6a620-dcc3-4cb5-ab27-f70881c20730) old=Port_Binding(mac=['fa:16:3e:a3:db:09 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-d1c7e9a2-1241-45c0-8a7d-a563a8d4e9f3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1c7e9a2-1241-45c0-8a7d-a563a8d4e9f3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:06:34 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:34.583 158130 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 6db6a620-dcc3-4cb5-ab27-f70881c20730 in datapath d1c7e9a2-1241-45c0-8a7d-a563a8d4e9f3 updated#033[00m Nov 28 05:06:34 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:34.587 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d1c7e9a2-1241-45c0-8a7d-a563a8d4e9f3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:06:34 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:34.588 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[e1dd2b9b-aff7-46e7-882b-9491670e6f0c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:06:34 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:34.611 2 INFO neutron.agent.securitygroups_rpc [None req-c85e903a-7c43-4db3-84d5-12d5f3b5c956 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']#033[00m Nov 28 05:06:34 localhost 
neutron_dhcp_agent[261080]: 2025-11-28 10:06:34.611 261084 INFO neutron.agent.dhcp.agent [None req-98b7c1d3-4579-411b-808c-fa917aa96b17 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:06:31Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=7f8efc94-f33c-45bd-8ba3-5aae61ce07bb, ip_allocation=immediate, mac_address=fa:16:3e:c3:7f:0b, name=tempest-PortsIpV6TestJSON-733345616, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:06:28Z, description=, dns_domain=, id=ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-584530130, port_security_enabled=True, project_id=8462a4a9a313405e8fd212f9ec4a0c92, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=56666, qos_policy_id=None, revision_number=3, router:external=False, shared=False, standard_attr_id=2340, status=ACTIVE, subnets=['364769b3-b014-43ab-8260-f65a82aa0e26', 'aea3f6fd-3ed0-459b-8972-d3dc4b7dc981'], tags=[], tenant_id=8462a4a9a313405e8fd212f9ec4a0c92, updated_at=2025-11-28T10:06:31Z, vlan_transparent=None, network_id=ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, port_security_enabled=True, project_id=8462a4a9a313405e8fd212f9ec4a0c92, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['78343c03-098f-4faf-a880-2814fe3611d6'], standard_attr_id=2352, status=DOWN, tags=[], tenant_id=8462a4a9a313405e8fd212f9ec4a0c92, updated_at=2025-11-28T10:06:32Z on network ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec#033[00m Nov 28 05:06:34 localhost dnsmasq[325150]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/addn_hosts - 2 addresses 
Nov 28 05:06:34 localhost podman[325168]: 2025-11-28 10:06:34.759981878 +0000 UTC m=+0.046695420 container kill f6c553e6ab42cf49764412ba1b72e5b5c30b69552575ab39008d9ba21770e924 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Nov 28 05:06:34 localhost dnsmasq-dhcp[325150]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/host Nov 28 05:06:34 localhost dnsmasq-dhcp[325150]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/opts Nov 28 05:06:34 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:34.797 261084 INFO neutron.agent.dhcp.agent [None req-21956997-4457-454d-a56e-5040ea8f3871 - - - - - -] DHCP configuration for ports {'7f8efc94-f33c-45bd-8ba3-5aae61ce07bb', 'a7bebe57-e6af-47fd-a208-2b421ce68fb2', 'bd93ee1b-7d5f-4009-bc66-cf9872b0e906'} is completed#033[00m Nov 28 05:06:34 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:34.853 261084 INFO neutron.agent.dhcp.agent [None req-98b7c1d3-4579-411b-808c-fa917aa96b17 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:06:31Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=7f8efc94-f33c-45bd-8ba3-5aae61ce07bb, ip_allocation=immediate, mac_address=fa:16:3e:c3:7f:0b, name=tempest-PortsIpV6TestJSON-733345616, network_id=ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, port_security_enabled=True, 
project_id=8462a4a9a313405e8fd212f9ec4a0c92, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['78343c03-098f-4faf-a880-2814fe3611d6'], standard_attr_id=2352, status=DOWN, tags=[], tenant_id=8462a4a9a313405e8fd212f9ec4a0c92, updated_at=2025-11-28T10:06:32Z on network ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec#033[00m Nov 28 05:06:34 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:34.942 261084 INFO neutron.agent.dhcp.agent [None req-95653764-e084-4088-8d4b-3286ea811f41 - - - - - -] DHCP configuration for ports {'7f8efc94-f33c-45bd-8ba3-5aae61ce07bb'} is completed#033[00m Nov 28 05:06:34 localhost dnsmasq[325150]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/addn_hosts - 1 addresses Nov 28 05:06:34 localhost dnsmasq-dhcp[325150]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/host Nov 28 05:06:34 localhost dnsmasq-dhcp[325150]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/opts Nov 28 05:06:34 localhost podman[325208]: 2025-11-28 10:06:34.984176436 +0000 UTC m=+0.046898416 container kill f6c553e6ab42cf49764412ba1b72e5b5c30b69552575ab39008d9ba21770e924 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 28 05:06:35 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:35.129 261084 INFO neutron.agent.dhcp.agent [None req-98b7c1d3-4579-411b-808c-fa917aa96b17 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, 
binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:06:31Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=7f8efc94-f33c-45bd-8ba3-5aae61ce07bb, ip_allocation=immediate, mac_address=fa:16:3e:c3:7f:0b, name=tempest-PortsIpV6TestJSON-733345616, network_id=ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, port_security_enabled=True, project_id=8462a4a9a313405e8fd212f9ec4a0c92, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=3, security_groups=['78343c03-098f-4faf-a880-2814fe3611d6'], standard_attr_id=2352, status=DOWN, tags=[], tenant_id=8462a4a9a313405e8fd212f9ec4a0c92, updated_at=2025-11-28T10:06:33Z on network ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec#033[00m Nov 28 05:06:35 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e179 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:06:35 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e179 do_prune osdmap full prune enabled Nov 28 05:06:35 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:35.378 261084 INFO neutron.agent.dhcp.agent [None req-c8ba8810-26a5-423d-9bad-10653cc8e7af - - - - - -] DHCP configuration for ports {'7f8efc94-f33c-45bd-8ba3-5aae61ce07bb'} is completed#033[00m Nov 28 05:06:35 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e180 e180: 6 total, 6 up, 6 in Nov 28 05:06:35 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e180: 6 total, 6 up, 6 in Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0. 
Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.401312) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49 Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324395401357, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 1793, "num_deletes": 266, "total_data_size": 1787086, "memory_usage": 1827776, "flush_reason": "Manual Compaction"} Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324395416611, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 1738394, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28012, "largest_seqno": 29804, "table_properties": {"data_size": 1730746, "index_size": 4541, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 16990, "raw_average_key_size": 20, "raw_value_size": 1714900, "raw_average_value_size": 2099, "num_data_blocks": 197, "num_entries": 817, "num_filter_entries": 817, "num_deletions": 266, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324288, "oldest_key_time": 1764324288, "file_creation_time": 1764324395, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}} Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 15358 microseconds, and 6260 cpu microseconds. Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.416663) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 1738394 bytes OK Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.416692) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.419135) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.419161) EVENT_LOG_v1 {"time_micros": 1764324395419154, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.419183) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 1779189, prev total WAL file 
size 1779189, number of live WAL files 2. Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.420340) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034303231' seq:72057594037927935, type:22 .. '6C6F676D0034323735' seq:0, type:0; will stop at (end) Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(1697KB)], [48(15MB)] Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324395420413, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 18373606, "oldest_snapshot_seqno": -1} Nov 28 05:06:35 localhost dnsmasq[325150]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/addn_hosts - 2 addresses Nov 28 05:06:35 localhost dnsmasq-dhcp[325150]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/host Nov 28 05:06:35 localhost dnsmasq-dhcp[325150]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/opts Nov 28 05:06:35 localhost systemd[1]: tmp-crun.pFJxPR.mount: Deactivated successfully. 
Nov 28 05:06:35 localhost podman[325246]: 2025-11-28 10:06:35.425097352 +0000 UTC m=+0.051997873 container kill f6c553e6ab42cf49764412ba1b72e5b5c30b69552575ab39008d9ba21770e924 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true) Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 12778 keys, 17853894 bytes, temperature: kUnknown Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324395517883, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 17853894, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17779155, "index_size": 41678, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 32005, "raw_key_size": 341741, "raw_average_key_size": 26, "raw_value_size": 17559952, "raw_average_value_size": 1374, "num_data_blocks": 1585, "num_entries": 12778, "num_filter_entries": 12778, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; 
zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764324395, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}} Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.518328) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 17853894 bytes Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.528712) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 188.2 rd, 182.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 15.9 +0.0 blob) out(17.0 +0.0 blob), read-write-amplify(20.8) write-amplify(10.3) OK, records in: 13325, records dropped: 547 output_compression: NoCompression Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.528751) EVENT_LOG_v1 {"time_micros": 1764324395528734, "job": 28, "event": "compaction_finished", "compaction_time_micros": 97623, "compaction_time_cpu_micros": 29339, "output_level": 6, "num_output_files": 1, "total_output_size": 17853894, "num_input_records": 13325, "num_output_records": 12778, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000050.sst 
immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324395529206, "job": 28, "event": "table_file_deletion", "file_number": 50} Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324395531739, "job": 28, "event": "table_file_deletion", "file_number": 48} Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.420219) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.531831) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.531840) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.531843) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.531847) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:06:35 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:06:35.531849) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:06:35 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:35.615 261084 INFO neutron.agent.dhcp.agent [None req-9b598890-afd1-4569-81c1-6cab4cabdcc3 - - - - - -] DHCP configuration for ports {'7f8efc94-f33c-45bd-8ba3-5aae61ce07bb'} is 
completed#033[00m Nov 28 05:06:35 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:35.793 2 INFO neutron.agent.securitygroups_rpc [None req-8c0c18c9-b8b0-46ef-89c5-259d43510268 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']#033[00m Nov 28 05:06:35 localhost nova_compute[279673]: 2025-11-28 10:06:35.823 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:35 localhost dnsmasq[325150]: exiting on receipt of SIGTERM Nov 28 05:06:35 localhost podman[325285]: 2025-11-28 10:06:35.940726561 +0000 UTC m=+0.077981194 container kill f6c553e6ab42cf49764412ba1b72e5b5c30b69552575ab39008d9ba21770e924 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 28 05:06:35 localhost systemd[1]: libpod-f6c553e6ab42cf49764412ba1b72e5b5c30b69552575ab39008d9ba21770e924.scope: Deactivated successfully. 
Nov 28 05:06:36 localhost nova_compute[279673]: 2025-11-28 10:06:35.999 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:36 localhost podman[325297]: 2025-11-28 10:06:36.026231905 +0000 UTC m=+0.073662671 container died f6c553e6ab42cf49764412ba1b72e5b5c30b69552575ab39008d9ba21770e924 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 28 05:06:36 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f6c553e6ab42cf49764412ba1b72e5b5c30b69552575ab39008d9ba21770e924-userdata-shm.mount: Deactivated successfully. Nov 28 05:06:36 localhost podman[325297]: 2025-11-28 10:06:36.091800575 +0000 UTC m=+0.139231291 container cleanup f6c553e6ab42cf49764412ba1b72e5b5c30b69552575ab39008d9ba21770e924 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 28 05:06:36 localhost systemd[1]: libpod-conmon-f6c553e6ab42cf49764412ba1b72e5b5c30b69552575ab39008d9ba21770e924.scope: Deactivated successfully. 
Nov 28 05:06:36 localhost podman[325304]: 2025-11-28 10:06:36.117763795 +0000 UTC m=+0.149504148 container remove f6c553e6ab42cf49764412ba1b72e5b5c30b69552575ab39008d9ba21770e924 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Nov 28 05:06:36 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:36.180 158130 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port dec1ab9d-b8d3-4c8e-9f82-8a5f826a7152 with type ""#033[00m Nov 28 05:06:36 localhost ovn_controller[152322]: 2025-11-28T10:06:36Z|00408|binding|INFO|Removing iface tapa7bebe57-e6 ovn-installed in OVS Nov 28 05:06:36 localhost ovn_controller[152322]: 2025-11-28T10:06:36Z|00409|binding|INFO|Removing lport a7bebe57-e6af-47fd-a208-2b421ce68fb2 ovn-installed in OVS Nov 28 05:06:36 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:36.182 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1::2/64 2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec', 
'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8462a4a9a313405e8fd212f9ec4a0c92', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=337e4655-c093-49ec-8f8f-37f4f2ac3a09, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a7bebe57-e6af-47fd-a208-2b421ce68fb2) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:06:36 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:36.184 158130 INFO neutron.agent.ovn.metadata.agent [-] Port a7bebe57-e6af-47fd-a208-2b421ce68fb2 in datapath ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec unbound from our chassis#033[00m Nov 28 05:06:36 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:36.186 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:06:36 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:36.188 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[293fd241-bdf4-403f-a37c-0ce9847e93ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:06:36 localhost nova_compute[279673]: 2025-11-28 10:06:36.214 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:36 localhost systemd[1]: var-lib-containers-storage-overlay-49b9eae002394005d1393e4090115e2e8233d8d4759be7627d4b8be05351a27e-merged.mount: Deactivated successfully. 
Nov 28 05:06:36 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:36.603 2 INFO neutron.agent.securitygroups_rpc [None req-9cc8449a-c364-4646-9e4e-de66c7fab687 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']#033[00m Nov 28 05:06:36 localhost ovn_controller[152322]: 2025-11-28T10:06:36Z|00410|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:06:36 localhost nova_compute[279673]: 2025-11-28 10:06:36.856 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:37 localhost podman[325376]: Nov 28 05:06:37 localhost podman[325376]: 2025-11-28 10:06:37.0837441 +0000 UTC m=+0.093941186 container create 5421a7596753b4f40758f2935d73525fcbb1061f82225d4b12111cb760dbfd1d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 28 05:06:37 localhost systemd[1]: Started libpod-conmon-5421a7596753b4f40758f2935d73525fcbb1061f82225d4b12111cb760dbfd1d.scope. Nov 28 05:06:37 localhost podman[325376]: 2025-11-28 10:06:37.03895962 +0000 UTC m=+0.049156726 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:06:37 localhost systemd[1]: Started libcrun container. 
Nov 28 05:06:37 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d03bc8d97f8286d422fc4b16ab382ed0828346ca9e29d28b267fbe4ebf7f5e44/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:06:37 localhost podman[325376]: 2025-11-28 10:06:37.16133443 +0000 UTC m=+0.171531516 container init 5421a7596753b4f40758f2935d73525fcbb1061f82225d4b12111cb760dbfd1d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Nov 28 05:06:37 localhost podman[325376]: 2025-11-28 10:06:37.171453562 +0000 UTC m=+0.181650638 container start 5421a7596753b4f40758f2935d73525fcbb1061f82225d4b12111cb760dbfd1d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Nov 28 05:06:37 localhost dnsmasq[325394]: started, version 2.85 cachesize 150 Nov 28 05:06:37 localhost dnsmasq[325394]: DNS service limited to local subnets Nov 28 05:06:37 localhost dnsmasq[325394]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:06:37 localhost dnsmasq[325394]: warning: no upstream servers 
configured Nov 28 05:06:37 localhost dnsmasq-dhcp[325394]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 28 05:06:37 localhost dnsmasq[325394]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/addn_hosts - 0 addresses Nov 28 05:06:37 localhost dnsmasq-dhcp[325394]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/host Nov 28 05:06:37 localhost dnsmasq-dhcp[325394]: read /var/lib/neutron/dhcp/ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec/opts Nov 28 05:06:37 localhost dnsmasq[325394]: exiting on receipt of SIGTERM Nov 28 05:06:37 localhost podman[325411]: 2025-11-28 10:06:37.408986101 +0000 UTC m=+0.060472054 container kill 5421a7596753b4f40758f2935d73525fcbb1061f82225d4b12111cb760dbfd1d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:06:37 localhost systemd[1]: libpod-5421a7596753b4f40758f2935d73525fcbb1061f82225d4b12111cb760dbfd1d.scope: Deactivated successfully. 
Nov 28 05:06:37 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:37.410 261084 INFO neutron.agent.dhcp.agent [None req-488084a0-f5ea-43bb-b091-6a7241dd8663 - - - - - -] DHCP configuration for ports {'a7bebe57-e6af-47fd-a208-2b421ce68fb2', 'bd93ee1b-7d5f-4009-bc66-cf9872b0e906'} is completed#033[00m Nov 28 05:06:37 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e180 do_prune osdmap full prune enabled Nov 28 05:06:37 localhost podman[325423]: 2025-11-28 10:06:37.472871779 +0000 UTC m=+0.053008313 container died 5421a7596753b4f40758f2935d73525fcbb1061f82225d4b12111cb760dbfd1d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:06:37 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e181 e181: 6 total, 6 up, 6 in Nov 28 05:06:37 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e181: 6 total, 6 up, 6 in Nov 28 05:06:37 localhost systemd[1]: tmp-crun.DxVM1n.mount: Deactivated successfully. 
Nov 28 05:06:37 localhost podman[325423]: 2025-11-28 10:06:37.521541219 +0000 UTC m=+0.101677703 container cleanup 5421a7596753b4f40758f2935d73525fcbb1061f82225d4b12111cb760dbfd1d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:06:37 localhost systemd[1]: libpod-conmon-5421a7596753b4f40758f2935d73525fcbb1061f82225d4b12111cb760dbfd1d.scope: Deactivated successfully. Nov 28 05:06:37 localhost podman[325430]: 2025-11-28 10:06:37.566874796 +0000 UTC m=+0.128783719 container remove 5421a7596753b4f40758f2935d73525fcbb1061f82225d4b12111cb760dbfd1d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae6c35a6-c3b5-4f0d-a342-15a4ee5d55ec, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:06:37 localhost nova_compute[279673]: 2025-11-28 10:06:37.580 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:37 localhost kernel: device tapa7bebe57-e6 left promiscuous mode Nov 28 05:06:37 localhost nova_compute[279673]: 2025-11-28 10:06:37.599 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:37 
localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:37.645 261084 INFO neutron.agent.dhcp.agent [None req-c9c02c35-0d8e-4eb8-bc38-641d5c13d0fa - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:06:37 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:37.646 261084 INFO neutron.agent.dhcp.agent [None req-c9c02c35-0d8e-4eb8-bc38-641d5c13d0fa - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:06:37 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:37.646 261084 INFO neutron.agent.dhcp.agent [None req-c9c02c35-0d8e-4eb8-bc38-641d5c13d0fa - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:06:38 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:38.265 2 INFO neutron.agent.securitygroups_rpc [None req-f5ad8b43-3120-4741-9a0b-dea18e860a97 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']#033[00m Nov 28 05:06:38 localhost systemd[1]: var-lib-containers-storage-overlay-d03bc8d97f8286d422fc4b16ab382ed0828346ca9e29d28b267fbe4ebf7f5e44-merged.mount: Deactivated successfully. Nov 28 05:06:38 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5421a7596753b4f40758f2935d73525fcbb1061f82225d4b12111cb760dbfd1d-userdata-shm.mount: Deactivated successfully. Nov 28 05:06:38 localhost systemd[1]: run-netns-qdhcp\x2dae6c35a6\x2dc3b5\x2d4f0d\x2da342\x2d15a4ee5d55ec.mount: Deactivated successfully. 
Nov 28 05:06:39 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:39.196 2 INFO neutron.agent.securitygroups_rpc [None req-cef696eb-a0d2-4ba5-86dc-1a9cdfd33b8c 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']#033[00m Nov 28 05:06:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 05:06:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 05:06:39 localhost podman[325455]: 2025-11-28 10:06:39.373191773 +0000 UTC m=+0.102182449 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', 
'/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 05:06:39 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:39.386 261084 INFO neutron.agent.linux.ip_lib [None req-232134e7-f096-47d4-b144-e1d26af50f89 - - - - - -] Device tapa9eb3ece-27 cannot be used as it has no MAC address#033[00m Nov 28 05:06:39 localhost podman[325455]: 2025-11-28 10:06:39.388449373 +0000 UTC m=+0.117440029 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 05:06:39 localhost systemd[1]: 
49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. Nov 28 05:06:39 localhost nova_compute[279673]: 2025-11-28 10:06:39.410 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:39 localhost kernel: device tapa9eb3ece-27 entered promiscuous mode Nov 28 05:06:39 localhost ovn_controller[152322]: 2025-11-28T10:06:39Z|00411|binding|INFO|Claiming lport a9eb3ece-27d0-4844-bcef-e2f142103dde for this chassis. Nov 28 05:06:39 localhost ovn_controller[152322]: 2025-11-28T10:06:39Z|00412|binding|INFO|a9eb3ece-27d0-4844-bcef-e2f142103dde: Claiming unknown Nov 28 05:06:39 localhost NetworkManager[5967]: [1764324399.4242] manager: (tapa9eb3ece-27): new Generic device (/org/freedesktop/NetworkManager/Devices/68) Nov 28 05:06:39 localhost nova_compute[279673]: 2025-11-28 10:06:39.421 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:39 localhost systemd-udevd[325498]: Network interface NamePolicy= disabled on kernel command line. 
Nov 28 05:06:39 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:39.434 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-31e5a6ac-615e-4a89-968d-3e51c941359f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-31e5a6ac-615e-4a89-968d-3e51c941359f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b2dec4622fe844c592a13e779612beaa', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3c093f57-d9af-4f3d-94fe-03cca09b4bb2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a9eb3ece-27d0-4844-bcef-e2f142103dde) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:06:39 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:39.436 158130 INFO neutron.agent.ovn.metadata.agent [-] Port a9eb3ece-27d0-4844-bcef-e2f142103dde in datapath 31e5a6ac-615e-4a89-968d-3e51c941359f bound to our chassis#033[00m Nov 28 05:06:39 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:39.438 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 31e5a6ac-615e-4a89-968d-3e51c941359f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:06:39 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:39.439 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[b6dc554b-8ee2-495e-bd1b-34cc6e0572c2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:06:39 localhost journal[227875]: ethtool ioctl error on tapa9eb3ece-27: No such device Nov 28 05:06:39 localhost nova_compute[279673]: 2025-11-28 10:06:39.461 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:39 localhost ovn_controller[152322]: 2025-11-28T10:06:39Z|00413|binding|INFO|Setting lport a9eb3ece-27d0-4844-bcef-e2f142103dde ovn-installed in OVS Nov 28 05:06:39 localhost ovn_controller[152322]: 2025-11-28T10:06:39Z|00414|binding|INFO|Setting lport a9eb3ece-27d0-4844-bcef-e2f142103dde up in Southbound Nov 28 05:06:39 localhost journal[227875]: ethtool ioctl error on tapa9eb3ece-27: No such device Nov 28 05:06:39 localhost nova_compute[279673]: 2025-11-28 10:06:39.465 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:39 localhost journal[227875]: ethtool ioctl error on tapa9eb3ece-27: No such device Nov 28 05:06:39 localhost journal[227875]: ethtool ioctl error on tapa9eb3ece-27: No such device Nov 28 05:06:39 localhost journal[227875]: ethtool ioctl error on tapa9eb3ece-27: No such device Nov 28 05:06:39 localhost journal[227875]: ethtool ioctl error on tapa9eb3ece-27: No such device Nov 28 05:06:39 localhost journal[227875]: ethtool ioctl error on tapa9eb3ece-27: No such device Nov 28 05:06:39 localhost podman[325456]: 2025-11-28 10:06:39.495380778 +0000 UTC m=+0.221034672 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 
(image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Nov 28 05:06:39 localhost journal[227875]: ethtool ioctl error on tapa9eb3ece-27: No such device Nov 28 05:06:39 localhost nova_compute[279673]: 2025-11-28 10:06:39.515 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:39 localhost 
ceph-mon[292954]: mon.np0005538513@0(leader).osd e181 do_prune osdmap full prune enabled Nov 28 05:06:39 localhost podman[325456]: 2025-11-28 10:06:39.537107714 +0000 UTC m=+0.262761688 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS) Nov 28 05:06:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e182 e182: 6 total, 6 
up, 6 in Nov 28 05:06:39 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e182: 6 total, 6 up, 6 in Nov 28 05:06:39 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. Nov 28 05:06:39 localhost nova_compute[279673]: 2025-11-28 10:06:39.561 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:40 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:40.086 2 INFO neutron.agent.securitygroups_rpc [None req-11baa0de-2c9c-4582-9ccd-cefaf494809d e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']#033[00m Nov 28 05:06:40 localhost podman[238687]: time="2025-11-28T10:06:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:06:40 localhost podman[238687]: @ - - [28/Nov/2025:10:06:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 163084 "" "Go-http-client/1.1" Nov 28 05:06:40 localhost podman[238687]: @ - - [28/Nov/2025:10:06:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 21185 "" "Go-http-client/1.1" Nov 28 05:06:40 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e182 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:06:40 localhost ovn_controller[152322]: 2025-11-28T10:06:40Z|00415|binding|INFO|Removing iface tapa9eb3ece-27 ovn-installed in OVS Nov 28 05:06:40 localhost ovn_controller[152322]: 2025-11-28T10:06:40Z|00416|binding|INFO|Removing lport a9eb3ece-27d0-4844-bcef-e2f142103dde ovn-installed in OVS Nov 28 05:06:40 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:40.596 158130 WARNING neutron.agent.ovn.metadata.agent [-] Removing 
non-external type port a13bc4c2-7b6a-4d42-b85f-b16822a81f92 with type ""#033[00m Nov 28 05:06:40 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:40.598 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-31e5a6ac-615e-4a89-968d-3e51c941359f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-31e5a6ac-615e-4a89-968d-3e51c941359f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b2dec4622fe844c592a13e779612beaa', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3c093f57-d9af-4f3d-94fe-03cca09b4bb2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a9eb3ece-27d0-4844-bcef-e2f142103dde) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:06:40 localhost nova_compute[279673]: 2025-11-28 10:06:40.598 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:40 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:40.601 158130 INFO neutron.agent.ovn.metadata.agent [-] Port a9eb3ece-27d0-4844-bcef-e2f142103dde in datapath 31e5a6ac-615e-4a89-968d-3e51c941359f unbound from our chassis#033[00m 
Nov 28 05:06:40 localhost nova_compute[279673]: 2025-11-28 10:06:40.602 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:40 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:40.603 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 31e5a6ac-615e-4a89-968d-3e51c941359f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:06:40 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:40.605 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[24cc693e-f538-4b3d-8801-e65f6241bb1a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:06:40 localhost podman[325578]: Nov 28 05:06:40 localhost podman[325578]: 2025-11-28 10:06:40.517197773 +0000 UTC m=+0.049481336 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:06:40 localhost podman[325578]: 2025-11-28 10:06:40.623204179 +0000 UTC m=+0.155487702 container create 343093b1230c8a80a4eb7d7db73a5df545e4e737e0951eee930ed27dec4e2877 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31e5a6ac-615e-4a89-968d-3e51c941359f, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125) Nov 28 05:06:40 localhost nova_compute[279673]: 2025-11-28 10:06:40.645 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:06:40 localhost nova_compute[279673]: 2025-11-28 10:06:40.666 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Triggering sync for uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Nov 28 05:06:40 localhost nova_compute[279673]: 2025-11-28 10:06:40.667 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:06:40 localhost nova_compute[279673]: 2025-11-28 10:06:40.668 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:06:40 localhost systemd[1]: Started libpod-conmon-343093b1230c8a80a4eb7d7db73a5df545e4e737e0951eee930ed27dec4e2877.scope. Nov 28 05:06:40 localhost nova_compute[279673]: 2025-11-28 10:06:40.693 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.025s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:06:40 localhost systemd[1]: Started libcrun container. 
Nov 28 05:06:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95f6be0500b09649aba6df959f6cf789054a0f5abb35ea720a3084df1677d81d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:06:40 localhost podman[325578]: 2025-11-28 10:06:40.731442465 +0000 UTC m=+0.263725998 container init 343093b1230c8a80a4eb7d7db73a5df545e4e737e0951eee930ed27dec4e2877 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31e5a6ac-615e-4a89-968d-3e51c941359f, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125) Nov 28 05:06:40 localhost podman[325578]: 2025-11-28 10:06:40.740667799 +0000 UTC m=+0.272951332 container start 343093b1230c8a80a4eb7d7db73a5df545e4e737e0951eee930ed27dec4e2877 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31e5a6ac-615e-4a89-968d-3e51c941359f, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 28 05:06:40 localhost dnsmasq[325597]: started, version 2.85 cachesize 150 Nov 28 05:06:40 localhost dnsmasq[325597]: DNS service limited to local subnets Nov 28 05:06:40 localhost dnsmasq[325597]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:06:40 localhost dnsmasq[325597]: warning: no upstream servers 
configured Nov 28 05:06:40 localhost dnsmasq-dhcp[325597]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 28 05:06:40 localhost dnsmasq[325597]: read /var/lib/neutron/dhcp/31e5a6ac-615e-4a89-968d-3e51c941359f/addn_hosts - 0 addresses Nov 28 05:06:40 localhost dnsmasq-dhcp[325597]: read /var/lib/neutron/dhcp/31e5a6ac-615e-4a89-968d-3e51c941359f/host Nov 28 05:06:40 localhost dnsmasq-dhcp[325597]: read /var/lib/neutron/dhcp/31e5a6ac-615e-4a89-968d-3e51c941359f/opts Nov 28 05:06:40 localhost nova_compute[279673]: 2025-11-28 10:06:40.793 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:06:40 localhost nova_compute[279673]: 2025-11-28 10:06:40.826 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:40 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:40.925 2 INFO neutron.agent.securitygroups_rpc [None req-13572f85-aaed-465a-b457-59f9816ff0f0 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']#033[00m Nov 28 05:06:40 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:40.926 261084 INFO neutron.agent.dhcp.agent [None req-a002d351-2131-493b-b881-45282a5c2164 - - - - - -] DHCP configuration for ports {'9cc2649b-29dd-4c3a-a346-a2df81021394'} is completed#033[00m Nov 28 05:06:41 localhost nova_compute[279673]: 2025-11-28 10:06:41.005 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:41 localhost podman[325613]: 2025-11-28 10:06:41.138531108 +0000 UTC m=+0.065310353 container kill 343093b1230c8a80a4eb7d7db73a5df545e4e737e0951eee930ed27dec4e2877 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31e5a6ac-615e-4a89-968d-3e51c941359f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:06:41 localhost dnsmasq[325597]: read /var/lib/neutron/dhcp/31e5a6ac-615e-4a89-968d-3e51c941359f/addn_hosts - 0 addresses Nov 28 05:06:41 localhost dnsmasq-dhcp[325597]: read /var/lib/neutron/dhcp/31e5a6ac-615e-4a89-968d-3e51c941359f/host Nov 28 05:06:41 localhost dnsmasq-dhcp[325597]: read /var/lib/neutron/dhcp/31e5a6ac-615e-4a89-968d-3e51c941359f/opts Nov 28 05:06:41 localhost kernel: device tapa9eb3ece-27 left promiscuous mode Nov 28 05:06:41 localhost nova_compute[279673]: 2025-11-28 10:06:41.373 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:41 localhost nova_compute[279673]: 2025-11-28 10:06:41.384 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.454 261084 INFO neutron.agent.dhcp.agent [None req-cd180424-dc1d-43a8-aaaf-f233bc118d28 - - - - - -] DHCP configuration for ports {'9cc2649b-29dd-4c3a-a346-a2df81021394'} is completed#033[00m Nov 28 05:06:41 localhost dnsmasq[325597]: read /var/lib/neutron/dhcp/31e5a6ac-615e-4a89-968d-3e51c941359f/addn_hosts - 0 addresses Nov 28 05:06:41 localhost dnsmasq-dhcp[325597]: read /var/lib/neutron/dhcp/31e5a6ac-615e-4a89-968d-3e51c941359f/host Nov 28 05:06:41 localhost dnsmasq-dhcp[325597]: read /var/lib/neutron/dhcp/31e5a6ac-615e-4a89-968d-3e51c941359f/opts Nov 28 05:06:41 
localhost podman[325654]: 2025-11-28 10:06:41.599797371 +0000 UTC m=+0.065518370 container kill 343093b1230c8a80a4eb7d7db73a5df545e4e737e0951eee930ed27dec4e2877 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31e5a6ac-615e-4a89-968d-3e51c941359f, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 28 05:06:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e182 do_prune osdmap full prune enabled Nov 28 05:06:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e183 e183: 6 total, 6 up, 6 in Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent [None req-25a6e0c4-d89b-40f6-871a-bfbc558909aa - - - - - -] Unable to reload_allocations dhcp for 31e5a6ac-615e-4a89-968d-3e51c941359f.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapa9eb3ece-27 not found in namespace qdhcp-31e5a6ac-615e-4a89-968d-3e51c941359f. 
Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR 
neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw) Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Nov 28 05:06:41 
localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent return fut.result() Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent return self.__get_result() Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent raise self._exception Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__ Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs, Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 
ERROR neutron.agent.dhcp.agent raise exc_type(*result[2]) Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapa9eb3ece-27 not found in namespace qdhcp-31e5a6ac-615e-4a89-968d-3e51c941359f. Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent #033[00m Nov 28 05:06:41 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e183: 6 total, 6 up, 6 in Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.642 261084 INFO neutron.agent.dhcp.agent [None req-48712969-2fd3-4809-8736-106860b7dd0e - - - - - -] Synchronizing state#033[00m Nov 28 05:06:41 localhost ovn_controller[152322]: 2025-11-28T10:06:41Z|00417|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:06:41 localhost nova_compute[279673]: 2025-11-28 10:06:41.791 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.895 261084 INFO neutron.agent.dhcp.agent [None req-3d2868a0-ec37-4279-b7d1-5c39919092c4 - - - - - -] All active networks have been fetched through RPC.#033[00m Nov 28 05:06:42 localhost dnsmasq[325597]: exiting on receipt of SIGTERM Nov 28 05:06:42 localhost podman[325684]: 2025-11-28 10:06:42.083135784 +0000 UTC m=+0.051244150 container kill 343093b1230c8a80a4eb7d7db73a5df545e4e737e0951eee930ed27dec4e2877 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31e5a6ac-615e-4a89-968d-3e51c941359f, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
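The `NetworkInterfaceNotFound` traceback above is hard to read because every frame carries the full journald and oslo.log prefix (`Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent …`). A minimal sketch of a helper that strips those prefixes so the traceback reads as Python originally printed it — `reassemble_traceback` is a hypothetical utility, not part of Neutron or oslo.log, and the regex assumes the default oslo.log line layout seen in this capture:

```python
import re

# Matches the combined prefix on each ERROR line of this log:
#   syslog part:  "Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: "
#   oslo part:    "2025-11-28 10:06:41.634 261084 ERROR neutron.agent.dhcp.agent "
PREFIX = re.compile(
    r"^\w{3} +\d+ [\d:]{8} \S+ \S+\[\d+\]: "   # syslog timestamp, host, unit[pid]
    r"[\d-]{10} [\d:.]{12,15} \d+ ERROR \S+ ?"  # oslo date, time, pid, level, logger
)

def reassemble_traceback(lines):
    """Strip the repeated log prefixes from a list of ERROR lines."""
    return "\n".join(PREFIX.sub("", ln) for ln in lines)

sample = [
    "Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 "
    "261084 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):",
    "Nov 28 05:06:41 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:41.634 "
    "261084 ERROR neutron.agent.dhcp.agent   File \"/usr/lib/python3.9/site-packages"
    "/neutron/agent/dhcp/agent.py\", line 264, in _call_driver",
]
print(reassemble_traceback(sample))
```

Piping the matching journal lines through such a filter (e.g. after `grep 'ERROR neutron.agent.dhcp.agent'`) yields the plain multi-line traceback, which makes it easier to see that the failure originates in `_set_default_route` and surfaces through the privsep daemon.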
org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 28 05:06:42 localhost systemd[1]: libpod-343093b1230c8a80a4eb7d7db73a5df545e4e737e0951eee930ed27dec4e2877.scope: Deactivated successfully. Nov 28 05:06:42 localhost podman[325699]: 2025-11-28 10:06:42.166195333 +0000 UTC m=+0.059700091 container died 343093b1230c8a80a4eb7d7db73a5df545e4e737e0951eee930ed27dec4e2877 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31e5a6ac-615e-4a89-968d-3e51c941359f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125) Nov 28 05:06:42 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-343093b1230c8a80a4eb7d7db73a5df545e4e737e0951eee930ed27dec4e2877-userdata-shm.mount: Deactivated successfully. Nov 28 05:06:42 localhost systemd[1]: var-lib-containers-storage-overlay-95f6be0500b09649aba6df959f6cf789054a0f5abb35ea720a3084df1677d81d-merged.mount: Deactivated successfully. 
Nov 28 05:06:42 localhost podman[325699]: 2025-11-28 10:06:42.271804757 +0000 UTC m=+0.165309485 container remove 343093b1230c8a80a4eb7d7db73a5df545e4e737e0951eee930ed27dec4e2877 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31e5a6ac-615e-4a89-968d-3e51c941359f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:06:42 localhost systemd[1]: libpod-conmon-343093b1230c8a80a4eb7d7db73a5df545e4e737e0951eee930ed27dec4e2877.scope: Deactivated successfully. Nov 28 05:06:42 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:42.304 261084 INFO neutron.agent.dhcp.agent [-] Starting network 220389a9-aaf3-4df4-9c10-df31c76e1a58 dhcp configuration#033[00m Nov 28 05:06:42 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:42.305 261084 INFO neutron.agent.dhcp.agent [-] Finished network 220389a9-aaf3-4df4-9c10-df31c76e1a58 dhcp configuration#033[00m Nov 28 05:06:42 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:42.306 261084 INFO neutron.agent.dhcp.agent [None req-1ab4110a-ac13-4ddb-a592-496f6a9cf3c6 - - - - - -] Synchronizing state complete#033[00m Nov 28 05:06:42 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:42.346 2 INFO neutron.agent.securitygroups_rpc [None req-38d8b4a3-75dd-41d2-a0db-a9c73ae0e2bb e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']#033[00m Nov 28 05:06:42 localhost systemd[1]: run-netns-qdhcp\x2d31e5a6ac\x2d615e\x2d4a89\x2d968d\x2d3e51c941359f.mount: Deactivated successfully. 
Nov 28 05:06:42 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e183 do_prune osdmap full prune enabled Nov 28 05:06:42 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e184 e184: 6 total, 6 up, 6 in Nov 28 05:06:42 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e184: 6 total, 6 up, 6 in Nov 28 05:06:43 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:43.024 2 INFO neutron.agent.securitygroups_rpc [None req-c8fe2171-fc33-407b-b80c-443549ec2e39 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']#033[00m Nov 28 05:06:43 localhost nova_compute[279673]: 2025-11-28 10:06:43.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:06:43 localhost nova_compute[279673]: 2025-11-28 10:06:43.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:06:43 localhost nova_compute[279673]: 2025-11-28 10:06:43.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:06:43 localhost nova_compute[279673]: 2025-11-28 10:06:43.772 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 28 05:06:43 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 28 05:06:43 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1049265492' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 28 05:06:43 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 28 05:06:43 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1049265492' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 28 05:06:44 localhost podman[325743]: 2025-11-28 10:06:44.052897547 +0000 UTC m=+0.061122984 container kill cccea3c566abb3eb2f1c1273d5967910e2ccce3eebec24f7603a8fc6ef2a2e42 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 28 05:06:44 localhost dnsmasq[324922]: exiting on receipt of SIGTERM Nov 28 05:06:44 localhost systemd[1]: libpod-cccea3c566abb3eb2f1c1273d5967910e2ccce3eebec24f7603a8fc6ef2a2e42.scope: Deactivated successfully. 
Nov 28 05:06:44 localhost podman[325757]: 2025-11-28 10:06:44.145965625 +0000 UTC m=+0.074467405 container died cccea3c566abb3eb2f1c1273d5967910e2ccce3eebec24f7603a8fc6ef2a2e42 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:06:44 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cccea3c566abb3eb2f1c1273d5967910e2ccce3eebec24f7603a8fc6ef2a2e42-userdata-shm.mount: Deactivated successfully. Nov 28 05:06:44 localhost podman[325757]: 2025-11-28 10:06:44.182278384 +0000 UTC m=+0.110780114 container cleanup cccea3c566abb3eb2f1c1273d5967910e2ccce3eebec24f7603a8fc6ef2a2e42 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:06:44 localhost systemd[1]: libpod-conmon-cccea3c566abb3eb2f1c1273d5967910e2ccce3eebec24f7603a8fc6ef2a2e42.scope: Deactivated successfully. 
Nov 28 05:06:44 localhost podman[325758]: 2025-11-28 10:06:44.270227404 +0000 UTC m=+0.192445341 container remove cccea3c566abb3eb2f1c1273d5967910e2ccce3eebec24f7603a8fc6ef2a2e42 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 28 05:06:44 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e184 do_prune osdmap full prune enabled Nov 28 05:06:44 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:44.707 2 INFO neutron.agent.securitygroups_rpc [None req-f4e28522-faee-4418-9069-18a470f0a6f6 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']#033[00m Nov 28 05:06:44 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e185 e185: 6 total, 6 up, 6 in Nov 28 05:06:44 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e185: 6 total, 6 up, 6 in Nov 28 05:06:45 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:45.005 261084 INFO neutron.agent.linux.ip_lib [None req-3b7aaba5-4be3-4472-85bf-062426fad64a - - - - - -] Device tap2dcd02a6-6e cannot be used as it has no MAC address#033[00m Nov 28 05:06:45 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:45.043 158130 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 832a5508-3562-4501-a0e7-3a860e35260b with type ""#033[00m Nov 28 05:06:45 localhost ovn_controller[152322]: 2025-11-28T10:06:45Z|00418|binding|INFO|Removing iface tapac54af53-f9 ovn-installed in OVS Nov 28 05:06:45 localhost ovn_controller[152322]: 
2025-11-28T10:06:45Z|00419|binding|INFO|Removing lport ac54af53-f927-47a3-a012-007eb09610ba ovn-installed in OVS Nov 28 05:06:45 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:45.046 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-c2ece010-6b32-45eb-a0e7-54a94c6d37c8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c2ece010-6b32-45eb-a0e7-54a94c6d37c8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b2dec4622fe844c592a13e779612beaa', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5d67334e-0468-412d-8122-94888f52f93e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ac54af53-f927-47a3-a012-007eb09610ba) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:06:45 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:45.048 158130 INFO neutron.agent.ovn.metadata.agent [-] Port ac54af53-f927-47a3-a012-007eb09610ba in datapath c2ece010-6b32-45eb-a0e7-54a94c6d37c8 unbound from our chassis#033[00m Nov 28 05:06:45 localhost systemd[1]: var-lib-containers-storage-overlay-ee9e293b8742e167b01ef0934def034c313f2eb915f944093d63d89ce9a34521-merged.mount: Deactivated 
successfully. Nov 28 05:06:45 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:45.059 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c2ece010-6b32-45eb-a0e7-54a94c6d37c8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:06:45 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:45.060 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[7881d730-dc52-4712-9450-4691a6db08a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:06:45 localhost nova_compute[279673]: 2025-11-28 10:06:45.074 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:45 localhost nova_compute[279673]: 2025-11-28 10:06:45.077 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:45 localhost kernel: device tap2dcd02a6-6e entered promiscuous mode Nov 28 05:06:45 localhost NetworkManager[5967]: [1764324405.0854] manager: (tap2dcd02a6-6e): new Generic device (/org/freedesktop/NetworkManager/Devices/69) Nov 28 05:06:45 localhost systemd-udevd[325824]: Network interface NamePolicy= disabled on kernel command line. Nov 28 05:06:45 localhost ovn_controller[152322]: 2025-11-28T10:06:45Z|00420|binding|INFO|Claiming lport 2dcd02a6-6ee7-4a78-a00f-13dc1294595f for this chassis. 
Nov 28 05:06:45 localhost nova_compute[279673]: 2025-11-28 10:06:45.089 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:45 localhost ovn_controller[152322]: 2025-11-28T10:06:45Z|00421|binding|INFO|2dcd02a6-6ee7-4a78-a00f-13dc1294595f: Claiming unknown Nov 28 05:06:45 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:45.104 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-49f178c5-0cae-4b0e-9bb3-8615842f2e56', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-49f178c5-0cae-4b0e-9bb3-8615842f2e56', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ac48318a-2b44-44d6-ac83-b660fc51fa7f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=2dcd02a6-6ee7-4a78-a00f-13dc1294595f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:06:45 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:45.107 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 2dcd02a6-6ee7-4a78-a00f-13dc1294595f in datapath 
49f178c5-0cae-4b0e-9bb3-8615842f2e56 bound to our chassis#033[00m Nov 28 05:06:45 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:45.111 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 2a341328-31a8-4e24-99e8-5139e38e23a6 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 28 05:06:45 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:45.111 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 49f178c5-0cae-4b0e-9bb3-8615842f2e56, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:06:45 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:45.113 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[3b81918c-4715-4f16-a964-10e3c5d3a852]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:06:45 localhost ovn_controller[152322]: 2025-11-28T10:06:45Z|00422|binding|INFO|Setting lport 2dcd02a6-6ee7-4a78-a00f-13dc1294595f ovn-installed in OVS Nov 28 05:06:45 localhost ovn_controller[152322]: 2025-11-28T10:06:45Z|00423|binding|INFO|Setting lport 2dcd02a6-6ee7-4a78-a00f-13dc1294595f up in Southbound Nov 28 05:06:45 localhost nova_compute[279673]: 2025-11-28 10:06:45.133 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:45 localhost nova_compute[279673]: 2025-11-28 10:06:45.187 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:45 localhost nova_compute[279673]: 2025-11-28 10:06:45.234 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:45 localhost 
podman[325853]: Nov 28 05:06:45 localhost podman[325853]: 2025-11-28 10:06:45.320763044 +0000 UTC m=+0.084626218 container create 9219fadb3f566625ebf598f73b4aa760b705fa2fb7c258e7460938a933ab8232 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:06:45 localhost podman[325853]: 2025-11-28 10:06:45.272800256 +0000 UTC m=+0.036663460 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:06:45 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:06:45 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e185 do_prune osdmap full prune enabled Nov 28 05:06:45 localhost systemd[1]: Started libpod-conmon-9219fadb3f566625ebf598f73b4aa760b705fa2fb7c258e7460938a933ab8232.scope. Nov 28 05:06:45 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e186 e186: 6 total, 6 up, 6 in Nov 28 05:06:45 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e186: 6 total, 6 up, 6 in Nov 28 05:06:45 localhost systemd[1]: Started libcrun container. 
Nov 28 05:06:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1d6ced94728650500b10b276b0c43bf2e09a46765f20bb4a704944ec73bf2bc3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:06:45 localhost podman[325853]: 2025-11-28 10:06:45.438635366 +0000 UTC m=+0.202498550 container init 9219fadb3f566625ebf598f73b4aa760b705fa2fb7c258e7460938a933ab8232 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:06:45 localhost podman[325853]: 2025-11-28 10:06:45.449235783 +0000 UTC m=+0.213098957 container start 9219fadb3f566625ebf598f73b4aa760b705fa2fb7c258e7460938a933ab8232 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:06:45 localhost dnsmasq[325876]: started, version 2.85 cachesize 150 Nov 28 05:06:45 localhost dnsmasq[325876]: DNS service limited to local subnets Nov 28 05:06:45 localhost dnsmasq[325876]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:06:45 localhost dnsmasq[325876]: warning: no upstream servers 
configured Nov 28 05:06:45 localhost dnsmasq-dhcp[325876]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 28 05:06:45 localhost dnsmasq[325876]: read /var/lib/neutron/dhcp/c2ece010-6b32-45eb-a0e7-54a94c6d37c8/addn_hosts - 0 addresses Nov 28 05:06:45 localhost dnsmasq-dhcp[325876]: read /var/lib/neutron/dhcp/c2ece010-6b32-45eb-a0e7-54a94c6d37c8/host Nov 28 05:06:45 localhost dnsmasq-dhcp[325876]: read /var/lib/neutron/dhcp/c2ece010-6b32-45eb-a0e7-54a94c6d37c8/opts Nov 28 05:06:45 localhost nova_compute[279673]: 2025-11-28 10:06:45.768 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:06:45 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:45.773 261084 INFO neutron.agent.dhcp.agent [None req-1b4e5cd6-f2b9-4940-acfa-8801257eb07a - - - - - -] DHCP configuration for ports {'ac54af53-f927-47a3-a012-007eb09610ba', '9de81172-b84e-48ed-a6ac-538e374abb7b'} is completed#033[00m Nov 28 05:06:45 localhost nova_compute[279673]: 2025-11-28 10:06:45.828 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:45 localhost dnsmasq[325876]: exiting on receipt of SIGTERM Nov 28 05:06:45 localhost podman[325908]: 2025-11-28 10:06:45.920346469 +0000 UTC m=+0.055930545 container kill 9219fadb3f566625ebf598f73b4aa760b705fa2fb7c258e7460938a933ab8232 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:06:45 localhost systemd[1]: libpod-9219fadb3f566625ebf598f73b4aa760b705fa2fb7c258e7460938a933ab8232.scope: Deactivated successfully. Nov 28 05:06:46 localhost podman[325922]: 2025-11-28 10:06:46.001268372 +0000 UTC m=+0.065952573 container died 9219fadb3f566625ebf598f73b4aa760b705fa2fb7c258e7460938a933ab8232 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 28 05:06:46 localhost nova_compute[279673]: 2025-11-28 10:06:46.006 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:46 localhost podman[325922]: 2025-11-28 10:06:46.031394511 +0000 UTC m=+0.096078682 container cleanup 9219fadb3f566625ebf598f73b4aa760b705fa2fb7c258e7460938a933ab8232 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Nov 28 05:06:46 localhost systemd[1]: libpod-conmon-9219fadb3f566625ebf598f73b4aa760b705fa2fb7c258e7460938a933ab8232.scope: Deactivated successfully. 
Nov 28 05:06:46 localhost systemd[1]: var-lib-containers-storage-overlay-1d6ced94728650500b10b276b0c43bf2e09a46765f20bb4a704944ec73bf2bc3-merged.mount: Deactivated successfully. Nov 28 05:06:46 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9219fadb3f566625ebf598f73b4aa760b705fa2fb7c258e7460938a933ab8232-userdata-shm.mount: Deactivated successfully. Nov 28 05:06:46 localhost podman[325924]: 2025-11-28 10:06:46.088270453 +0000 UTC m=+0.143105480 container remove 9219fadb3f566625ebf598f73b4aa760b705fa2fb7c258e7460938a933ab8232 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c2ece010-6b32-45eb-a0e7-54a94c6d37c8, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 28 05:06:46 localhost ovn_controller[152322]: 2025-11-28T10:06:46Z|00424|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:06:46 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:46.120 2 INFO neutron.agent.securitygroups_rpc [None req-33288b8b-e565-4871-a7ed-f7e6a715772e 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']#033[00m Nov 28 05:06:46 localhost nova_compute[279673]: 2025-11-28 10:06:46.133 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:46 localhost kernel: device tapac54af53-f9 left promiscuous mode Nov 28 05:06:46 localhost nova_compute[279673]: 2025-11-28 10:06:46.159 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:46 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:46.204 261084 INFO neutron.agent.dhcp.agent [None req-5af3994c-f40c-432a-8f54-67748103df35 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:06:46 localhost systemd[1]: run-netns-qdhcp\x2dc2ece010\x2d6b32\x2d45eb\x2da0e7\x2d54a94c6d37c8.mount: Deactivated successfully. Nov 28 05:06:46 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:46.205 261084 INFO neutron.agent.dhcp.agent [None req-5af3994c-f40c-432a-8f54-67748103df35 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:06:46 localhost podman[325970]: Nov 28 05:06:46 localhost podman[325970]: 2025-11-28 10:06:46.302477393 +0000 UTC m=+0.113779016 container create 9d14b9c8e4d8ad9f1c5059fca12b84abd9691c0c0dd95ed7d45a3dafab846901 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-49f178c5-0cae-4b0e-9bb3-8615842f2e56, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:06:46 localhost systemd[1]: Started libpod-conmon-9d14b9c8e4d8ad9f1c5059fca12b84abd9691c0c0dd95ed7d45a3dafab846901.scope. Nov 28 05:06:46 localhost systemd[1]: Started libcrun container. 
Nov 28 05:06:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9bc781df874fccff849953263ff4783d37195584bd3533cf543244d5f9159b50/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:06:46 localhost podman[325970]: 2025-11-28 10:06:46.25985244 +0000 UTC m=+0.071154093 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:06:46 localhost podman[325970]: 2025-11-28 10:06:46.36825899 +0000 UTC m=+0.179560613 container init 9d14b9c8e4d8ad9f1c5059fca12b84abd9691c0c0dd95ed7d45a3dafab846901 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-49f178c5-0cae-4b0e-9bb3-8615842f2e56, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true) Nov 28 05:06:46 localhost podman[325970]: 2025-11-28 10:06:46.377869226 +0000 UTC m=+0.189170849 container start 9d14b9c8e4d8ad9f1c5059fca12b84abd9691c0c0dd95ed7d45a3dafab846901 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-49f178c5-0cae-4b0e-9bb3-8615842f2e56, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125) Nov 28 05:06:46 localhost dnsmasq[325988]: started, version 2.85 cachesize 150 Nov 28 05:06:46 localhost dnsmasq[325988]: DNS service limited to local subnets Nov 28 05:06:46 localhost dnsmasq[325988]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:06:46 localhost dnsmasq[325988]: warning: no upstream servers configured Nov 28 05:06:46 localhost dnsmasq-dhcp[325988]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 28 05:06:46 localhost dnsmasq[325988]: read /var/lib/neutron/dhcp/49f178c5-0cae-4b0e-9bb3-8615842f2e56/addn_hosts - 0 addresses Nov 28 05:06:46 localhost dnsmasq-dhcp[325988]: read /var/lib/neutron/dhcp/49f178c5-0cae-4b0e-9bb3-8615842f2e56/host Nov 28 05:06:46 localhost dnsmasq-dhcp[325988]: read /var/lib/neutron/dhcp/49f178c5-0cae-4b0e-9bb3-8615842f2e56/opts Nov 28 05:06:46 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:46.453 261084 INFO neutron.agent.dhcp.agent [None req-f9f16e8e-da79-4140-89ae-8b2b2e290d57 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:06:44Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=62795658-8866-4a6e-9294-b579a034da47, ip_allocation=immediate, mac_address=fa:16:3e:0b:8c:a9, name=tempest-PortsTestJSON-1966847573, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:06:42Z, description=, dns_domain=, id=49f178c5-0cae-4b0e-9bb3-8615842f2e56, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-1662837750, port_security_enabled=True, project_id=5e7a07c97c664076bc825e05137c574c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=8809, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2398, status=ACTIVE, subnets=['f761224f-305a-4c1b-8715-9d97740c76a2'], tags=[], tenant_id=5e7a07c97c664076bc825e05137c574c, 
updated_at=2025-11-28T10:06:43Z, vlan_transparent=None, network_id=49f178c5-0cae-4b0e-9bb3-8615842f2e56, port_security_enabled=True, project_id=5e7a07c97c664076bc825e05137c574c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b'], standard_attr_id=2410, status=DOWN, tags=[], tenant_id=5e7a07c97c664076bc825e05137c574c, updated_at=2025-11-28T10:06:44Z on network 49f178c5-0cae-4b0e-9bb3-8615842f2e56#033[00m Nov 28 05:06:46 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:46.573 261084 INFO neutron.agent.dhcp.agent [None req-de1b2128-68a6-4865-b222-eacdfefbcf63 - - - - - -] DHCP configuration for ports {'f5d400af-a230-4bee-9fd3-04e139732c3d'} is completed#033[00m Nov 28 05:06:46 localhost dnsmasq[325988]: read /var/lib/neutron/dhcp/49f178c5-0cae-4b0e-9bb3-8615842f2e56/addn_hosts - 1 addresses Nov 28 05:06:46 localhost podman[326006]: 2025-11-28 10:06:46.717435669 +0000 UTC m=+0.066075437 container kill 9d14b9c8e4d8ad9f1c5059fca12b84abd9691c0c0dd95ed7d45a3dafab846901 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-49f178c5-0cae-4b0e-9bb3-8615842f2e56, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:06:46 localhost dnsmasq-dhcp[325988]: read /var/lib/neutron/dhcp/49f178c5-0cae-4b0e-9bb3-8615842f2e56/host Nov 28 05:06:46 localhost dnsmasq-dhcp[325988]: read /var/lib/neutron/dhcp/49f178c5-0cae-4b0e-9bb3-8615842f2e56/opts Nov 28 05:06:46 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:46.764 2 INFO neutron.agent.securitygroups_rpc [None req-92a83df5-6e17-4f34-81bc-c1c1907c0872 e32848e36ae94f66ae634ff4d7716d6f 
8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']#033[00m Nov 28 05:06:46 localhost nova_compute[279673]: 2025-11-28 10:06:46.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:06:46 localhost nova_compute[279673]: 2025-11-28 10:06:46.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:06:46 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:46.908 261084 INFO neutron.agent.dhcp.agent [None req-c0c9a2dd-6aba-4de0-8de3-956b4c569218 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:06:45Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=8e769ce7-7e42-447c-9aaf-873d0ff4a173, ip_allocation=immediate, mac_address=fa:16:3e:7a:92:f9, name=tempest-PortsTestJSON-895012161, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:06:42Z, description=, dns_domain=, id=49f178c5-0cae-4b0e-9bb3-8615842f2e56, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-1662837750, port_security_enabled=True, project_id=5e7a07c97c664076bc825e05137c574c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=8809, qos_policy_id=None, revision_number=2, router:external=False, shared=False, 
standard_attr_id=2398, status=ACTIVE, subnets=['f761224f-305a-4c1b-8715-9d97740c76a2'], tags=[], tenant_id=5e7a07c97c664076bc825e05137c574c, updated_at=2025-11-28T10:06:43Z, vlan_transparent=None, network_id=49f178c5-0cae-4b0e-9bb3-8615842f2e56, port_security_enabled=True, project_id=5e7a07c97c664076bc825e05137c574c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b'], standard_attr_id=2416, status=DOWN, tags=[], tenant_id=5e7a07c97c664076bc825e05137c574c, updated_at=2025-11-28T10:06:45Z on network 49f178c5-0cae-4b0e-9bb3-8615842f2e56#033[00m Nov 28 05:06:46 localhost nova_compute[279673]: 2025-11-28 10:06:46.923 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:06:46 localhost nova_compute[279673]: 2025-11-28 10:06:46.925 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:06:46 localhost nova_compute[279673]: 2025-11-28 10:06:46.925 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:06:46 localhost nova_compute[279673]: 2025-11-28 10:06:46.926 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for 
np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 05:06:46 localhost nova_compute[279673]: 2025-11-28 10:06:46.927 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:06:46 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:46.991 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:06:46 localhost nova_compute[279673]: 2025-11-28 10:06:46.992 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:46 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:46.994 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 28 05:06:47 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:47.097 261084 INFO neutron.agent.dhcp.agent [None req-38cb43d6-ffed-431f-a006-834980804f11 - - - - - -] DHCP configuration for ports {'62795658-8866-4a6e-9294-b579a034da47'} is completed#033[00m Nov 28 05:06:47 localhost 
neutron_sriov_agent[254147]: 2025-11-28 10:06:47.162 2 INFO neutron.agent.securitygroups_rpc [None req-09174bbe-f95e-4ff7-9cd4-901486ff4f01 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']#033[00m Nov 28 05:06:47 localhost dnsmasq[325988]: read /var/lib/neutron/dhcp/49f178c5-0cae-4b0e-9bb3-8615842f2e56/addn_hosts - 2 addresses Nov 28 05:06:47 localhost podman[326062]: 2025-11-28 10:06:47.232299333 +0000 UTC m=+0.070970057 container kill 9d14b9c8e4d8ad9f1c5059fca12b84abd9691c0c0dd95ed7d45a3dafab846901 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-49f178c5-0cae-4b0e-9bb3-8615842f2e56, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:06:47 localhost dnsmasq-dhcp[325988]: read /var/lib/neutron/dhcp/49f178c5-0cae-4b0e-9bb3-8615842f2e56/host Nov 28 05:06:47 localhost dnsmasq-dhcp[325988]: read /var/lib/neutron/dhcp/49f178c5-0cae-4b0e-9bb3-8615842f2e56/opts Nov 28 05:06:47 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:06:47 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/2354762036' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:06:47 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:47.374 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:06:47 localhost nova_compute[279673]: 2025-11-28 10:06:47.378 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:06:47 localhost nova_compute[279673]: 2025-11-28 10:06:47.474 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 05:06:47 localhost nova_compute[279673]: 2025-11-28 10:06:47.475 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 05:06:47 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:47.550 261084 INFO neutron.agent.dhcp.agent [None req-a56efb65-c118-4c96-af6b-d3e69a4c11a4 - - - - - -] DHCP configuration for ports {'8e769ce7-7e42-447c-9aaf-873d0ff4a173'} is completed#033[00m Nov 28 05:06:47 localhost dnsmasq[325988]: read /var/lib/neutron/dhcp/49f178c5-0cae-4b0e-9bb3-8615842f2e56/addn_hosts - 1 addresses Nov 28 05:06:47 localhost podman[326103]: 2025-11-28 10:06:47.715159582 +0000 UTC m=+0.075763006 container kill 9d14b9c8e4d8ad9f1c5059fca12b84abd9691c0c0dd95ed7d45a3dafab846901 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-49f178c5-0cae-4b0e-9bb3-8615842f2e56, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true) Nov 28 05:06:47 localhost dnsmasq-dhcp[325988]: read /var/lib/neutron/dhcp/49f178c5-0cae-4b0e-9bb3-8615842f2e56/host Nov 28 05:06:47 localhost dnsmasq-dhcp[325988]: read /var/lib/neutron/dhcp/49f178c5-0cae-4b0e-9bb3-8615842f2e56/opts Nov 28 05:06:47 localhost nova_compute[279673]: 2025-11-28 10:06:47.721 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 05:06:47 localhost nova_compute[279673]: 2025-11-28 10:06:47.723 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11091MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": 
"1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 05:06:47 localhost nova_compute[279673]: 2025-11-28 10:06:47.723 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:06:47 localhost nova_compute[279673]: 2025-11-28 10:06:47.723 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m 
Nov 28 05:06:47 localhost nova_compute[279673]: 2025-11-28 10:06:47.814 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 05:06:47 localhost nova_compute[279673]: 2025-11-28 10:06:47.815 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 05:06:47 localhost nova_compute[279673]: 2025-11-28 10:06:47.815 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 05:06:47 localhost nova_compute[279673]: 2025-11-28 10:06:47.867 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 05:06:48 localhost openstack_network_exporter[240658]: ERROR 10:06:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 05:06:48 localhost openstack_network_exporter[240658]: ERROR 10:06:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 05:06:48 localhost openstack_network_exporter[240658]: ERROR 10:06:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 05:06:48 localhost openstack_network_exporter[240658]: ERROR 10:06:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 05:06:48 localhost openstack_network_exporter[240658]:
Nov 28 05:06:48 localhost openstack_network_exporter[240658]: ERROR 10:06:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 05:06:48 localhost openstack_network_exporter[240658]:
Nov 28 05:06:48 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:48.341 2 INFO neutron.agent.securitygroups_rpc [None req-5e549f88-7906-4d9c-9524-bb7ac558174f 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 05:06:48 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 05:06:48 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1057030524' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 05:06:48 localhost nova_compute[279673]: 2025-11-28 10:06:48.386 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.520s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 05:06:48 localhost nova_compute[279673]: 2025-11-28 10:06:48.393 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 05:06:48 localhost nova_compute[279673]: 2025-11-28 10:06:48.420 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 05:06:48 localhost nova_compute[279673]: 2025-11-28 10:06:48.421 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 05:06:48 localhost nova_compute[279673]: 2025-11-28 10:06:48.421 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.698s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 05:06:48 localhost dnsmasq[325988]: read /var/lib/neutron/dhcp/49f178c5-0cae-4b0e-9bb3-8615842f2e56/addn_hosts - 0 addresses
Nov 28 05:06:48 localhost podman[326164]: 2025-11-28 10:06:48.573171599 +0000 UTC m=+0.050814817 container kill 9d14b9c8e4d8ad9f1c5059fca12b84abd9691c0c0dd95ed7d45a3dafab846901 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-49f178c5-0cae-4b0e-9bb3-8615842f2e56, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 05:06:48 localhost dnsmasq-dhcp[325988]: read /var/lib/neutron/dhcp/49f178c5-0cae-4b0e-9bb3-8615842f2e56/host
Nov 28 05:06:48 localhost dnsmasq-dhcp[325988]: read /var/lib/neutron/dhcp/49f178c5-0cae-4b0e-9bb3-8615842f2e56/opts
Nov 28 05:06:49 localhost nova_compute[279673]: 2025-11-28 10:06:49.422 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 05:06:49 localhost nova_compute[279673]: 2025-11-28 10:06:49.423 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 05:06:49 localhost nova_compute[279673]: 2025-11-28 10:06:49.423 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 05:06:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 05:06:49 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:49.861 2 INFO neutron.agent.securitygroups_rpc [None req-be3aec7b-abf4-4211-97ca-2df83bf1c365 87cf17c48bab44279b2e7eca9e0882a2 d3c0d1ce8d854a7b9ffc953e88cd2c44 - - default default] Security group rule updated ['c52603b5-5f47-4123-b8fe-cc9f0a56d914']
Nov 28 05:06:49 localhost podman[326197]: 2025-11-28 10:06:49.874806225 +0000 UTC m=+0.101848809 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, version=9.6, io.openshift.tags=minimal rhel9, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64)
Nov 28 05:06:49 localhost podman[326197]: 2025-11-28 10:06:49.929112879 +0000 UTC m=+0.156155493 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, version=9.6, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, name=ubi9-minimal)
Nov 28 05:06:49 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 05:06:49 localhost dnsmasq[322901]: exiting on receipt of SIGTERM
Nov 28 05:06:49 localhost podman[326214]: 2025-11-28 10:06:49.96323431 +0000 UTC m=+0.119742381 container kill d45c3118e0cb447a1f990580ae47ce23aabce3b8035e5e02e901e9a0f3bc1683 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fa28040d-639a-454c-9515-60af86f8624b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 05:06:49 localhost systemd[1]: libpod-d45c3118e0cb447a1f990580ae47ce23aabce3b8035e5e02e901e9a0f3bc1683.scope: Deactivated successfully.
Nov 28 05:06:49 localhost ovn_controller[152322]: 2025-11-28T10:06:49Z|00425|binding|INFO|Removing iface tapbbebc9e7-db ovn-installed in OVS
Nov 28 05:06:49 localhost ovn_controller[152322]: 2025-11-28T10:06:49Z|00426|binding|INFO|Removing lport bbebc9e7-dbe0-47b5-b390-b5bcd4b40cc9 ovn-installed in OVS
Nov 28 05:06:49 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:49.981 158130 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port c9b05cd7-5e0a-4448-ad54-e279b37c3a36 with type ""
Nov 28 05:06:49 localhost nova_compute[279673]: 2025-11-28 10:06:49.982 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:06:49 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:49.984 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.255.242/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-fa28040d-639a-454c-9515-60af86f8624b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fa28040d-639a-454c-9515-60af86f8624b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8499932c523b4e26933fff84403e296e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dcca6890-1675-46ad-9260-7f267479c535, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=bbebc9e7-dbe0-47b5-b390-b5bcd4b40cc9) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 05:06:49 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:49.990 158130 INFO neutron.agent.ovn.metadata.agent [-] Port bbebc9e7-dbe0-47b5-b390-b5bcd4b40cc9 in datapath fa28040d-639a-454c-9515-60af86f8624b unbound from our chassis
Nov 28 05:06:49 localhost nova_compute[279673]: 2025-11-28 10:06:49.991 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:06:49 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:49.993 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fa28040d-639a-454c-9515-60af86f8624b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 05:06:49 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:49.994 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[10d47530-ba3e-4435-bf5a-f26f56a95320]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 05:06:50 localhost podman[326248]: 2025-11-28 10:06:50.033635609 +0000 UTC m=+0.056131480 container died d45c3118e0cb447a1f990580ae47ce23aabce3b8035e5e02e901e9a0f3bc1683 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fa28040d-639a-454c-9515-60af86f8624b, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 05:06:50 localhost podman[326248]: 2025-11-28 10:06:50.073809688 +0000 UTC m=+0.096305519 container cleanup d45c3118e0cb447a1f990580ae47ce23aabce3b8035e5e02e901e9a0f3bc1683 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fa28040d-639a-454c-9515-60af86f8624b, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 05:06:50 localhost systemd[1]: libpod-conmon-d45c3118e0cb447a1f990580ae47ce23aabce3b8035e5e02e901e9a0f3bc1683.scope: Deactivated successfully.
Nov 28 05:06:50 localhost nova_compute[279673]: 2025-11-28 10:06:50.084 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 05:06:50 localhost nova_compute[279673]: 2025-11-28 10:06:50.085 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 05:06:50 localhost nova_compute[279673]: 2025-11-28 10:06:50.085 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 05:06:50 localhost nova_compute[279673]: 2025-11-28 10:06:50.086 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 05:06:50 localhost podman[326250]: 2025-11-28 10:06:50.104557524 +0000 UTC m=+0.106753070 container remove d45c3118e0cb447a1f990580ae47ce23aabce3b8035e5e02e901e9a0f3bc1683 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fa28040d-639a-454c-9515-60af86f8624b, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 05:06:50 localhost nova_compute[279673]: 2025-11-28 10:06:50.119 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:06:50 localhost kernel: device tapbbebc9e7-db left promiscuous mode
Nov 28 05:06:50 localhost nova_compute[279673]: 2025-11-28 10:06:50.134 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:06:50 localhost dnsmasq[325988]: exiting on receipt of SIGTERM
Nov 28 05:06:50 localhost podman[326272]: 2025-11-28 10:06:50.142609027 +0000 UTC m=+0.115660035 container kill 9d14b9c8e4d8ad9f1c5059fca12b84abd9691c0c0dd95ed7d45a3dafab846901 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-49f178c5-0cae-4b0e-9bb3-8615842f2e56, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 05:06:50 localhost systemd[1]: libpod-9d14b9c8e4d8ad9f1c5059fca12b84abd9691c0c0dd95ed7d45a3dafab846901.scope: Deactivated successfully.
Nov 28 05:06:50 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:50.150 261084 INFO neutron.agent.dhcp.agent [None req-8bd5054b-d03e-4c05-8799-43d2ea5c917f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 05:06:50 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:50.193 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 05:06:50 localhost podman[326298]: 2025-11-28 10:06:50.206101594 +0000 UTC m=+0.042165321 container died 9d14b9c8e4d8ad9f1c5059fca12b84abd9691c0c0dd95ed7d45a3dafab846901 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-49f178c5-0cae-4b0e-9bb3-8615842f2e56, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 05:06:50 localhost podman[326298]: 2025-11-28 10:06:50.24232091 +0000 UTC m=+0.078384617 container remove 9d14b9c8e4d8ad9f1c5059fca12b84abd9691c0c0dd95ed7d45a3dafab846901 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-49f178c5-0cae-4b0e-9bb3-8615842f2e56, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 05:06:50 localhost ovn_controller[152322]: 2025-11-28T10:06:50Z|00427|binding|INFO|Releasing lport 2dcd02a6-6ee7-4a78-a00f-13dc1294595f from this chassis (sb_readonly=0)
Nov 28 05:06:50 localhost nova_compute[279673]: 2025-11-28 10:06:50.258 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:06:50 localhost kernel: device tap2dcd02a6-6e left promiscuous mode
Nov 28 05:06:50 localhost ovn_controller[152322]: 2025-11-28T10:06:50Z|00428|binding|INFO|Setting lport 2dcd02a6-6ee7-4a78-a00f-13dc1294595f down in Southbound
Nov 28 05:06:50 localhost systemd[1]: libpod-conmon-9d14b9c8e4d8ad9f1c5059fca12b84abd9691c0c0dd95ed7d45a3dafab846901.scope: Deactivated successfully.
Nov 28 05:06:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:50.272 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-49f178c5-0cae-4b0e-9bb3-8615842f2e56', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-49f178c5-0cae-4b0e-9bb3-8615842f2e56', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ac48318a-2b44-44d6-ac83-b660fc51fa7f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=2dcd02a6-6ee7-4a78-a00f-13dc1294595f) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 05:06:50 localhost nova_compute[279673]: 2025-11-28 10:06:50.274 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:06:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:50.275 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 2dcd02a6-6ee7-4a78-a00f-13dc1294595f in datapath 49f178c5-0cae-4b0e-9bb3-8615842f2e56 unbound from our chassis
Nov 28 05:06:50 localhost nova_compute[279673]: 2025-11-28 10:06:50.278 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:06:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:50.279 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 49f178c5-0cae-4b0e-9bb3-8615842f2e56, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 05:06:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:50.280 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[85e3c570-9f3c-4a60-b91d-44921c14effb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 05:06:50 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:50.300 261084 INFO neutron.agent.dhcp.agent [None req-13f1892a-bb63-4f6b-97c5-f89954cfd0c0 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 05:06:50 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:50.313 2 INFO neutron.agent.securitygroups_rpc [None req-500702cc-90b6-4aaa-a265-7d7caae35722 87cf17c48bab44279b2e7eca9e0882a2 d3c0d1ce8d854a7b9ffc953e88cd2c44 - - default default] Security group rule updated ['c52603b5-5f47-4123-b8fe-cc9f0a56d914']
Nov 28 05:06:50 localhost ovn_controller[152322]: 2025-11-28T10:06:50Z|00429|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 05:06:50 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 05:06:50 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e186 do_prune osdmap full prune enabled
Nov 28 05:06:50 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e187 e187: 6 total, 6 up, 6 in
Nov 28 05:06:50 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e187: 6 total, 6 up, 6 in
Nov 28 05:06:50 localhost nova_compute[279673]: 2025-11-28 10:06:50.495 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:06:50 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:50.692 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 05:06:50 localhost nova_compute[279673]: 2025-11-28 10:06:50.831 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:06:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:50.845 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 05:06:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:50.846 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 05:06:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:50.846 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 05:06:50 localhost systemd[1]: var-lib-containers-storage-overlay-9bc781df874fccff849953263ff4783d37195584bd3533cf543244d5f9159b50-merged.mount: Deactivated successfully.
Nov 28 05:06:50 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9d14b9c8e4d8ad9f1c5059fca12b84abd9691c0c0dd95ed7d45a3dafab846901-userdata-shm.mount: Deactivated successfully.
Nov 28 05:06:50 localhost systemd[1]: run-netns-qdhcp\x2d49f178c5\x2d0cae\x2d4b0e\x2d9bb3\x2d8615842f2e56.mount: Deactivated successfully.
Nov 28 05:06:50 localhost systemd[1]: var-lib-containers-storage-overlay-14c3eefe9cc85a0964db7e342e45ff322ab308fd13a7f19aa1847f356aa5bafb-merged.mount: Deactivated successfully.
Nov 28 05:06:50 localhost nova_compute[279673]: 2025-11-28 10:06:50.869 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 05:06:50 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d45c3118e0cb447a1f990580ae47ce23aabce3b8035e5e02e901e9a0f3bc1683-userdata-shm.mount: Deactivated successfully.
Nov 28 05:06:50 localhost systemd[1]: run-netns-qdhcp\x2dfa28040d\x2d639a\x2d454c\x2d9515\x2d60af86f8624b.mount: Deactivated successfully.
Nov 28 05:06:50 localhost nova_compute[279673]: 2025-11-28 10:06:50.892 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 05:06:50 localhost nova_compute[279673]: 2025-11-28 10:06:50.892 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 05:06:50 localhost nova_compute[279673]: 2025-11-28 10:06:50.893 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 05:06:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:50.997 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 05:06:51 localhost nova_compute[279673]: 2025-11-28 10:06:51.008 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:06:51 localhost dnsmasq[323161]: exiting on receipt of SIGTERM
Nov 28 05:06:51 localhost podman[326341]: 2025-11-28 10:06:51.171775738 +0000 UTC m=+0.067274414 container kill c2f76ad321100bf3dd865ae468acc4ac2ff5793c21d4c81e01c36259a18458eb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f532ea4-a0de-4113-8993-33f982144ec8, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 28 05:06:51 localhost systemd[1]: libpod-c2f76ad321100bf3dd865ae468acc4ac2ff5793c21d4c81e01c36259a18458eb.scope: Deactivated successfully.
Nov 28 05:06:51 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e46: np0005538515.yfkzhl(active, since 8m), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 05:06:51 localhost podman[326355]: 2025-11-28 10:06:51.265438024 +0000 UTC m=+0.077949913 container died c2f76ad321100bf3dd865ae468acc4ac2ff5793c21d4c81e01c36259a18458eb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f532ea4-a0de-4113-8993-33f982144ec8, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 05:06:51 localhost podman[326355]: 2025-11-28 10:06:51.309352008 +0000 UTC m=+0.121863857 container cleanup c2f76ad321100bf3dd865ae468acc4ac2ff5793c21d4c81e01c36259a18458eb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f532ea4-a0de-4113-8993-33f982144ec8, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Nov 28 05:06:51 localhost systemd[1]: libpod-conmon-c2f76ad321100bf3dd865ae468acc4ac2ff5793c21d4c81e01c36259a18458eb.scope: Deactivated successfully.
Nov 28 05:06:51 localhost podman[326357]: 2025-11-28 10:06:51.355970674 +0000 UTC m=+0.161743795 container remove c2f76ad321100bf3dd865ae468acc4ac2ff5793c21d4c81e01c36259a18458eb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3f532ea4-a0de-4113-8993-33f982144ec8, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 05:06:51 localhost nova_compute[279673]: 2025-11-28 10:06:51.369 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:06:51 localhost ovn_controller[152322]: 2025-11-28T10:06:51Z|00430|binding|INFO|Releasing lport 87ef7272-14f7-4162-a8a9-b13090f8924f from this chassis (sb_readonly=0)
Nov 28 05:06:51 localhost ovn_controller[152322]: 2025-11-28T10:06:51Z|00431|binding|INFO|Setting lport 87ef7272-14f7-4162-a8a9-b13090f8924f down in Southbound
Nov 28 05:06:51 localhost kernel: device tap87ef7272-14 left promiscuous mode
Nov 28 05:06:51 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:51.378 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-3f532ea4-a0de-4113-8993-33f982144ec8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f532ea4-a0de-4113-8993-33f982144ec8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b2dec4622fe844c592a13e779612beaa', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4a58c096-9217-4d0b-a64c-715683dae905, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=87ef7272-14f7-4162-a8a9-b13090f8924f) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 05:06:51 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:51.380 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 87ef7272-14f7-4162-a8a9-b13090f8924f in datapath 3f532ea4-a0de-4113-8993-33f982144ec8 unbound from our chassis
Nov 28 05:06:51 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:51.381 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 3f532ea4-a0de-4113-8993-33f982144ec8 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 05:06:51 localhost ovn_metadata_agent[158125]: 2025-11-28 10:06:51.383 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[817fdb59-38d3-4d65-9037-2bb02ecf8aa0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 05:06:51 localhost nova_compute[279673]: 2025-11-28 10:06:51.392 279685
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:51 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:51.425 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:06:51 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:51.726 2 INFO neutron.agent.securitygroups_rpc [None req-7220764d-588c-486e-8a18-42779bef93d4 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']#033[00m Nov 28 05:06:51 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:06:51.759 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:06:51 localhost systemd[1]: tmp-crun.FUdifc.mount: Deactivated successfully. Nov 28 05:06:51 localhost systemd[1]: var-lib-containers-storage-overlay-0a09506b8e900e354d47418961d4d79668634f1edb0c1cb9bb0d739ec90a1c16-merged.mount: Deactivated successfully. Nov 28 05:06:51 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c2f76ad321100bf3dd865ae468acc4ac2ff5793c21d4c81e01c36259a18458eb-userdata-shm.mount: Deactivated successfully. Nov 28 05:06:51 localhost systemd[1]: run-netns-qdhcp\x2d3f532ea4\x2da0de\x2d4113\x2d8993\x2d33f982144ec8.mount: Deactivated successfully. Nov 28 05:06:52 localhost ovn_controller[152322]: 2025-11-28T10:06:52Z|00432|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:06:52 localhost nova_compute[279673]: 2025-11-28 10:06:52.119 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. 
Nov 28 05:06:54 localhost podman[326383]: 2025-11-28 10:06:54.855359469 +0000 UTC m=+0.087312972 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 05:06:54 localhost podman[326383]: 2025-11-28 10:06:54.867570286 +0000 UTC m=+0.099523829 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 28 05:06:54 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 05:06:55 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:06:55 localhost nova_compute[279673]: 2025-11-28 10:06:55.833 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:55 localhost neutron_sriov_agent[254147]: 2025-11-28 10:06:55.873 2 INFO neutron.agent.securitygroups_rpc [None req-3852d4d4-6237-4177-8a94-f6db1e58a4b7 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']#033[00m Nov 28 05:06:56 localhost nova_compute[279673]: 2025-11-28 10:06:56.014 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:06:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. 
Nov 28 05:06:56 localhost podman[326406]: 2025-11-28 10:06:56.842576201 +0000 UTC m=+0.082372720 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:06:56 localhost podman[326406]: 2025-11-28 10:06:56.876511356 +0000 UTC 
m=+0.116307846 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2) Nov 28 05:06:56 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 05:06:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 05:06:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 05:06:58 localhost podman[326426]: 2025-11-28 10:06:58.907005181 +0000 UTC m=+0.142729339 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible) Nov 28 05:06:58 localhost podman[326426]: 2025-11-28 10:06:58.919839426 +0000 UTC m=+0.155563594 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
tcib_managed=true) Nov 28 05:06:58 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. Nov 28 05:06:58 localhost systemd[1]: tmp-crun.870Jpy.mount: Deactivated successfully. Nov 28 05:06:58 localhost podman[326427]: 2025-11-28 10:06:58.974993926 +0000 UTC m=+0.196567088 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:06:59 localhost podman[326427]: 2025-11-28 10:06:59.015635017 +0000 UTC m=+0.237208169 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller) Nov 28 05:06:59 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 05:07:00 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:07:00 localhost nova_compute[279673]: 2025-11-28 10:07:00.836 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:01 localhost nova_compute[279673]: 2025-11-28 10:07:01.016 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:02 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e47: np0005538515.yfkzhl(active, since 8m), standbys: np0005538513.dsfdlx, np0005538514.djozup Nov 28 05:07:03 localhost neutron_sriov_agent[254147]: 2025-11-28 10:07:03.080 2 INFO neutron.agent.securitygroups_rpc [req-e18f2dde-6bdd-4008-97a4-be84187f4807 req-3c6747af-afe3-40df-86cf-89416982a794 87cf17c48bab44279b2e7eca9e0882a2 d3c0d1ce8d854a7b9ffc953e88cd2c44 - - default default] Security group member updated ['c52603b5-5f47-4123-b8fe-cc9f0a56d914']#033[00m Nov 28 05:07:04 localhost neutron_sriov_agent[254147]: 2025-11-28 10:07:04.539 2 INFO neutron.agent.securitygroups_rpc [None req-ae7d6dfb-c119-45e0-bfec-235340ad22c9 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['19d31bf3-ea7b-49ec-820d-ba3fe5752e88']#033[00m Nov 28 05:07:05 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:07:05 localhost nova_compute[279673]: 2025-11-28 10:07:05.839 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:06 localhost nova_compute[279673]: 2025-11-28 10:07:06.018 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog 
[-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:09 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e187 do_prune osdmap full prune enabled Nov 28 05:07:09 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e188 e188: 6 total, 6 up, 6 in Nov 28 05:07:09 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e188: 6 total, 6 up, 6 in Nov 28 05:07:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 05:07:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 05:07:09 localhost podman[326469]: 2025-11-28 10:07:09.855937956 +0000 UTC m=+0.090966634 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': 
'/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 28 05:07:09 localhost podman[326469]: 2025-11-28 10:07:09.863920402 +0000 UTC m=+0.098949060 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 28 05:07:09 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 05:07:09 localhost podman[326470]: 2025-11-28 10:07:09.916379338 +0000 UTC m=+0.147838516 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:07:09 localhost podman[326470]: 2025-11-28 10:07:09.931594127 +0000 UTC m=+0.163053285 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible) Nov 28 05:07:09 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 05:07:10 localhost podman[238687]: time="2025-11-28T10:07:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:07:10 localhost podman[238687]: @ - - [28/Nov/2025:10:07:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157512 "" "Go-http-client/1.1" Nov 28 05:07:10 localhost podman[238687]: @ - - [28/Nov/2025:10:07:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19749 "" "Go-http-client/1.1" Nov 28 05:07:10 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e188 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:07:10 localhost neutron_sriov_agent[254147]: 2025-11-28 10:07:10.635 2 INFO neutron.agent.securitygroups_rpc [None req-570a0175-8080-4b40-9e80-d9942b63779e e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['c81397c2-33ee-481d-8257-b39c2b0c331e', '19d31bf3-ea7b-49ec-820d-ba3fe5752e88']#033[00m Nov 28 05:07:10 localhost neutron_sriov_agent[254147]: 2025-11-28 10:07:10.639 2 INFO neutron.agent.securitygroups_rpc [None req-22769ed5-ed71-4ef8-ab49-99801270d0d3 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']#033[00m Nov 28 05:07:10 localhost nova_compute[279673]: 2025-11-28 10:07:10.855 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:11 localhost nova_compute[279673]: 2025-11-28 10:07:11.021 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:11 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e188 do_prune osdmap full prune enabled 
Nov 28 05:07:11 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e189 e189: 6 total, 6 up, 6 in Nov 28 05:07:11 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e189: 6 total, 6 up, 6 in Nov 28 05:07:14 localhost neutron_sriov_agent[254147]: 2025-11-28 10:07:14.560 2 INFO neutron.agent.securitygroups_rpc [None req-94433583-aa1a-4467-b851-fbd6872bea34 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['c81397c2-33ee-481d-8257-b39c2b0c331e']#033[00m Nov 28 05:07:15 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e189 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:07:15 localhost nova_compute[279673]: 2025-11-28 10:07:15.857 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:16 localhost nova_compute[279673]: 2025-11-28 10:07:16.024 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:16 localhost dnsmasq[324234]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/addn_hosts - 0 addresses Nov 28 05:07:16 localhost dnsmasq-dhcp[324234]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/host Nov 28 05:07:16 localhost podman[326528]: 2025-11-28 10:07:16.419207218 +0000 UTC m=+0.061868377 container kill c34b0864521ca6e0485c3e746c68d7ef4b44888e5e0551f73bd31f34fa84e952 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 28 05:07:16 localhost dnsmasq-dhcp[324234]: read /var/lib/neutron/dhcp/4dc5b71e-287e-4ec6-b6b7-4d131e85d551/opts Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0. Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.435897) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52 Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324436435984, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 873, "num_deletes": 255, "total_data_size": 1217512, "memory_usage": 1241120, "flush_reason": "Manual Compaction"} Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324436445197, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 1203374, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 29805, "largest_seqno": 30677, "table_properties": {"data_size": 1199158, "index_size": 1879, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 10649, "raw_average_key_size": 21, "raw_value_size": 1190251, "raw_average_value_size": 2352, "num_data_blocks": 82, "num_entries": 506, "num_filter_entries": 506, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, 
"format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324396, "oldest_key_time": 1764324396, "file_creation_time": 1764324436, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}} Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 9341 microseconds, and 4115 cpu microseconds. Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.445249) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 1203374 bytes OK Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.445275) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.447466) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.447488) EVENT_LOG_v1 {"time_micros": 1764324436447481, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.447511) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 1213078, prev total WAL file size 1213078, number of live WAL files 2. Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.448417) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132303438' seq:72057594037927935, type:22 .. 
'7061786F73003132333030' seq:0, type:0; will stop at (end) Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(1175KB)], [51(17MB)] Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324436448476, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 19057268, "oldest_snapshot_seqno": -1} Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 12755 keys, 17845211 bytes, temperature: kUnknown Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324436563959, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 17845211, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17771070, "index_size": 41144, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31941, "raw_key_size": 342033, "raw_average_key_size": 26, "raw_value_size": 17552586, "raw_average_value_size": 1376, "num_data_blocks": 1556, "num_entries": 12755, "num_filter_entries": 12755, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; 
strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764324436, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}} Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.564331) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 17845211 bytes Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.566296) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 164.9 rd, 154.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 17.0 +0.0 blob) out(17.0 +0.0 blob), read-write-amplify(30.7) write-amplify(14.8) OK, records in: 13284, records dropped: 529 output_compression: NoCompression Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.566325) EVENT_LOG_v1 {"time_micros": 1764324436566313, "job": 30, "event": "compaction_finished", "compaction_time_micros": 115597, "compaction_time_cpu_micros": 51154, "output_level": 6, "num_output_files": 1, "total_output_size": 17845211, "num_input_records": 13284, "num_output_records": 12755, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005538513/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324436566678, "job": 30, "event": "table_file_deletion", "file_number": 53} Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324436569248, "job": 30, "event": "table_file_deletion", "file_number": 51} Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.448319) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.569533) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.569560) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.569564) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.569567) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:07:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:07:16.569571) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:07:17 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e189 do_prune osdmap full prune enabled Nov 28 05:07:17 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e190 e190: 6 
total, 6 up, 6 in Nov 28 05:07:17 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e190: 6 total, 6 up, 6 in Nov 28 05:07:18 localhost openstack_network_exporter[240658]: ERROR 10:07:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 05:07:18 localhost openstack_network_exporter[240658]: ERROR 10:07:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:07:18 localhost openstack_network_exporter[240658]: ERROR 10:07:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:07:18 localhost openstack_network_exporter[240658]: ERROR 10:07:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 05:07:18 localhost openstack_network_exporter[240658]: Nov 28 05:07:18 localhost openstack_network_exporter[240658]: ERROR 10:07:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 05:07:18 localhost openstack_network_exporter[240658]: Nov 28 05:07:18 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e190 do_prune osdmap full prune enabled Nov 28 05:07:18 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e191 e191: 6 total, 6 up, 6 in Nov 28 05:07:18 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e191: 6 total, 6 up, 6 in Nov 28 05:07:18 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:07:18.908 261084 INFO neutron.agent.linux.ip_lib [None req-bd01f321-4ae9-463f-bc5e-ff371036a5cc - - - - - -] Device tape7ad6507-8c cannot be used as it has no MAC address#033[00m Nov 28 05:07:18 localhost nova_compute[279673]: 2025-11-28 10:07:18.986 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:18 localhost kernel: device tape7ad6507-8c entered promiscuous mode Nov 28 05:07:18 localhost 
NetworkManager[5967]: [1764324438.9966] manager: (tape7ad6507-8c): new Generic device (/org/freedesktop/NetworkManager/Devices/70) Nov 28 05:07:19 localhost ovn_controller[152322]: 2025-11-28T10:07:18Z|00433|binding|INFO|Claiming lport e7ad6507-8cb9-4c54-9fde-23c7028d341d for this chassis. Nov 28 05:07:19 localhost ovn_controller[152322]: 2025-11-28T10:07:18Z|00434|binding|INFO|e7ad6507-8cb9-4c54-9fde-23c7028d341d: Claiming unknown Nov 28 05:07:19 localhost nova_compute[279673]: 2025-11-28 10:07:18.997 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:19 localhost systemd-udevd[326560]: Network interface NamePolicy= disabled on kernel command line. Nov 28 05:07:19 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:19.013 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe83:d5c4/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-e9912372-cec2-427d-b398-8b7ba1d00441', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e9912372-cec2-427d-b398-8b7ba1d00441', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ae10569a38284f298c961498da620c5f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5e17820c-609d-4556-ae57-25b66af88ca4, chassis=[], 
tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e7ad6507-8cb9-4c54-9fde-23c7028d341d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:07:19 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:19.015 158130 INFO neutron.agent.ovn.metadata.agent [-] Port e7ad6507-8cb9-4c54-9fde-23c7028d341d in datapath e9912372-cec2-427d-b398-8b7ba1d00441 bound to our chassis#033[00m Nov 28 05:07:19 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:19.018 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 9cf0f181-4a32-47e3-bbc4-a22358682295 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 28 05:07:19 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:19.019 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e9912372-cec2-427d-b398-8b7ba1d00441, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:07:19 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:19.019 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[d01d889d-4829-4555-bd10-f24fc8d9409c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:07:19 localhost journal[227875]: ethtool ioctl error on tape7ad6507-8c: No such device Nov 28 05:07:19 localhost nova_compute[279673]: 2025-11-28 10:07:19.032 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:19 localhost journal[227875]: ethtool ioctl error on tape7ad6507-8c: No such device Nov 28 05:07:19 localhost ovn_controller[152322]: 2025-11-28T10:07:19Z|00435|binding|INFO|Setting lport e7ad6507-8cb9-4c54-9fde-23c7028d341d ovn-installed in OVS Nov 28 
05:07:19 localhost ovn_controller[152322]: 2025-11-28T10:07:19Z|00436|binding|INFO|Setting lport e7ad6507-8cb9-4c54-9fde-23c7028d341d up in Southbound Nov 28 05:07:19 localhost nova_compute[279673]: 2025-11-28 10:07:19.040 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:19 localhost journal[227875]: ethtool ioctl error on tape7ad6507-8c: No such device Nov 28 05:07:19 localhost nova_compute[279673]: 2025-11-28 10:07:19.044 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:19 localhost journal[227875]: ethtool ioctl error on tape7ad6507-8c: No such device Nov 28 05:07:19 localhost journal[227875]: ethtool ioctl error on tape7ad6507-8c: No such device Nov 28 05:07:19 localhost journal[227875]: ethtool ioctl error on tape7ad6507-8c: No such device Nov 28 05:07:19 localhost journal[227875]: ethtool ioctl error on tape7ad6507-8c: No such device Nov 28 05:07:19 localhost journal[227875]: ethtool ioctl error on tape7ad6507-8c: No such device Nov 28 05:07:19 localhost nova_compute[279673]: 2025-11-28 10:07:19.074 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:19 localhost nova_compute[279673]: 2025-11-28 10:07:19.104 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:19 localhost neutron_sriov_agent[254147]: 2025-11-28 10:07:19.267 2 INFO neutron.agent.securitygroups_rpc [None req-5151d855-124b-4682-b39f-fc47e0550bce 6b80a7d6f65f4ebe8363ccaae26f3e87 ae10569a38284f298c961498da620c5f - - default default] Security group member updated ['c5eee24b-0bed-4035-a2ab-e6c531c94e43']#033[00m Nov 28 05:07:19 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e191 do_prune 
osdmap full prune enabled Nov 28 05:07:19 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e192 e192: 6 total, 6 up, 6 in Nov 28 05:07:19 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e192: 6 total, 6 up, 6 in Nov 28 05:07:19 localhost podman[326631]: Nov 28 05:07:19 localhost podman[326631]: 2025-11-28 10:07:19.890459956 +0000 UTC m=+0.079987536 container create cc39888ed5db5db0a23c989dfc0a97ee095b412f428289d5af1294588c5515f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e9912372-cec2-427d-b398-8b7ba1d00441, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0) Nov 28 05:07:19 localhost systemd[1]: Started libpod-conmon-cc39888ed5db5db0a23c989dfc0a97ee095b412f428289d5af1294588c5515f0.scope. Nov 28 05:07:19 localhost podman[326631]: 2025-11-28 10:07:19.843494729 +0000 UTC m=+0.033022369 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:07:19 localhost systemd[1]: tmp-crun.QsoE6J.mount: Deactivated successfully. Nov 28 05:07:19 localhost systemd[1]: Started libcrun container. Nov 28 05:07:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. 
Nov 28 05:07:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c04ff6f7b4fcac360df3aa4ab5727cea04640414569d0565dad260384a9ac12/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:07:19 localhost podman[326631]: 2025-11-28 10:07:19.974826905 +0000 UTC m=+0.164354465 container init cc39888ed5db5db0a23c989dfc0a97ee095b412f428289d5af1294588c5515f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e9912372-cec2-427d-b398-8b7ba1d00441, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 28 05:07:19 localhost podman[326631]: 2025-11-28 10:07:19.982280315 +0000 UTC m=+0.171807895 container start cc39888ed5db5db0a23c989dfc0a97ee095b412f428289d5af1294588c5515f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e9912372-cec2-427d-b398-8b7ba1d00441, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Nov 28 05:07:19 localhost dnsmasq[326657]: started, version 2.85 cachesize 150 Nov 28 05:07:20 localhost dnsmasq[326657]: DNS service limited to local subnets Nov 28 05:07:20 localhost dnsmasq[326657]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:07:20 localhost dnsmasq[326657]: warning: no upstream servers 
configured Nov 28 05:07:20 localhost dnsmasq[326657]: read /var/lib/neutron/dhcp/e9912372-cec2-427d-b398-8b7ba1d00441/addn_hosts - 0 addresses Nov 28 05:07:20 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:07:20.051 261084 INFO neutron.agent.dhcp.agent [None req-bd01f321-4ae9-463f-bc5e-ff371036a5cc - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:07:18Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=d2f8dc65-4856-423e-a9bf-07311833bbd3, ip_allocation=immediate, mac_address=fa:16:3e:fd:dd:41, name=tempest-NetworksIpV6TestAttrs-1405885968, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:07:08Z, description=, dns_domain=, id=e9912372-cec2-427d-b398-8b7ba1d00441, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksIpV6TestAttrs-test-network-1194919161, port_security_enabled=True, project_id=ae10569a38284f298c961498da620c5f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=8532, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2471, status=ACTIVE, subnets=['961cfa80-c1ec-4a76-97a0-a027c3bf1d91'], tags=[], tenant_id=ae10569a38284f298c961498da620c5f, updated_at=2025-11-28T10:07:14Z, vlan_transparent=None, network_id=e9912372-cec2-427d-b398-8b7ba1d00441, port_security_enabled=True, project_id=ae10569a38284f298c961498da620c5f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['c5eee24b-0bed-4035-a2ab-e6c531c94e43'], standard_attr_id=2489, status=DOWN, tags=[], tenant_id=ae10569a38284f298c961498da620c5f, updated_at=2025-11-28T10:07:18Z on network 
e9912372-cec2-427d-b398-8b7ba1d00441#033[00m Nov 28 05:07:20 localhost podman[326648]: 2025-11-28 10:07:20.073138275 +0000 UTC m=+0.103782969 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, release=1755695350, version=9.6, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7) Nov 28 05:07:20 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:07:20.112 261084 INFO neutron.agent.dhcp.agent [None req-d31dc310-df46-4ca4-a98c-03f2a7a6da98 - - - - - -] DHCP configuration for ports {'82cd7cf2-21f9-4a3e-b3cf-743f79ff64b7'} is completed#033[00m Nov 28 05:07:20 localhost podman[326648]: 2025-11-28 10:07:20.116628685 +0000 UTC m=+0.147273419 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, version=9.6, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9) Nov 28 05:07:20 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: 
Deactivated successfully. Nov 28 05:07:20 localhost dnsmasq[326657]: read /var/lib/neutron/dhcp/e9912372-cec2-427d-b398-8b7ba1d00441/addn_hosts - 1 addresses Nov 28 05:07:20 localhost podman[326688]: 2025-11-28 10:07:20.258777674 +0000 UTC m=+0.062634550 container kill cc39888ed5db5db0a23c989dfc0a97ee095b412f428289d5af1294588c5515f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e9912372-cec2-427d-b398-8b7ba1d00441, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:07:20 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e192 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:07:20 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e192 do_prune osdmap full prune enabled Nov 28 05:07:20 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e193 e193: 6 total, 6 up, 6 in Nov 28 05:07:20 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e193: 6 total, 6 up, 6 in Nov 28 05:07:20 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:07:20.575 261084 INFO neutron.agent.dhcp.agent [None req-65c014d6-57ba-49e0-9664-c40a5c4d2c1c - - - - - -] DHCP configuration for ports {'d2f8dc65-4856-423e-a9bf-07311833bbd3'} is completed#033[00m Nov 28 05:07:20 localhost dnsmasq[326657]: exiting on receipt of SIGTERM Nov 28 05:07:20 localhost podman[326728]: 2025-11-28 10:07:20.717735326 +0000 UTC m=+0.060512315 container kill cc39888ed5db5db0a23c989dfc0a97ee095b412f428289d5af1294588c5515f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-e9912372-cec2-427d-b398-8b7ba1d00441, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:07:20 localhost systemd[1]: libpod-cc39888ed5db5db0a23c989dfc0a97ee095b412f428289d5af1294588c5515f0.scope: Deactivated successfully. Nov 28 05:07:20 localhost podman[326741]: 2025-11-28 10:07:20.785246506 +0000 UTC m=+0.055671715 container died cc39888ed5db5db0a23c989dfc0a97ee095b412f428289d5af1294588c5515f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e9912372-cec2-427d-b398-8b7ba1d00441, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 28 05:07:20 localhost nova_compute[279673]: 2025-11-28 10:07:20.858 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:20 localhost podman[326741]: 2025-11-28 10:07:20.870849554 +0000 UTC m=+0.141274783 container cleanup cc39888ed5db5db0a23c989dfc0a97ee095b412f428289d5af1294588c5515f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e9912372-cec2-427d-b398-8b7ba1d00441, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base 
Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 28 05:07:20 localhost systemd[1]: libpod-conmon-cc39888ed5db5db0a23c989dfc0a97ee095b412f428289d5af1294588c5515f0.scope: Deactivated successfully. Nov 28 05:07:20 localhost systemd[1]: var-lib-containers-storage-overlay-7c04ff6f7b4fcac360df3aa4ab5727cea04640414569d0565dad260384a9ac12-merged.mount: Deactivated successfully. Nov 28 05:07:20 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cc39888ed5db5db0a23c989dfc0a97ee095b412f428289d5af1294588c5515f0-userdata-shm.mount: Deactivated successfully. Nov 28 05:07:20 localhost podman[326748]: 2025-11-28 10:07:20.900974143 +0000 UTC m=+0.156399170 container remove cc39888ed5db5db0a23c989dfc0a97ee095b412f428289d5af1294588c5515f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e9912372-cec2-427d-b398-8b7ba1d00441, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:07:20 localhost nova_compute[279673]: 2025-11-28 10:07:20.913 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:20 localhost ovn_controller[152322]: 2025-11-28T10:07:20Z|00437|binding|INFO|Releasing lport e7ad6507-8cb9-4c54-9fde-23c7028d341d from this chassis (sb_readonly=0) Nov 28 05:07:20 localhost kernel: device tape7ad6507-8c left promiscuous mode Nov 28 05:07:20 localhost ovn_controller[152322]: 2025-11-28T10:07:20Z|00438|binding|INFO|Setting lport e7ad6507-8cb9-4c54-9fde-23c7028d341d down in Southbound Nov 28 05:07:20 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:20.928 158130 DEBUG 
ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe83:d5c4/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-e9912372-cec2-427d-b398-8b7ba1d00441', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e9912372-cec2-427d-b398-8b7ba1d00441', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ae10569a38284f298c961498da620c5f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5e17820c-609d-4556-ae57-25b66af88ca4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e7ad6507-8cb9-4c54-9fde-23c7028d341d) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:07:20 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:20.930 158130 INFO neutron.agent.ovn.metadata.agent [-] Port e7ad6507-8cb9-4c54-9fde-23c7028d341d in datapath e9912372-cec2-427d-b398-8b7ba1d00441 unbound from our chassis#033[00m Nov 28 05:07:20 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:20.932 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e9912372-cec2-427d-b398-8b7ba1d00441, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:07:20 localhost 
nova_compute[279673]: 2025-11-28 10:07:20.933 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:20 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:20.933 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[f85f68d9-2111-4e02-98f9-00ba59b7151b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:07:21 localhost nova_compute[279673]: 2025-11-28 10:07:21.027 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:21 localhost neutron_sriov_agent[254147]: 2025-11-28 10:07:21.377 2 INFO neutron.agent.securitygroups_rpc [None req-fe8ed995-ee4b-4312-80c7-fb60647feb81 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['235f4ca9-4e7e-483e-ba22-a609f7751fe8']#033[00m Nov 28 05:07:21 localhost systemd[1]: run-netns-qdhcp\x2de9912372\x2dcec2\x2d427d\x2db398\x2d8b7ba1d00441.mount: Deactivated successfully. 
Nov 28 05:07:21 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:07:21.560 261084 INFO neutron.agent.dhcp.agent [None req-cc845ac2-af30-45d2-8b35-e5e6b189e3e7 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:07:21 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e193 do_prune osdmap full prune enabled Nov 28 05:07:21 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e194 e194: 6 total, 6 up, 6 in Nov 28 05:07:21 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e194: 6 total, 6 up, 6 in Nov 28 05:07:21 localhost ovn_controller[152322]: 2025-11-28T10:07:21Z|00439|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:07:21 localhost nova_compute[279673]: 2025-11-28 10:07:21.729 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:22 localhost podman[326788]: 2025-11-28 10:07:22.553187072 +0000 UTC m=+0.067388028 container kill c34b0864521ca6e0485c3e746c68d7ef4b44888e5e0551f73bd31f34fa84e952 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 28 05:07:22 localhost dnsmasq[324234]: exiting on receipt of SIGTERM Nov 28 05:07:22 localhost systemd[1]: libpod-c34b0864521ca6e0485c3e746c68d7ef4b44888e5e0551f73bd31f34fa84e952.scope: Deactivated successfully. 
Nov 28 05:07:22 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e194 do_prune osdmap full prune enabled Nov 28 05:07:22 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e195 e195: 6 total, 6 up, 6 in Nov 28 05:07:22 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e195: 6 total, 6 up, 6 in Nov 28 05:07:22 localhost podman[326803]: 2025-11-28 10:07:22.649294973 +0000 UTC m=+0.076450296 container died c34b0864521ca6e0485c3e746c68d7ef4b44888e5e0551f73bd31f34fa84e952 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 28 05:07:22 localhost systemd[1]: tmp-crun.tmSlzw.mount: Deactivated successfully. 
Nov 28 05:07:22 localhost ovn_controller[152322]: 2025-11-28T10:07:22Z|00440|binding|INFO|Removing iface tap1bff36cc-f5 ovn-installed in OVS Nov 28 05:07:22 localhost podman[326803]: 2025-11-28 10:07:22.749982066 +0000 UTC m=+0.177137339 container cleanup c34b0864521ca6e0485c3e746c68d7ef4b44888e5e0551f73bd31f34fa84e952 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS) Nov 28 05:07:22 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:22.751 158130 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port d4b5658d-b7d6-4f11-9507-550379ce2d7c with type ""#033[00m Nov 28 05:07:22 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:22.752 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-4dc5b71e-287e-4ec6-b6b7-4d131e85d551', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4dc5b71e-287e-4ec6-b6b7-4d131e85d551', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '29d0d5b3ba0745d58aee3845ea704b73', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c0295c06-e7c1-42d0-9d25-c6c6ebd15e16, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1bff36cc-f508-4066-a5d7-c55bc5baf4a9) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:07:22 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:22.754 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 1bff36cc-f508-4066-a5d7-c55bc5baf4a9 in datapath 4dc5b71e-287e-4ec6-b6b7-4d131e85d551 unbound from our chassis#033[00m Nov 28 05:07:22 localhost ovn_controller[152322]: 2025-11-28T10:07:22Z|00441|binding|INFO|Removing lport 1bff36cc-f508-4066-a5d7-c55bc5baf4a9 ovn-installed in OVS Nov 28 05:07:22 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:22.757 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4dc5b71e-287e-4ec6-b6b7-4d131e85d551, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:07:22 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:22.758 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[0789e78a-bf16-42fd-acbc-847949501156]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:07:22 localhost systemd[1]: libpod-conmon-c34b0864521ca6e0485c3e746c68d7ef4b44888e5e0551f73bd31f34fa84e952.scope: Deactivated successfully. 
Nov 28 05:07:22 localhost podman[326804]: 2025-11-28 10:07:22.781730554 +0000 UTC m=+0.205308747 container remove c34b0864521ca6e0485c3e746c68d7ef4b44888e5e0551f73bd31f34fa84e952 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4dc5b71e-287e-4ec6-b6b7-4d131e85d551, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:07:22 localhost nova_compute[279673]: 2025-11-28 10:07:22.799 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:22 localhost nova_compute[279673]: 2025-11-28 10:07:22.807 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:22 localhost kernel: device tap1bff36cc-f5 left promiscuous mode Nov 28 05:07:22 localhost nova_compute[279673]: 2025-11-28 10:07:22.819 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:22 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:07:22.841 261084 INFO neutron.agent.dhcp.agent [None req-5eb5fd94-8165-46ea-b681-aa8e86340968 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:07:23 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:07:23.033 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:07:23 localhost systemd[1]: var-lib-containers-storage-overlay-518cfa22c4e318530a6e5aefab2c4e20bcb90abb837fcbd18b8f75b2b31294f3-merged.mount: Deactivated successfully. 
Nov 28 05:07:23 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c34b0864521ca6e0485c3e746c68d7ef4b44888e5e0551f73bd31f34fa84e952-userdata-shm.mount: Deactivated successfully. Nov 28 05:07:23 localhost systemd[1]: run-netns-qdhcp\x2d4dc5b71e\x2d287e\x2d4ec6\x2db6b7\x2d4d131e85d551.mount: Deactivated successfully. Nov 28 05:07:23 localhost ovn_controller[152322]: 2025-11-28T10:07:23Z|00442|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:07:24 localhost nova_compute[279673]: 2025-11-28 10:07:24.027 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:24 localhost neutron_sriov_agent[254147]: 2025-11-28 10:07:24.085 2 INFO neutron.agent.securitygroups_rpc [None req-b1e53ab7-c922-4511-9eea-36891b374394 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['ef6c27ab-7008-4940-88ab-f495a3348997', '235f4ca9-4e7e-483e-ba22-a609f7751fe8', 'ad28b9ca-0164-4a23-9923-7d61ac565e84']#033[00m Nov 28 05:07:24 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e195 do_prune osdmap full prune enabled Nov 28 05:07:24 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e196 e196: 6 total, 6 up, 6 in Nov 28 05:07:24 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e196: 6 total, 6 up, 6 in Nov 28 05:07:24 localhost neutron_sriov_agent[254147]: 2025-11-28 10:07:24.749 2 INFO neutron.agent.securitygroups_rpc [None req-ded8a162-fe49-472e-9107-11e07cb8573a e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['ef6c27ab-7008-4940-88ab-f495a3348997', 'ad28b9ca-0164-4a23-9923-7d61ac565e84']#033[00m Nov 28 05:07:24 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, 
key=mgr/cephadm/osd_remove_queue}] v 0) Nov 28 05:07:24 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:07:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 05:07:25 localhost systemd[1]: tmp-crun.WHYdv1.mount: Deactivated successfully. Nov 28 05:07:25 localhost podman[326916]: 2025-11-28 10:07:25.080261327 +0000 UTC m=+0.111039052 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 05:07:25 localhost podman[326916]: 2025-11-28 10:07:25.087893512 +0000 UTC m=+0.118671227 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 
'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 28 05:07:25 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 05:07:25 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e196 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:07:25 localhost neutron_sriov_agent[254147]: 2025-11-28 10:07:25.418 2 INFO neutron.agent.securitygroups_rpc [None req-84883400-a0bd-45dd-a8ae-3bc8b417b162 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['acf02bd6-8fdb-4bdf-b655-c11d3c48057a']#033[00m Nov 28 05:07:25 localhost neutron_sriov_agent[254147]: 2025-11-28 10:07:25.467 2 INFO neutron.agent.securitygroups_rpc [None req-0fb934e7-9ad4-4c2e-8ef8-4b9c21b34e7a 6b80a7d6f65f4ebe8363ccaae26f3e87 ae10569a38284f298c961498da620c5f - - default default] Security group member updated ['c5eee24b-0bed-4035-a2ab-e6c531c94e43']#033[00m Nov 28 05:07:25 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:07:25.550 261084 INFO neutron.agent.linux.ip_lib [None req-d39bfd49-8596-4874-b6a1-9865d4404117 - - - - - -] Device tapce85abaa-55 cannot be used as it has no MAC address#033[00m Nov 28 05:07:25 localhost nova_compute[279673]: 2025-11-28 10:07:25.575 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:25 localhost 
kernel: device tapce85abaa-55 entered promiscuous mode Nov 28 05:07:25 localhost nova_compute[279673]: 2025-11-28 10:07:25.584 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:25 localhost NetworkManager[5967]: [1764324445.5858] manager: (tapce85abaa-55): new Generic device (/org/freedesktop/NetworkManager/Devices/71) Nov 28 05:07:25 localhost ovn_controller[152322]: 2025-11-28T10:07:25Z|00443|binding|INFO|Claiming lport ce85abaa-55ed-4384-873f-4fd7f3eb0d9a for this chassis. Nov 28 05:07:25 localhost ovn_controller[152322]: 2025-11-28T10:07:25Z|00444|binding|INFO|ce85abaa-55ed-4384-873f-4fd7f3eb0d9a: Claiming unknown Nov 28 05:07:25 localhost systemd-udevd[326949]: Network interface NamePolicy= disabled on kernel command line. Nov 28 05:07:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:25.603 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ea7bdd98a9904e47a2dfed5bcc54bc4a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], 
datapath=e24bc530-a0d1-4a44-8a84-effdb241e447, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ce85abaa-55ed-4384-873f-4fd7f3eb0d9a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:07:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:25.606 158130 INFO neutron.agent.ovn.metadata.agent [-] Port ce85abaa-55ed-4384-873f-4fd7f3eb0d9a in datapath 92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4 bound to our chassis#033[00m Nov 28 05:07:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:25.609 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port a82ab757-b45b-4425-bce5-f385f9345b1a IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 28 05:07:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:25.609 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:07:25 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:25.610 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[ec241232-189e-4039-81cd-1dab12410f05]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:07:25 localhost journal[227875]: ethtool ioctl error on tapce85abaa-55: No such device Nov 28 05:07:25 localhost ovn_controller[152322]: 2025-11-28T10:07:25Z|00445|binding|INFO|Setting lport ce85abaa-55ed-4384-873f-4fd7f3eb0d9a ovn-installed in OVS Nov 28 05:07:25 localhost ovn_controller[152322]: 2025-11-28T10:07:25Z|00446|binding|INFO|Setting lport ce85abaa-55ed-4384-873f-4fd7f3eb0d9a up in Southbound Nov 28 05:07:25 localhost journal[227875]: ethtool ioctl error on tapce85abaa-55: No such device Nov 28 
05:07:25 localhost nova_compute[279673]: 2025-11-28 10:07:25.618 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:25 localhost journal[227875]: ethtool ioctl error on tapce85abaa-55: No such device Nov 28 05:07:25 localhost journal[227875]: ethtool ioctl error on tapce85abaa-55: No such device Nov 28 05:07:25 localhost journal[227875]: ethtool ioctl error on tapce85abaa-55: No such device Nov 28 05:07:25 localhost journal[227875]: ethtool ioctl error on tapce85abaa-55: No such device Nov 28 05:07:25 localhost journal[227875]: ethtool ioctl error on tapce85abaa-55: No such device Nov 28 05:07:25 localhost journal[227875]: ethtool ioctl error on tapce85abaa-55: No such device Nov 28 05:07:25 localhost nova_compute[279673]: 2025-11-28 10:07:25.657 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:25 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 05:07:25 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:07:25 localhost nova_compute[279673]: 2025-11-28 10:07:25.688 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:25 localhost nova_compute[279673]: 2025-11-28 10:07:25.859 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:25 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Nov 28 05:07:25 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 
05:07:26 localhost nova_compute[279673]: 2025-11-28 10:07:26.031 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:26 localhost podman[327020]: Nov 28 05:07:26 localhost podman[327020]: 2025-11-28 10:07:26.611176958 +0000 UTC m=+0.094625687 container create 99c6926edfef36e785cbd98c29296e1647aa16a0dca7122135393dbf02832c2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:07:26 localhost podman[327020]: 2025-11-28 10:07:26.564653254 +0000 UTC m=+0.048102003 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:07:26 localhost systemd[1]: Started libpod-conmon-99c6926edfef36e785cbd98c29296e1647aa16a0dca7122135393dbf02832c2d.scope. Nov 28 05:07:26 localhost systemd[1]: tmp-crun.UnMZ5k.mount: Deactivated successfully. Nov 28 05:07:26 localhost systemd[1]: Started libcrun container. 
Nov 28 05:07:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f82a10448f7eafab39fdb6ae375b218bc52faf1c65a99161c7b78ef796fa5b5c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:07:26 localhost podman[327020]: 2025-11-28 10:07:26.742518065 +0000 UTC m=+0.225966784 container init 99c6926edfef36e785cbd98c29296e1647aa16a0dca7122135393dbf02832c2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3) Nov 28 05:07:26 localhost podman[327020]: 2025-11-28 10:07:26.752382429 +0000 UTC m=+0.235831158 container start 99c6926edfef36e785cbd98c29296e1647aa16a0dca7122135393dbf02832c2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:07:26 localhost dnsmasq[327038]: started, version 2.85 cachesize 150 Nov 28 05:07:26 localhost dnsmasq[327038]: DNS service limited to local subnets Nov 28 05:07:26 localhost dnsmasq[327038]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:07:26 localhost dnsmasq[327038]: warning: no upstream servers 
configured Nov 28 05:07:26 localhost dnsmasq-dhcp[327038]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 28 05:07:26 localhost dnsmasq[327038]: read /var/lib/neutron/dhcp/92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4/addn_hosts - 0 addresses Nov 28 05:07:26 localhost dnsmasq-dhcp[327038]: read /var/lib/neutron/dhcp/92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4/host Nov 28 05:07:26 localhost dnsmasq-dhcp[327038]: read /var/lib/neutron/dhcp/92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4/opts Nov 28 05:07:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 28 05:07:26 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/408988445' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 28 05:07:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 28 05:07:26 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/408988445' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 28 05:07:26 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:07:27 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:07:27.011 261084 INFO neutron.agent.dhcp.agent [None req-bedd59c9-509d-419b-8106-7330bf2f0acf - - - - - -] DHCP configuration for ports {'1a03bcf8-8713-4678-8570-227cfd5a5392'} is completed#033[00m Nov 28 05:07:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. 
Nov 28 05:07:27 localhost podman[327039]: 2025-11-28 10:07:27.138493006 +0000 UTC m=+0.094283825 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:07:27 localhost podman[327039]: 2025-11-28 10:07:27.167621904 +0000 UTC 
m=+0.123412693 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125) Nov 28 05:07:27 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 05:07:27 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:07:27.300 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:07:26Z, description=, device_id=b1aacfa1-fecc-4e03-9799-90b5b92b4c0a, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=042bddfe-2b7c-46de-b29c-e6d7c7393875, ip_allocation=immediate, mac_address=fa:16:3e:3d:6c:06, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:07:18Z, description=, dns_domain=, id=92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-router-network01--616540833, port_security_enabled=True, project_id=ea7bdd98a9904e47a2dfed5bcc54bc4a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=53568, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2487, status=ACTIVE, subnets=['bf8de819-4f5d-4834-9015-5caa46badb1e'], tags=[], tenant_id=ea7bdd98a9904e47a2dfed5bcc54bc4a, updated_at=2025-11-28T10:07:22Z, vlan_transparent=None, network_id=92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, port_security_enabled=False, project_id=ea7bdd98a9904e47a2dfed5bcc54bc4a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2527, status=DOWN, tags=[], tenant_id=ea7bdd98a9904e47a2dfed5bcc54bc4a, updated_at=2025-11-28T10:07:26Z on network 92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4#033[00m Nov 28 05:07:27 localhost dnsmasq[327038]: read /var/lib/neutron/dhcp/92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4/addn_hosts - 1 addresses Nov 28 05:07:27 localhost dnsmasq-dhcp[327038]: read 
/var/lib/neutron/dhcp/92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4/host Nov 28 05:07:27 localhost dnsmasq-dhcp[327038]: read /var/lib/neutron/dhcp/92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4/opts Nov 28 05:07:27 localhost podman[327077]: 2025-11-28 10:07:27.542592978 +0000 UTC m=+0.062464377 container kill 99c6926edfef36e785cbd98c29296e1647aa16a0dca7122135393dbf02832c2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 28 05:07:27 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:07:27.766 261084 INFO neutron.agent.dhcp.agent [None req-13302878-b347-41b2-b109-a81db60f8daa - - - - - -] DHCP configuration for ports {'042bddfe-2b7c-46de-b29c-e6d7c7393875'} is completed#033[00m Nov 28 05:07:29 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 28 05:07:29 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4119185207' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 28 05:07:29 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 28 05:07:29 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4119185207' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 28 05:07:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. 
Nov 28 05:07:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 05:07:29 localhost podman[327100]: 2025-11-28 10:07:29.861849599 +0000 UTC m=+0.092027396 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_id=edpm, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, 
tcib_managed=true) Nov 28 05:07:29 localhost podman[327101]: 2025-11-28 10:07:29.912533562 +0000 UTC m=+0.137343933 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Nov 28 05:07:29 localhost podman[327100]: 2025-11-28 10:07:29.932317431 +0000 UTC m=+0.162495188 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Nov 28 05:07:29 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 05:07:29 localhost podman[327101]: 2025-11-28 10:07:29.946352834 +0000 UTC m=+0.171163175 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:07:29 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 05:07:30 localhost neutron_sriov_agent[254147]: 2025-11-28 10:07:30.293 2 INFO neutron.agent.securitygroups_rpc [None req-bb1d0f4f-4080-47c2-b71c-a5aaec3a62e2 e32848e36ae94f66ae634ff4d7716d6f 8462a4a9a313405e8fd212f9ec4a0c92 - - default default] Security group member updated ['78343c03-098f-4faf-a880-2814fe3611d6']#033[00m Nov 28 05:07:30 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:07:30.314 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:07:26Z, description=, device_id=b1aacfa1-fecc-4e03-9799-90b5b92b4c0a, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=042bddfe-2b7c-46de-b29c-e6d7c7393875, ip_allocation=immediate, mac_address=fa:16:3e:3d:6c:06, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:07:18Z, description=, dns_domain=, id=92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-router-network01--616540833, port_security_enabled=True, project_id=ea7bdd98a9904e47a2dfed5bcc54bc4a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=53568, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2487, status=ACTIVE, subnets=['bf8de819-4f5d-4834-9015-5caa46badb1e'], tags=[], tenant_id=ea7bdd98a9904e47a2dfed5bcc54bc4a, updated_at=2025-11-28T10:07:22Z, vlan_transparent=None, network_id=92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, port_security_enabled=False, project_id=ea7bdd98a9904e47a2dfed5bcc54bc4a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2527, status=DOWN, tags=[], 
tenant_id=ea7bdd98a9904e47a2dfed5bcc54bc4a, updated_at=2025-11-28T10:07:26Z on network 92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4#033[00m Nov 28 05:07:30 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e196 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:07:30 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e196 do_prune osdmap full prune enabled Nov 28 05:07:30 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e197 e197: 6 total, 6 up, 6 in Nov 28 05:07:30 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e197: 6 total, 6 up, 6 in Nov 28 05:07:30 localhost dnsmasq[327038]: read /var/lib/neutron/dhcp/92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4/addn_hosts - 1 addresses Nov 28 05:07:30 localhost podman[327158]: 2025-11-28 10:07:30.565095678 +0000 UTC m=+0.048387771 container kill 99c6926edfef36e785cbd98c29296e1647aa16a0dca7122135393dbf02832c2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125) Nov 28 05:07:30 localhost dnsmasq-dhcp[327038]: read /var/lib/neutron/dhcp/92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4/host Nov 28 05:07:30 localhost dnsmasq-dhcp[327038]: read /var/lib/neutron/dhcp/92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4/opts Nov 28 05:07:30 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:07:30.833 261084 INFO neutron.agent.dhcp.agent [None req-1b50611d-c2b0-4e83-b826-b82babeb1694 - - - - - -] DHCP configuration for ports {'042bddfe-2b7c-46de-b29c-e6d7c7393875'} is completed#033[00m Nov 28 05:07:30 localhost nova_compute[279673]: 
2025-11-28 10:07:30.861 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:31 localhost nova_compute[279673]: 2025-11-28 10:07:31.034 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:31 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:31.389 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:3b:02 10.100.0.19 10.100.0.3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af9fa2ef-906d-4c5f-8a61-e350b84d90cf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=636c21fa-d6bd-405e-95ed-d59498827d6f) old=Port_Binding(mac=['fa:16:3e:af:3b:02 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ovnmeta-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 
'neutron-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:07:31 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:31.391 158130 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 636c21fa-d6bd-405e-95ed-d59498827d6f in datapath 744b5a82-3c5c-4b41-ba44-527244a209c4 updated#033[00m Nov 28 05:07:31 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:31.394 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 744b5a82-3c5c-4b41-ba44-527244a209c4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:07:31 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:31.395 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[bed56a50-2b2f-4f8f-bdf2-c06e8e8aea0c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:07:32 localhost neutron_sriov_agent[254147]: 2025-11-28 10:07:32.625 2 INFO neutron.agent.securitygroups_rpc [None req-b1e499b5-5b30-44cb-89ef-2d90dabf973f 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['b5d46958-1542-44c0-a82a-37e69acb7089', 'acf02bd6-8fdb-4bdf-b655-c11d3c48057a']#033[00m Nov 28 05:07:32 localhost dnsmasq[327038]: read /var/lib/neutron/dhcp/92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4/addn_hosts - 0 addresses Nov 28 05:07:32 localhost podman[327197]: 2025-11-28 10:07:32.700105033 +0000 UTC m=+0.072608098 container kill 99c6926edfef36e785cbd98c29296e1647aa16a0dca7122135393dbf02832c2d 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 28 05:07:32 localhost dnsmasq-dhcp[327038]: read /var/lib/neutron/dhcp/92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4/host Nov 28 05:07:32 localhost dnsmasq-dhcp[327038]: read /var/lib/neutron/dhcp/92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4/opts Nov 28 05:07:32 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:07:32.769 261084 INFO neutron.agent.linux.ip_lib [None req-b63dd2b8-eda8-420d-866d-e5e5e70c08ce - - - - - -] Device tap5d322373-6d cannot be used as it has no MAC address#033[00m Nov 28 05:07:32 localhost nova_compute[279673]: 2025-11-28 10:07:32.840 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:32 localhost kernel: device tap5d322373-6d entered promiscuous mode Nov 28 05:07:32 localhost nova_compute[279673]: 2025-11-28 10:07:32.849 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:32 localhost ovn_controller[152322]: 2025-11-28T10:07:32Z|00447|binding|INFO|Claiming lport 5d322373-6d84-4420-9452-6cf70afb5d0c for this chassis. 
Nov 28 05:07:32 localhost ovn_controller[152322]: 2025-11-28T10:07:32Z|00448|binding|INFO|5d322373-6d84-4420-9452-6cf70afb5d0c: Claiming unknown Nov 28 05:07:32 localhost NetworkManager[5967]: [1764324452.8527] manager: (tap5d322373-6d): new Generic device (/org/freedesktop/NetworkManager/Devices/72) Nov 28 05:07:32 localhost systemd-udevd[327227]: Network interface NamePolicy= disabled on kernel command line. Nov 28 05:07:32 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:32.859 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe37:d8d9/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-daa663db-7797-4de5-a3b0-b0197a7ec3d6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-daa663db-7797-4de5-a3b0-b0197a7ec3d6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ae10569a38284f298c961498da620c5f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=949040a4-3a48-4918-a792-56947d4e0e1e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=5d322373-6d84-4420-9452-6cf70afb5d0c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:07:32 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:32.861 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 
5d322373-6d84-4420-9452-6cf70afb5d0c in datapath daa663db-7797-4de5-a3b0-b0197a7ec3d6 bound to our chassis#033[00m Nov 28 05:07:32 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:32.863 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network daa663db-7797-4de5-a3b0-b0197a7ec3d6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:07:32 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:32.864 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[9ac2a2d2-5f44-4441-a46a-1bbb76650433]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:07:32 localhost journal[227875]: ethtool ioctl error on tap5d322373-6d: No such device Nov 28 05:07:32 localhost ovn_controller[152322]: 2025-11-28T10:07:32Z|00449|binding|INFO|Setting lport 5d322373-6d84-4420-9452-6cf70afb5d0c ovn-installed in OVS Nov 28 05:07:32 localhost ovn_controller[152322]: 2025-11-28T10:07:32Z|00450|binding|INFO|Setting lport 5d322373-6d84-4420-9452-6cf70afb5d0c up in Southbound Nov 28 05:07:32 localhost nova_compute[279673]: 2025-11-28 10:07:32.887 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:32 localhost journal[227875]: ethtool ioctl error on tap5d322373-6d: No such device Nov 28 05:07:32 localhost journal[227875]: ethtool ioctl error on tap5d322373-6d: No such device Nov 28 05:07:32 localhost journal[227875]: ethtool ioctl error on tap5d322373-6d: No such device Nov 28 05:07:32 localhost journal[227875]: ethtool ioctl error on tap5d322373-6d: No such device Nov 28 05:07:32 localhost journal[227875]: ethtool ioctl error on tap5d322373-6d: No such device Nov 28 05:07:32 localhost journal[227875]: ethtool ioctl error on tap5d322373-6d: No such device Nov 28 05:07:32 localhost 
journal[227875]: ethtool ioctl error on tap5d322373-6d: No such device Nov 28 05:07:32 localhost nova_compute[279673]: 2025-11-28 10:07:32.940 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:32 localhost ovn_controller[152322]: 2025-11-28T10:07:32Z|00451|binding|INFO|Releasing lport ce85abaa-55ed-4384-873f-4fd7f3eb0d9a from this chassis (sb_readonly=0) Nov 28 05:07:32 localhost nova_compute[279673]: 2025-11-28 10:07:32.960 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:32 localhost ovn_controller[152322]: 2025-11-28T10:07:32Z|00452|binding|INFO|Setting lport ce85abaa-55ed-4384-873f-4fd7f3eb0d9a down in Southbound Nov 28 05:07:32 localhost kernel: device tapce85abaa-55 left promiscuous mode Nov 28 05:07:32 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:32.968 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ea7bdd98a9904e47a2dfed5bcc54bc4a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 
'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e24bc530-a0d1-4a44-8a84-effdb241e447, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ce85abaa-55ed-4384-873f-4fd7f3eb0d9a) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:07:32 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:32.970 158130 INFO neutron.agent.ovn.metadata.agent [-] Port ce85abaa-55ed-4384-873f-4fd7f3eb0d9a in datapath 92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4 unbound from our chassis#033[00m Nov 28 05:07:32 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:32.973 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:07:32 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:32.974 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[d7ec5c04-8b84-4678-ae20-ad8c963935ab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:07:32 localhost nova_compute[279673]: 2025-11-28 10:07:32.981 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:32 localhost nova_compute[279673]: 2025-11-28 10:07:32.985 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:33 localhost neutron_sriov_agent[254147]: 2025-11-28 10:07:33.592 2 INFO neutron.agent.securitygroups_rpc [None req-7e9cce24-3864-452e-838f-0b8e85be3343 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated 
['b5d46958-1542-44c0-a82a-37e69acb7089']#033[00m Nov 28 05:07:34 localhost ovn_controller[152322]: 2025-11-28T10:07:34Z|00453|binding|INFO|Removing iface tap5d322373-6d ovn-installed in OVS Nov 28 05:07:34 localhost ovn_controller[152322]: 2025-11-28T10:07:34Z|00454|binding|INFO|Removing lport 5d322373-6d84-4420-9452-6cf70afb5d0c ovn-installed in OVS Nov 28 05:07:34 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:34.744 158130 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 16d3b12c-1081-4d8a-95a4-ea996c678275 with type ""#033[00m Nov 28 05:07:34 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:34.746 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-daa663db-7797-4de5-a3b0-b0197a7ec3d6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-daa663db-7797-4de5-a3b0-b0197a7ec3d6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ae10569a38284f298c961498da620c5f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=949040a4-3a48-4918-a792-56947d4e0e1e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=5d322373-6d84-4420-9452-6cf70afb5d0c) old= matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:07:34 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:34.747 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 5d322373-6d84-4420-9452-6cf70afb5d0c in datapath daa663db-7797-4de5-a3b0-b0197a7ec3d6 unbound from our chassis#033[00m Nov 28 05:07:34 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:34.749 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network daa663db-7797-4de5-a3b0-b0197a7ec3d6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:07:34 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:34.775 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[8f28470d-dbe0-4fba-8aaf-a80cc68a64dc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:07:34 localhost nova_compute[279673]: 2025-11-28 10:07:34.776 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:34 localhost ovn_controller[152322]: 2025-11-28T10:07:34Z|00455|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:07:34 localhost nova_compute[279673]: 2025-11-28 10:07:34.899 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:35 localhost nova_compute[279673]: 2025-11-28 10:07:35.891 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:36 localhost nova_compute[279673]: 2025-11-28 10:07:36.398 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m 
Nov 28 05:07:36 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e197 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:07:36 localhost podman[327299]: Nov 28 05:07:36 localhost podman[327299]: 2025-11-28 10:07:36.531008592 +0000 UTC m=+0.075518128 container create 3055ecc468897a11ebcb877d0194042c517b3fedc96876f552471a5d8cbabedf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-daa663db-7797-4de5-a3b0-b0197a7ec3d6, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:07:36 localhost systemd[1]: Started libpod-conmon-3055ecc468897a11ebcb877d0194042c517b3fedc96876f552471a5d8cbabedf.scope. Nov 28 05:07:36 localhost systemd[1]: tmp-crun.mvCUfp.mount: Deactivated successfully. Nov 28 05:07:36 localhost podman[327299]: 2025-11-28 10:07:36.491257148 +0000 UTC m=+0.035766714 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:07:36 localhost systemd[1]: Started libcrun container. 
Nov 28 05:07:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/127f1433ad0e975c7d90441f253c42229b39618ab4ce9e24e298473f69915576/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:07:36 localhost podman[327299]: 2025-11-28 10:07:36.609553762 +0000 UTC m=+0.154063308 container init 3055ecc468897a11ebcb877d0194042c517b3fedc96876f552471a5d8cbabedf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-daa663db-7797-4de5-a3b0-b0197a7ec3d6, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true) Nov 28 05:07:36 localhost podman[327299]: 2025-11-28 10:07:36.622001896 +0000 UTC m=+0.166511432 container start 3055ecc468897a11ebcb877d0194042c517b3fedc96876f552471a5d8cbabedf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-daa663db-7797-4de5-a3b0-b0197a7ec3d6, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:07:36 localhost dnsmasq[327317]: started, version 2.85 cachesize 150 Nov 28 05:07:36 localhost dnsmasq[327317]: DNS service limited to local subnets Nov 28 05:07:36 localhost dnsmasq[327317]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:07:36 localhost dnsmasq[327317]: warning: no upstream servers 
configured Nov 28 05:07:36 localhost dnsmasq[327317]: read /var/lib/neutron/dhcp/daa663db-7797-4de5-a3b0-b0197a7ec3d6/addn_hosts - 0 addresses Nov 28 05:07:36 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:07:36.737 261084 INFO neutron.agent.dhcp.agent [None req-d95b2dfb-3a6b-473b-bb81-b381767e15ed - - - - - -] DHCP configuration for ports {'2795b1ae-64ed-4052-8e9d-7d6b8bc4f852'} is completed#033[00m Nov 28 05:07:36 localhost dnsmasq[327317]: exiting on receipt of SIGTERM Nov 28 05:07:36 localhost systemd[1]: libpod-3055ecc468897a11ebcb877d0194042c517b3fedc96876f552471a5d8cbabedf.scope: Deactivated successfully. Nov 28 05:07:36 localhost podman[327335]: 2025-11-28 10:07:36.849424353 +0000 UTC m=+0.071096342 container kill 3055ecc468897a11ebcb877d0194042c517b3fedc96876f552471a5d8cbabedf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-daa663db-7797-4de5-a3b0-b0197a7ec3d6, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 28 05:07:36 localhost podman[327347]: 2025-11-28 10:07:36.911894267 +0000 UTC m=+0.054489859 container died 3055ecc468897a11ebcb877d0194042c517b3fedc96876f552471a5d8cbabedf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-daa663db-7797-4de5-a3b0-b0197a7ec3d6, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:07:36 localhost podman[327347]: 2025-11-28 
10:07:36.943068778 +0000 UTC m=+0.085664370 container cleanup 3055ecc468897a11ebcb877d0194042c517b3fedc96876f552471a5d8cbabedf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-daa663db-7797-4de5-a3b0-b0197a7ec3d6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:07:36 localhost systemd[1]: libpod-conmon-3055ecc468897a11ebcb877d0194042c517b3fedc96876f552471a5d8cbabedf.scope: Deactivated successfully. Nov 28 05:07:36 localhost podman[327354]: 2025-11-28 10:07:36.962251799 +0000 UTC m=+0.088483248 container remove 3055ecc468897a11ebcb877d0194042c517b3fedc96876f552471a5d8cbabedf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-daa663db-7797-4de5-a3b0-b0197a7ec3d6, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 28 05:07:37 localhost nova_compute[279673]: 2025-11-28 10:07:37.006 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:37 localhost kernel: device tap5d322373-6d left promiscuous mode Nov 28 05:07:37 localhost nova_compute[279673]: 2025-11-28 10:07:37.023 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:37 localhost neutron_dhcp_agent[261080]: 2025-11-28 
10:07:37.080 261084 INFO neutron.agent.dhcp.agent [None req-3ba41adf-3747-42f4-8d5a-9aecc82b2488 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:07:37 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:07:37.080 261084 INFO neutron.agent.dhcp.agent [None req-3ba41adf-3747-42f4-8d5a-9aecc82b2488 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:07:37 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:37.091 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:3b:02 10.100.0.19 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af9fa2ef-906d-4c5f-8a61-e350b84d90cf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=636c21fa-d6bd-405e-95ed-d59498827d6f) old=Port_Binding(mac=['fa:16:3e:af:3b:02 10.100.0.19 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.19/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:device_owner': 'network:distributed', 
'neutron:mtu': '', 'neutron:network_name': 'neutron-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:07:37 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:37.093 158130 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 636c21fa-d6bd-405e-95ed-d59498827d6f in datapath 744b5a82-3c5c-4b41-ba44-527244a209c4 updated#033[00m Nov 28 05:07:37 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:37.096 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 744b5a82-3c5c-4b41-ba44-527244a209c4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:07:37 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:37.097 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[bdc38dfe-92c9-4160-b4bd-4a4e7e246c07]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:07:37 localhost systemd[1]: var-lib-containers-storage-overlay-127f1433ad0e975c7d90441f253c42229b39618ab4ce9e24e298473f69915576-merged.mount: Deactivated successfully. Nov 28 05:07:37 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3055ecc468897a11ebcb877d0194042c517b3fedc96876f552471a5d8cbabedf-userdata-shm.mount: Deactivated successfully. Nov 28 05:07:37 localhost systemd[1]: run-netns-qdhcp\x2ddaa663db\x2d7797\x2d4de5\x2da3b0\x2db0197a7ec3d6.mount: Deactivated successfully. 
Nov 28 05:07:37 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:07:37.733 261084 INFO neutron.agent.linux.ip_lib [None req-c525de8f-f56d-43be-a185-942dd3110014 - - - - - -] Device tap00e4c315-3e cannot be used as it has no MAC address#033[00m Nov 28 05:07:37 localhost nova_compute[279673]: 2025-11-28 10:07:37.755 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:37 localhost kernel: device tap00e4c315-3e entered promiscuous mode Nov 28 05:07:37 localhost NetworkManager[5967]: [1764324457.7629] manager: (tap00e4c315-3e): new Generic device (/org/freedesktop/NetworkManager/Devices/73) Nov 28 05:07:37 localhost nova_compute[279673]: 2025-11-28 10:07:37.762 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:37 localhost ovn_controller[152322]: 2025-11-28T10:07:37Z|00456|binding|INFO|Claiming lport 00e4c315-3e1e-4938-965c-c4f68912eeb6 for this chassis. Nov 28 05:07:37 localhost ovn_controller[152322]: 2025-11-28T10:07:37Z|00457|binding|INFO|00e4c315-3e1e-4938-965c-c4f68912eeb6: Claiming unknown Nov 28 05:07:37 localhost systemd-udevd[327387]: Network interface NamePolicy= disabled on kernel command line. 
Nov 28 05:07:37 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:37.780 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-ad4aa8e2-9c92-45f9-bbe9-94669f61eefc', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ad4aa8e2-9c92-45f9-bbe9-94669f61eefc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ae10569a38284f298c961498da620c5f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=33dc3788-7d28-4537-aa7d-ffee00e0827e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=00e4c315-3e1e-4938-965c-c4f68912eeb6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:07:37 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:37.782 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 00e4c315-3e1e-4938-965c-c4f68912eeb6 in datapath ad4aa8e2-9c92-45f9-bbe9-94669f61eefc bound to our chassis#033[00m Nov 28 05:07:37 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:37.783 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ad4aa8e2-9c92-45f9-bbe9-94669f61eefc or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:07:37 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:37.784 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[6673fee1-4b97-4476-a6e2-d053176068de]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:07:37 localhost ovn_controller[152322]: 2025-11-28T10:07:37Z|00458|binding|INFO|Setting lport 00e4c315-3e1e-4938-965c-c4f68912eeb6 ovn-installed in OVS Nov 28 05:07:37 localhost ovn_controller[152322]: 2025-11-28T10:07:37Z|00459|binding|INFO|Setting lport 00e4c315-3e1e-4938-965c-c4f68912eeb6 up in Southbound Nov 28 05:07:37 localhost nova_compute[279673]: 2025-11-28 10:07:37.803 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:37 localhost nova_compute[279673]: 2025-11-28 10:07:37.836 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:37 localhost nova_compute[279673]: 2025-11-28 10:07:37.861 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:38 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e197 do_prune osdmap full prune enabled Nov 28 05:07:38 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e198 e198: 6 total, 6 up, 6 in Nov 28 05:07:38 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e198: 6 total, 6 up, 6 in Nov 28 05:07:38 localhost ovn_controller[152322]: 2025-11-28T10:07:38Z|00460|binding|INFO|Removing iface tap00e4c315-3e ovn-installed in OVS Nov 28 05:07:38 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:38.660 158130 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 671b42da-83dc-42d9-a64b-2b08afdb0874 with type ""#033[00m Nov 28 
05:07:38 localhost ovn_controller[152322]: 2025-11-28T10:07:38Z|00461|binding|INFO|Removing lport 00e4c315-3e1e-4938-965c-c4f68912eeb6 ovn-installed in OVS Nov 28 05:07:38 localhost nova_compute[279673]: 2025-11-28 10:07:38.700 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:38 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:38.701 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-ad4aa8e2-9c92-45f9-bbe9-94669f61eefc', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ad4aa8e2-9c92-45f9-bbe9-94669f61eefc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ae10569a38284f298c961498da620c5f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=33dc3788-7d28-4537-aa7d-ffee00e0827e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=00e4c315-3e1e-4938-965c-c4f68912eeb6) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:07:38 localhost podman[327442]: Nov 28 05:07:38 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:38.704 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 
00e4c315-3e1e-4938-965c-c4f68912eeb6 in datapath ad4aa8e2-9c92-45f9-bbe9-94669f61eefc unbound from our chassis#033[00m Nov 28 05:07:38 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:38.705 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ad4aa8e2-9c92-45f9-bbe9-94669f61eefc or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:07:38 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:38.706 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[74cc4119-f1d8-4d2c-98b9-fe44647cae50]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:07:38 localhost podman[327442]: 2025-11-28 10:07:38.716962396 +0000 UTC m=+0.131464202 container create 121ee354b14d751973dfa6418a9f1066874b780d89e8d75d41c201c608817066 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ad4aa8e2-9c92-45f9-bbe9-94669f61eefc, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:07:38 localhost podman[327442]: 2025-11-28 10:07:38.636567469 +0000 UTC m=+0.051069295 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:07:38 localhost systemd[1]: Started libpod-conmon-121ee354b14d751973dfa6418a9f1066874b780d89e8d75d41c201c608817066.scope. Nov 28 05:07:38 localhost systemd[1]: Started libcrun container. 
Nov 28 05:07:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1cea78f4777bc89c99ec85a41246e19eead3cfa72d20170a383cd0e27dbabb0f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:07:38 localhost podman[327442]: 2025-11-28 10:07:38.79174214 +0000 UTC m=+0.206243936 container init 121ee354b14d751973dfa6418a9f1066874b780d89e8d75d41c201c608817066 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ad4aa8e2-9c92-45f9-bbe9-94669f61eefc, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Nov 28 05:07:38 localhost podman[327442]: 2025-11-28 10:07:38.800738807 +0000 UTC m=+0.215240603 container start 121ee354b14d751973dfa6418a9f1066874b780d89e8d75d41c201c608817066 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ad4aa8e2-9c92-45f9-bbe9-94669f61eefc, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 28 05:07:38 localhost dnsmasq[327460]: started, version 2.85 cachesize 150 Nov 28 05:07:38 localhost dnsmasq[327460]: DNS service limited to local subnets Nov 28 05:07:38 localhost dnsmasq[327460]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:07:38 localhost dnsmasq[327460]: warning: no upstream servers 
configured Nov 28 05:07:38 localhost dnsmasq-dhcp[327460]: DHCPv6, static leases only on 2001:db8::, lease time 1d Nov 28 05:07:38 localhost dnsmasq[327460]: read /var/lib/neutron/dhcp/ad4aa8e2-9c92-45f9-bbe9-94669f61eefc/addn_hosts - 0 addresses Nov 28 05:07:38 localhost dnsmasq-dhcp[327460]: read /var/lib/neutron/dhcp/ad4aa8e2-9c92-45f9-bbe9-94669f61eefc/host Nov 28 05:07:38 localhost dnsmasq-dhcp[327460]: read /var/lib/neutron/dhcp/ad4aa8e2-9c92-45f9-bbe9-94669f61eefc/opts Nov 28 05:07:38 localhost ovn_controller[152322]: 2025-11-28T10:07:38Z|00462|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:07:38 localhost nova_compute[279673]: 2025-11-28 10:07:38.886 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:38 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:07:38.963 261084 INFO neutron.agent.dhcp.agent [None req-9c0aa46a-b06b-4fae-a46e-f35377c92106 - - - - - -] DHCP configuration for ports {'41918217-ad17-401a-852c-a666f673df3f'} is completed#033[00m Nov 28 05:07:39 localhost dnsmasq[327460]: exiting on receipt of SIGTERM Nov 28 05:07:39 localhost podman[327478]: 2025-11-28 10:07:39.035732308 +0000 UTC m=+0.048916888 container kill 121ee354b14d751973dfa6418a9f1066874b780d89e8d75d41c201c608817066 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ad4aa8e2-9c92-45f9-bbe9-94669f61eefc, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS) Nov 28 05:07:39 localhost systemd[1]: libpod-121ee354b14d751973dfa6418a9f1066874b780d89e8d75d41c201c608817066.scope: Deactivated 
successfully. Nov 28 05:07:39 localhost podman[327493]: 2025-11-28 10:07:39.105641142 +0000 UTC m=+0.052280921 container died 121ee354b14d751973dfa6418a9f1066874b780d89e8d75d41c201c608817066 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ad4aa8e2-9c92-45f9-bbe9-94669f61eefc, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true) Nov 28 05:07:39 localhost podman[327493]: 2025-11-28 10:07:39.150865895 +0000 UTC m=+0.097505634 container remove 121ee354b14d751973dfa6418a9f1066874b780d89e8d75d41c201c608817066 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ad4aa8e2-9c92-45f9-bbe9-94669f61eefc, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Nov 28 05:07:39 localhost systemd[1]: libpod-conmon-121ee354b14d751973dfa6418a9f1066874b780d89e8d75d41c201c608817066.scope: Deactivated successfully. 
Nov 28 05:07:39 localhost nova_compute[279673]: 2025-11-28 10:07:39.166 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:39 localhost kernel: device tap00e4c315-3e left promiscuous mode Nov 28 05:07:39 localhost nova_compute[279673]: 2025-11-28 10:07:39.183 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:39 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:07:39.213 261084 INFO neutron.agent.dhcp.agent [None req-5126da88-cfac-4ecb-b04a-5153371e68f8 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:07:39 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:07:39.214 261084 INFO neutron.agent.dhcp.agent [None req-5126da88-cfac-4ecb-b04a-5153371e68f8 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:07:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e198 do_prune osdmap full prune enabled Nov 28 05:07:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e199 e199: 6 total, 6 up, 6 in Nov 28 05:07:39 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e199: 6 total, 6 up, 6 in Nov 28 05:07:39 localhost systemd[1]: var-lib-containers-storage-overlay-1cea78f4777bc89c99ec85a41246e19eead3cfa72d20170a383cd0e27dbabb0f-merged.mount: Deactivated successfully. Nov 28 05:07:39 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-121ee354b14d751973dfa6418a9f1066874b780d89e8d75d41c201c608817066-userdata-shm.mount: Deactivated successfully. Nov 28 05:07:39 localhost systemd[1]: run-netns-qdhcp\x2dad4aa8e2\x2d9c92\x2d45f9\x2dbbe9\x2d94669f61eefc.mount: Deactivated successfully. 
Nov 28 05:07:39 localhost neutron_sriov_agent[254147]: 2025-11-28 10:07:39.727 2 INFO neutron.agent.securitygroups_rpc [None req-c0e4f748-a4dd-449c-b793-430f30c9256f 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['58a0f932-b9f0-4bad-a5cb-a3c8cba7c65b']#033[00m Nov 28 05:07:40 localhost podman[238687]: time="2025-11-28T10:07:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:07:40 localhost podman[238687]: @ - - [28/Nov/2025:10:07:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157512 "" "Go-http-client/1.1" Nov 28 05:07:40 localhost podman[238687]: @ - - [28/Nov/2025:10:07:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19747 "" "Go-http-client/1.1" Nov 28 05:07:40 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e199 do_prune osdmap full prune enabled Nov 28 05:07:40 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e200 e200: 6 total, 6 up, 6 in Nov 28 05:07:40 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e200: 6 total, 6 up, 6 in Nov 28 05:07:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 05:07:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. 
Nov 28 05:07:40 localhost podman[327520]: 2025-11-28 10:07:40.856672326 +0000 UTC m=+0.085364821 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 05:07:40 localhost podman[327520]: 2025-11-28 10:07:40.869388618 +0000 UTC m=+0.098081113 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 28 05:07:40 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 05:07:40 localhost nova_compute[279673]: 2025-11-28 10:07:40.894 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:40 localhost podman[327521]: 2025-11-28 10:07:40.969383648 +0000 UTC m=+0.193567735 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.vendor=CentOS, container_name=multipathd) Nov 28 05:07:40 localhost podman[327521]: 2025-11-28 10:07:40.980324926 +0000 UTC m=+0.204508973 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 28 05:07:40 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated 
successfully. Nov 28 05:07:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e200 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:07:41 localhost nova_compute[279673]: 2025-11-28 10:07:41.446 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e200 do_prune osdmap full prune enabled Nov 28 05:07:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e201 e201: 6 total, 6 up, 6 in Nov 28 05:07:41 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e201: 6 total, 6 up, 6 in Nov 28 05:07:42 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e201 do_prune osdmap full prune enabled Nov 28 05:07:42 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e202 e202: 6 total, 6 up, 6 in Nov 28 05:07:42 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e202: 6 total, 6 up, 6 in Nov 28 05:07:42 localhost nova_compute[279673]: 2025-11-28 10:07:42.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:07:43 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:43.483 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:3b:02 10.100.0.19 10.100.0.2 10.100.0.34'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28 10.100.0.2/28 
10.100.0.34/28', 'neutron:device_id': 'ovnmeta-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=af9fa2ef-906d-4c5f-8a61-e350b84d90cf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=636c21fa-d6bd-405e-95ed-d59498827d6f) old=Port_Binding(mac=['fa:16:3e:af:3b:02 10.100.0.19 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.19/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-744b5a82-3c5c-4b41-ba44-527244a209c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e7a07c97c664076bc825e05137c574c', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:07:43 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:43.485 158130 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 636c21fa-d6bd-405e-95ed-d59498827d6f in datapath 744b5a82-3c5c-4b41-ba44-527244a209c4 updated#033[00m Nov 28 05:07:43 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:43.488 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 744b5a82-3c5c-4b41-ba44-527244a209c4, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:07:43 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:43.489 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[4d2b736c-053b-484f-8a47-d373ec4aef9a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:07:43 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Nov 28 05:07:43 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1359194721' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Nov 28 05:07:44 localhost neutron_sriov_agent[254147]: 2025-11-28 10:07:44.329 2 INFO neutron.agent.securitygroups_rpc [None req-cecabd37-7803-4c2d-a13a-d3905bbc0cfc 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['18fcd73e-4837-425f-bf44-9ed4ac2aa187', '58a0f932-b9f0-4bad-a5cb-a3c8cba7c65b', 'bec6547e-445f-4500-b371-6e2fc240d4db']#033[00m Nov 28 05:07:44 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e202 do_prune osdmap full prune enabled Nov 28 05:07:44 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e203 e203: 6 total, 6 up, 6 in Nov 28 05:07:44 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e203: 6 total, 6 up, 6 in Nov 28 05:07:44 localhost neutron_sriov_agent[254147]: 2025-11-28 10:07:44.742 2 INFO neutron.agent.securitygroups_rpc [None req-d99eee51-2169-45db-889c-fcddfc1e6db2 6b80a7d6f65f4ebe8363ccaae26f3e87 ae10569a38284f298c961498da620c5f - - default default] Security group member updated ['c5eee24b-0bed-4035-a2ab-e6c531c94e43']#033[00m Nov 28 05:07:44 localhost nova_compute[279673]: 2025-11-28 10:07:44.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances 
run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:07:44 localhost nova_compute[279673]: 2025-11-28 10:07:44.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:07:45 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e203 do_prune osdmap full prune enabled Nov 28 05:07:45 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e204 e204: 6 total, 6 up, 6 in Nov 28 05:07:45 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e204: 6 total, 6 up, 6 in Nov 28 05:07:45 localhost nova_compute[279673]: 2025-11-28 10:07:45.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:07:45 localhost nova_compute[279673]: 2025-11-28 10:07:45.770 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 28 05:07:45 localhost nova_compute[279673]: 2025-11-28 10:07:45.896 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:46 localhost neutron_sriov_agent[254147]: 2025-11-28 10:07:46.121 2 INFO neutron.agent.securitygroups_rpc [None req-ff6bd390-74ec-4285-8021-b1b301c7b944 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['18fcd73e-4837-425f-bf44-9ed4ac2aa187', 'bec6547e-445f-4500-b371-6e2fc240d4db']#033[00m Nov 28 05:07:46 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e204 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:07:46 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e204 do_prune osdmap full prune enabled Nov 28 05:07:46 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e205 e205: 6 total, 6 up, 6 in Nov 28 05:07:46 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e205: 6 total, 6 up, 6 in Nov 28 05:07:46 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 28 05:07:46 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/611242034' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 28 05:07:46 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 28 05:07:46 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/611242034' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 28 05:07:46 localhost nova_compute[279673]: 2025-11-28 10:07:46.483 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:46 localhost nova_compute[279673]: 2025-11-28 10:07:46.767 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:07:46 localhost neutron_sriov_agent[254147]: 2025-11-28 10:07:46.872 2 INFO neutron.agent.securitygroups_rpc [None req-02f5cc80-e4fe-45b5-9f60-d631f95eb2c9 6b80a7d6f65f4ebe8363ccaae26f3e87 ae10569a38284f298c961498da620c5f - - default default] Security group member updated ['c5eee24b-0bed-4035-a2ab-e6c531c94e43']#033[00m Nov 28 05:07:47 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:07:47.002 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:07:47 localhost podman[327578]: 2025-11-28 10:07:47.157767749 +0000 UTC m=+0.064315382 container kill 99c6926edfef36e785cbd98c29296e1647aa16a0dca7122135393dbf02832c2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:07:47 localhost dnsmasq[327038]: exiting on receipt of SIGTERM Nov 28 05:07:47 localhost systemd[1]: tmp-crun.mmSGk4.mount: 
Deactivated successfully. Nov 28 05:07:47 localhost systemd[1]: libpod-99c6926edfef36e785cbd98c29296e1647aa16a0dca7122135393dbf02832c2d.scope: Deactivated successfully. Nov 28 05:07:47 localhost podman[327593]: 2025-11-28 10:07:47.213452935 +0000 UTC m=+0.040447397 container died 99c6926edfef36e785cbd98c29296e1647aa16a0dca7122135393dbf02832c2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:07:47 localhost podman[327593]: 2025-11-28 10:07:47.238009562 +0000 UTC m=+0.065003994 container cleanup 99c6926edfef36e785cbd98c29296e1647aa16a0dca7122135393dbf02832c2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:07:47 localhost systemd[1]: libpod-conmon-99c6926edfef36e785cbd98c29296e1647aa16a0dca7122135393dbf02832c2d.scope: Deactivated successfully. 
Nov 28 05:07:47 localhost podman[327594]: 2025-11-28 10:07:47.298687192 +0000 UTC m=+0.121397333 container remove 99c6926edfef36e785cbd98c29296e1647aa16a0dca7122135393dbf02832c2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-92e1c9c1-574e-40d0-8b6a-bd313cc5f7d4, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 28 05:07:47 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e205 do_prune osdmap full prune enabled Nov 28 05:07:47 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e206 e206: 6 total, 6 up, 6 in Nov 28 05:07:47 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e206: 6 total, 6 up, 6 in Nov 28 05:07:47 localhost nova_compute[279673]: 2025-11-28 10:07:47.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:07:47 localhost nova_compute[279673]: 2025-11-28 10:07:47.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:07:48 localhost openstack_network_exporter[240658]: ERROR 10:07:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:07:48 localhost openstack_network_exporter[240658]: ERROR 10:07:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 
05:07:48 localhost openstack_network_exporter[240658]: ERROR 10:07:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 05:07:48 localhost openstack_network_exporter[240658]: ERROR 10:07:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 05:07:48 localhost openstack_network_exporter[240658]: Nov 28 05:07:48 localhost openstack_network_exporter[240658]: ERROR 10:07:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 05:07:48 localhost openstack_network_exporter[240658]: Nov 28 05:07:48 localhost systemd[1]: var-lib-containers-storage-overlay-f82a10448f7eafab39fdb6ae375b218bc52faf1c65a99161c7b78ef796fa5b5c-merged.mount: Deactivated successfully. Nov 28 05:07:48 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-99c6926edfef36e785cbd98c29296e1647aa16a0dca7122135393dbf02832c2d-userdata-shm.mount: Deactivated successfully. Nov 28 05:07:48 localhost nova_compute[279673]: 2025-11-28 10:07:48.229 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:07:48 localhost nova_compute[279673]: 2025-11-28 10:07:48.230 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:07:48 localhost nova_compute[279673]: 2025-11-28 10:07:48.230 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:07:48 localhost nova_compute[279673]: 2025-11-28 10:07:48.230 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 05:07:48 localhost nova_compute[279673]: 2025-11-28 10:07:48.231 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:07:48 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:48.372 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:07:48 localhost nova_compute[279673]: 2025-11-28 10:07:48.372 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:07:48 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:48.375 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 28 05:07:48 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:07:48.457 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:07:48 localhost systemd[1]: run-netns-qdhcp\x2d92e1c9c1\x2d574e\x2d40d0\x2d8b6a\x2dbd313cc5f7d4.mount: Deactivated successfully. Nov 28 05:07:48 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:07:48.548 261084 INFO neutron.agent.dhcp.agent [None req-1464a885-bb69-4317-98d8-b05b7a31be10 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:07:48 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:07:48 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1635609151' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:07:48 localhost nova_compute[279673]: 2025-11-28 10:07:48.680 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:07:48 localhost nova_compute[279673]: 2025-11-28 10:07:48.787 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 05:07:48 localhost nova_compute[279673]: 2025-11-28 10:07:48.788 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 05:07:48 localhost nova_compute[279673]: 2025-11-28 10:07:48.996 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 05:07:48 localhost nova_compute[279673]: 2025-11-28 10:07:48.998 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11102MB free_disk=41.7000732421875GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 05:07:48 localhost nova_compute[279673]: 2025-11-28 10:07:48.999 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 05:07:49 localhost nova_compute[279673]: 2025-11-28 10:07:48.999 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 05:07:49 localhost nova_compute[279673]: 2025-11-28 10:07:49.561 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 05:07:49 localhost nova_compute[279673]: 2025-11-28 10:07:49.562 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 05:07:49 localhost nova_compute[279673]: 2025-11-28 10:07:49.562 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 05:07:49 localhost nova_compute[279673]: 2025-11-28 10:07:49.622 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 05:07:49 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e206 do_prune osdmap full prune enabled
Nov 28 05:07:49 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e207 e207: 6 total, 6 up, 6 in
Nov 28 05:07:49 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e207: 6 total, 6 up, 6 in
Nov 28 05:07:50 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 05:07:50 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1418648319' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 05:07:50 localhost nova_compute[279673]: 2025-11-28 10:07:50.078 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 05:07:50 localhost nova_compute[279673]: 2025-11-28 10:07:50.084 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 05:07:50 localhost nova_compute[279673]: 2025-11-28 10:07:50.131 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 05:07:50 localhost nova_compute[279673]: 2025-11-28 10:07:50.134 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 05:07:50 localhost nova_compute[279673]: 2025-11-28 10:07:50.134 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 05:07:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 05:07:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:50.846 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 05:07:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:50.846 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 05:07:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:50.847 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 05:07:50 localhost podman[327665]: 2025-11-28 10:07:50.85241129 +0000 UTC m=+0.083982049 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, architecture=x86_64, config_id=edpm, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 28 05:07:50 localhost podman[327665]: 2025-11-28 10:07:50.890638558 +0000 UTC m=+0.122209377 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, release=1755695350, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 05:07:50 localhost nova_compute[279673]: 2025-11-28 10:07:50.898 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:07:50 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 05:07:51 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:07:51.392 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 05:07:51 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e207 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 05:07:51 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e207 do_prune osdmap full prune enabled
Nov 28 05:07:51 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e208 e208: 6 total, 6 up, 6 in
Nov 28 05:07:51 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e208: 6 total, 6 up, 6 in
Nov 28 05:07:51 localhost nova_compute[279673]: 2025-11-28 10:07:51.485 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:07:51 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 05:07:51 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3232289778' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 05:07:51 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 05:07:51 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3232289778' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 05:07:52 localhost nova_compute[279673]: 2025-11-28 10:07:52.135 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 05:07:52 localhost nova_compute[279673]: 2025-11-28 10:07:52.136 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 05:07:52 localhost nova_compute[279673]: 2025-11-28 10:07:52.136 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 05:07:52 localhost nova_compute[279673]: 2025-11-28 10:07:52.266 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 05:07:52 localhost nova_compute[279673]: 2025-11-28 10:07:52.266 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 05:07:52 localhost nova_compute[279673]: 2025-11-28 10:07:52.267 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 05:07:52 localhost nova_compute[279673]: 2025-11-28 10:07:52.267 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 05:07:52 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e208 do_prune osdmap full prune enabled
Nov 28 05:07:52 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e209 e209: 6 total, 6 up, 6 in
Nov 28 05:07:52 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e209: 6 total, 6 up, 6 in
Nov 28 05:07:52 localhost neutron_sriov_agent[254147]: 2025-11-28 10:07:52.454 2 INFO neutron.agent.securitygroups_rpc [None req-6326bf78-d29d-46c0-b3b7-72df824a50bd 646af4638a054ec4b7eb9e5438a2ab60 5e7a07c97c664076bc825e05137c574c - - default default] Security group member updated ['11213b09-f0e7-4cd0-8dcb-72dc58a4cd0b']
Nov 28 05:07:53 localhost nova_compute[279673]: 2025-11-28 10:07:53.317 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 05:07:53 localhost nova_compute[279673]: 2025-11-28 10:07:53.506 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 05:07:53 localhost nova_compute[279673]: 2025-11-28 10:07:53.507 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 05:07:53 localhost nova_compute[279673]: 2025-11-28 10:07:53.507 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 05:07:53 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e209 do_prune osdmap full prune enabled
Nov 28 05:07:53 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e210 e210: 6 total, 6 up, 6 in
Nov 28 05:07:53 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e210: 6 total, 6 up, 6 in
Nov 28 05:07:54 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e210 do_prune osdmap full prune enabled
Nov 28 05:07:54 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e211 e211: 6 total, 6 up, 6 in
Nov 28 05:07:54 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e211: 6 total, 6 up, 6 in
Nov 28 05:07:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 05:07:55 localhost systemd[1]: tmp-crun.uFpSun.mount: Deactivated successfully.
Nov 28 05:07:55 localhost podman[327686]: 2025-11-28 10:07:55.858634275 +0000 UTC m=+0.094828892 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Nov 28 05:07:55 localhost podman[327686]: 2025-11-28 10:07:55.86950569 +0000 UTC m=+0.105700277 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Nov 28 05:07:55 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 05:07:55 localhost nova_compute[279673]: 2025-11-28 10:07:55.900 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:07:56 localhost ovn_metadata_agent[158125]: 2025-11-28 10:07:56.377 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 05:07:56 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e211 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 05:07:56 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e211 do_prune osdmap full prune enabled
Nov 28 05:07:56 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e212 e212: 6 total, 6 up, 6 in
Nov 28 05:07:56 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e212: 6 total, 6 up, 6 in
Nov 28 05:07:56 localhost nova_compute[279673]: 2025-11-28 10:07:56.487 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:07:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 05:07:57 localhost podman[327708]: 2025-11-28 10:07:57.838953884 +0000 UTC m=+0.077709275 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 05:07:57 localhost podman[327708]: 2025-11-28 10:07:57.844801754 +0000 UTC m=+0.083557115 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 05:07:57 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 05:07:58 localhost nova_compute[279673]: 2025-11-28 10:07:58.139 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 05:07:59 localhost neutron_sriov_agent[254147]: 2025-11-28 10:07:59.975 2 INFO neutron.agent.securitygroups_rpc [None req-db55947c-11ec-46cf-8b4d-c6d9fdfd5571 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['6089c18b-265b-455e-adb1-d3701c826867']
Nov 28 05:08:00 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:00.370 2 INFO neutron.agent.securitygroups_rpc [None req-c7648c00-3d9b-484d-a8fa-078b96af4727 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['6089c18b-265b-455e-adb1-d3701c826867']
Nov 28 05:08:00 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:00.423 2 INFO neutron.agent.securitygroups_rpc [req-b7f1a51c-049a-4a1b-9cd5-f5d4f2c3744c req-dd3110e0-51f5-4337-ac22-6a0998bcc00c 87cf17c48bab44279b2e7eca9e0882a2 d3c0d1ce8d854a7b9ffc953e88cd2c44 - - default default] Security group member updated ['c52603b5-5f47-4123-b8fe-cc9f0a56d914']
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.676 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.677 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.677 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.681 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'be5790d3-fbfe-48cc-8616-ffeac053ccba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:08:00.677387', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '1ee69e92-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.849278866, 'message_signature': '787a5a161a65cd00b8218feff9d59b51ac107d5a9df60b386c15c906906a5250'}]}, 'timestamp': '2025-11-28 10:08:00.682664', '_unique_id': '71bfa88aa0f142e29764fceb3f8f6d49'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:08:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.684 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.685 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.685 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ccb3c9c0-e991-485c-b2ff-2bbcb89520bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:08:00.685541', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '1ee723da-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.849278866, 'message_signature': '771e76ad5d87a316ecb3f8b61a92dff0c8a7166e5c5525d03124c316424852de'}]}, 'timestamp': '2025-11-28 10:08:00.686008', '_unique_id': 'b49b5debc4c84e399eaab7ee4dd17810'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:08:00.686 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:08:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:08:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.686 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.688 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.688 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4b4a8a4d-2cf1-4007-bc02-ddff0ae3a66a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:08:00.688187', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '1ee78b36-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.849278866, 'message_signature': 'c3868b7dd8f0e841828363811df492f5dc526a826494aa72bf96f56952e25d0e'}]}, 'timestamp': '2025-11-28 10:08:00.688652', '_unique_id': 'ab5a97db60cd44d39acc8234361cc98d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:08:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.689 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.690 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.690 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.690 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f08f13cc-08df-49ef-be0c-5f53a8efd67b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:08:00.690938', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '1ee7f882-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.849278866, 'message_signature': '0ab2e221c63939ec07d33e759ba42c7f2f951e32c62d3f621e796aa8d78ff02f'}]}, 'timestamp': '2025-11-28 10:08:00.691450', '_unique_id': '582cdb3ce55e4123b80d0952d8f7deca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.692 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.693 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.709 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 17630000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3a64570-a236-4ebc-81bc-64c7b3f33324', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17630000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:08:00.693537', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '1eeae128-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.881761447, 'message_signature': '125ed3b80a3da8c73370b315701298d3209758b9de7572b8678afbb5e108b360'}]}, 'timestamp': '2025-11-28 10:08:00.710510', '_unique_id': 'e63312a722234c71a65f425ed9103f86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.711 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.712 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.740 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.740 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '38ee98b8-6b68-42b6-bb8b-d840ec3e02c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:08:00.712825', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1eef800c-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.884719728, 'message_signature': 'a41986580a900725e88a143d9c97d14e7d62495c0b7358b3b4da8be4fbe87820'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:08:00.712825', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1eef9164-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.884719728, 'message_signature': 'bd6f8dbcce7d43b411e6a17c07fdfd720ca743fc4cbe78c298d1ab3962a17c42'}]}, 'timestamp': '2025-11-28 10:08:00.741241', '_unique_id': '029d9993920f47a8aa0b0e0ac09e4ed9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28
10:08:00.742 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.742 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.743 12 INFO 
ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.743 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 51.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b0953e16-0969-4b67-9442-3e68d2f6795e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.671875, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:08:00.743798', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '1ef008f6-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.881761447, 'message_signature': 
'6c558c1acd8cf2f7b3446a12bc495b16c89f0ae9909d80bb4177efaf6c8e8430'}]}, 'timestamp': '2025-11-28 10:08:00.744287', '_unique_id': '4f294d41408c40609879741b010ea03b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:08:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:08:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.745 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.746 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.746 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0ca549ff-a878-4b56-822f-966de80da965', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:08:00.746447', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '1ef06f1c-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.849278866, 'message_signature': '64240da5122c1c58245044c902c43f113b105cee5a4ae11ced3e614fa81e0a11'}]}, 'timestamp': '2025-11-28 10:08:00.746945', '_unique_id': '724d38e915e64d4f966e2592f8e32fd7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:08:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:08:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.747 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.748 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.749 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.749 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 1124743927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.749 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 22218411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a36f8b63-d922-4228-b49b-476e5cf4482f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1124743927, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:08:00.749224', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1ef0db82-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.884719728, 'message_signature': 'df838aa38e8f321a83d1fd209af32cd6c4f5bbaf17892da5be09357e0b7775c9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22218411, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:08:00.749224', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1ef0eb86-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.884719728, 'message_signature': '15f7d9ab38fffc9147ac52d51348028671135000a72d0f3c40781d41dd9d2d96'}]}, 'timestamp': '2025-11-28 10:08:00.750093', '_unique_id': 'e393e062ee0d4d74bb3734c7f16ddfe6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:08:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:08:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.751 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.752 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.752 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1439344424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.752 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 63315481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:08:00 
localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8141fc43-9139-4010-bb34-7702ee0cc7a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1439344424, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:08:00.752330', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1ef158c8-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.884719728, 'message_signature': 'f228323b9028861439b0528800f6491c68d6ca7e869932b67e3a937010b18d4f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63315481, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 
'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:08:00.752330', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1ef16a34-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.884719728, 'message_signature': '3b19b85686d26e93874e902d78626a9307de860842ff8e0b811a8fe690a0e526'}]}, 'timestamp': '2025-11-28 10:08:00.753309', '_unique_id': 'a1c7a7e705da44bca88ee38b187cc83d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in 
_ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR 
oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:08:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:08:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.754 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.755 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.755 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:08:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '401924f4-1ae6-466a-a99c-018474204d59', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:08:00.755501', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '1ef1d0dc-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.849278866, 'message_signature': '2fdcf0a8413f734bc797cb378c5fc1ef98a35e6db56a6229c4163f28969ca670'}]}, 'timestamp': '2025-11-28 10:08:00.755967', '_unique_id': '55797a652864478cba8a2af1a8e6b2a8'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR 
oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging with 
self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR 
oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:08:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.756 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.757 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.771 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.772 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'eb274843-5e8f-49eb-a74a-0791caf49fa8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:08:00.758095', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1ef45924-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.930011233, 'message_signature': 'c38e31573b2ffede32c2878e4230d9cc2d8d268f3f401f4ca558d1c475214542'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:08:00.758095', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 
'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1ef46d60-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.930011233, 'message_signature': '304980e490d74051d8f5c4b73193f3f81ec85db623d3c9879543eb2d6816c41a'}]}, 'timestamp': '2025-11-28 10:08:00.773098', '_unique_id': '3e92882fe14748148827a550d94ccd8e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:08:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:08:00.774 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:08:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:08:00.774 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.774 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.775 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.775 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '53aa3748-353b-4ae3-af2b-b817259656c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:08:00.775914', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '1ef4f0be-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.849278866, 'message_signature': '79988ea8a3cd6219c519d42624817219698af698c867530e1708048d80a33e98'}]}, 'timestamp': '2025-11-28 10:08:00.776449', '_unique_id': '687538df9687422a8c02838c18306e7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:08:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:08:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.777 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.779 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.779 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa3bb2aa-5c3f-48c6-babb-a32563c6d34b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:08:00.779598', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '1ef58358-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.849278866, 'message_signature': '02b6cb7fc59985b52df167714c04368576bafb4dbab56a20db83533d588d0c9c'}]}, 'timestamp': '2025-11-28 10:08:00.780319', '_unique_id': 'b36df62bb2c1444da010952e6cfb8a13'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:08:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 
05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.781 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.783 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.783 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd214fe69-39fd-4d48-88b0-376f85ea2644', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:08:00.783346', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '1ef61052-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.849278866, 'message_signature': 'f4679ad9d36c3172ca4936811cee3bbe9b3b8c5c8891c576776cdcdd2c9a2ae1'}]}, 'timestamp': '2025-11-28 10:08:00.783807', '_unique_id': '04b0d05b2a584b21bb829fe204ac246b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:08:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.784 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.785 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.785 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27561f6f-04b8-46fe-801d-2a8e24e06bc7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:08:00.785916', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '1ef675c4-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.849278866, 'message_signature': '10d1be8fb126e733cd7c75db48dfac2f2fb3a8946f6ec79009c7d2c108cda10e'}]}, 'timestamp': '2025-11-28 10:08:00.786402', '_unique_id': '2717a9fa624a4214a291c2f4b6676d39'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:08:00.787 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.787 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.788 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.788 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.788 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.789 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95f27a3a-629a-44c2-a0f2-6e5dd8afc54a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:08:00.788663', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1ef6e04a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.930011233, 'message_signature': '9290f4784dcb6d804a8c9f2c67eb55fa9fc216e47c8246f453bcab0a52c3eb20'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:08:00.788663', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1ef6f17a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.930011233, 'message_signature': '0b3672baab00173fd5c62104fbb2d08681932d14e442b845842965ea9a6a63b2'}]}, 'timestamp': '2025-11-28 10:08:00.789539', '_unique_id': '2a499300f95943209f1c4a05635db35c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.790 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.791 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.791 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.792 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb162322-9b11-4c76-b7f9-f49573689d57', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:08:00.791705', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1ef75638-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.930011233, 'message_signature': '6729ba7d410ac70188a03499688cf866e3a6eae3b9645266fddaff5805d012e0'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:08:00.791705', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1ef76772-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.930011233, 'message_signature': '1033354e8dff52a8a3fe00b08fa1ac3f6b26d0b6f761ab33d22e025a52d1bd33'}]}, 'timestamp': '2025-11-28 10:08:00.792589', '_unique_id': '317724244dcd43c3bc8ca91a8f8aae43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.793 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.794 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.795 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.795 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4393f2c-e8ce-4752-b001-0dde6c4e707f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:08:00.795058', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1ef7d9dc-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.884719728, 'message_signature': 'b5c86c3669ec3fa3aca6fb65b21a273ba1baf195a8410029dbcf79bd350ebc62'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:08:00.795058', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1ef7e94a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.884719728, 'message_signature': '609fda4cdbe3ce932e2d91b1916018283d136ca8711f0a80cad3d73d5ee79a63'}]}, 'timestamp': '2025-11-28 10:08:00.795879', '_unique_id': 'af12df17a2a7425fbc0dd7213afb6816'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging return
retry_over_time( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.796 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.797 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.797 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.798 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 
_stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f6e2c83c-2b13-410d-a748-bf5e41d70bef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:08:00.797943', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1ef84b4c-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.884719728, 'message_signature': '657003e21620bfd0acd1f1755d0e3be83947cbffa80276c9085ab3c15db6ca7b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': 
'9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:08:00.797943', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1ef85ad8-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.884719728, 'message_signature': 'aa9e2a670636c241fc3e30e56b6c39620d65265d036d4af6c21d909ef87028a5'}]}, 'timestamp': '2025-11-28 10:08:00.798786', '_unique_id': '9e458f9315204d0ebd943d005a26f86f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:08:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:08:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:08:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.799 12 ERROR oslo_messaging.notify.messaging Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.800 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.800 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.801 12 DEBUG 
ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '964445b0-af43-4b97-95d3-3e1de9d334cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:08:00.800937', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1ef8c40a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.884719728, 'message_signature': '597d133acb51312cabaf3d0e0a8d9f8ed43cd1ac3471d762105718cc5a35531e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 
'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:08:00.800937', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1ef8d3b4-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12314.884719728, 'message_signature': '618cc57e0f261d145e37b4c6c04b8817bb07e217bf59d2c320d0bc4d9c049691'}]}, 'timestamp': '2025-11-28 10:08:00.801878', '_unique_id': '185d7c7aad7340e3ba7cb6d56aa2089b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:08:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:08:00.802 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:08:00 localhost podman[327727]: 2025-11-28 10:08:00.860099405 +0000 UTC m=+0.092218033 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always',
'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:08:00 localhost podman[327727]: 2025-11-28 10:08:00.874904521 +0000 UTC m=+0.107023139 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:08:00 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. Nov 28 05:08:00 localhost nova_compute[279673]: 2025-11-28 10:08:00.902 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:08:00 localhost systemd[1]: tmp-crun.TnU2rj.mount: Deactivated successfully. 
Nov 28 05:08:00 localhost podman[327728]: 2025-11-28 10:08:00.965136981 +0000 UTC m=+0.196958740 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Nov 28 05:08:01 localhost podman[327728]: 2025-11-28 10:08:01.005423213 +0000 UTC m=+0.237244972 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 
'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:08:01 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. Nov 28 05:08:01 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1. 
Nov 28 05:08:01 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:01.204 2 INFO neutron.agent.securitygroups_rpc [None req-565e0ad3-78a4-4813-9446-c55fe79fb3b2 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['d7b997f9-4b8e-48df-a7bc-cf1a88435b19']#033[00m
Nov 28 05:08:01 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:01.334 2 INFO neutron.agent.securitygroups_rpc [None req-51f1f60e-f6ad-4b91-b7b5-2db567d0c6ed 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['15f5d617-f52a-49d8-bbd0-19868b48ee91']#033[00m
Nov 28 05:08:01 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:01.395 2 INFO neutron.agent.securitygroups_rpc [None req-84e0009b-c65d-4325-9480-9689cb3fdfb2 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['d7b997f9-4b8e-48df-a7bc-cf1a88435b19']#033[00m
Nov 28 05:08:01 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e212 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 05:08:01 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e212 do_prune osdmap full prune enabled
Nov 28 05:08:01 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e213 e213: 6 total, 6 up, 6 in
Nov 28 05:08:01 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e213: 6 total, 6 up, 6 in
Nov 28 05:08:01 localhost nova_compute[279673]: 2025-11-28 10:08:01.491 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:08:01 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:01.552 2 INFO neutron.agent.securitygroups_rpc [None req-f01fa5c9-de49-46e8-bc93-b89c3fed3f57 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['15f5d617-f52a-49d8-bbd0-19868b48ee91']#033[00m
Nov 28 05:08:01 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:08:01.772 261084 INFO neutron.agent.linux.ip_lib [None req-ffde1999-4035-4756-98bf-8def416c45d7 - - - - - -] Device tapdfb15593-fe cannot be used as it has no MAC address#033[00m
Nov 28 05:08:01 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:01.774 2 INFO neutron.agent.securitygroups_rpc [None req-05d81c06-d401-4205-8dff-14a86090a368 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['15f5d617-f52a-49d8-bbd0-19868b48ee91']#033[00m
Nov 28 05:08:01 localhost nova_compute[279673]: 2025-11-28 10:08:01.801 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:08:01 localhost kernel: device tapdfb15593-fe entered promiscuous mode
Nov 28 05:08:01 localhost ovn_controller[152322]: 2025-11-28T10:08:01Z|00463|binding|INFO|Claiming lport dfb15593-fe1f-4aaf-af25-32ab66c1a780 for this chassis.
Nov 28 05:08:01 localhost ovn_controller[152322]: 2025-11-28T10:08:01Z|00464|binding|INFO|dfb15593-fe1f-4aaf-af25-32ab66c1a780: Claiming unknown
Nov 28 05:08:01 localhost systemd-udevd[327781]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 05:08:01 localhost NetworkManager[5967]: [1764324481.8139] manager: (tapdfb15593-fe): new Generic device (/org/freedesktop/NetworkManager/Devices/74) Nov 28 05:08:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:08:01.830 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-39526589-27d4-41ad-9aef-f63f534ecbf0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-39526589-27d4-41ad-9aef-f63f534ecbf0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9ce143270a4649669232b53b6a44e4ba', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7ab6281c-5843-47df-b4b0-ac9f0fa87790, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=dfb15593-fe1f-4aaf-af25-32ab66c1a780) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:08:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:08:01.834 158130 INFO neutron.agent.ovn.metadata.agent [-] Port dfb15593-fe1f-4aaf-af25-32ab66c1a780 in datapath 39526589-27d4-41ad-9aef-f63f534ecbf0 bound to our chassis#033[00m Nov 28 05:08:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:08:01.837 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for 
network 39526589-27d4-41ad-9aef-f63f534ecbf0 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Nov 28 05:08:01 localhost ovn_metadata_agent[158125]: 2025-11-28 10:08:01.838 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[23b7541e-bd13-4cc8-b102-47bcf4ee2e60]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:08:01 localhost journal[227875]: ethtool ioctl error on tapdfb15593-fe: No such device Nov 28 05:08:01 localhost journal[227875]: ethtool ioctl error on tapdfb15593-fe: No such device Nov 28 05:08:01 localhost ovn_controller[152322]: 2025-11-28T10:08:01Z|00465|binding|INFO|Setting lport dfb15593-fe1f-4aaf-af25-32ab66c1a780 ovn-installed in OVS Nov 28 05:08:01 localhost ovn_controller[152322]: 2025-11-28T10:08:01Z|00466|binding|INFO|Setting lport dfb15593-fe1f-4aaf-af25-32ab66c1a780 up in Southbound Nov 28 05:08:01 localhost journal[227875]: ethtool ioctl error on tapdfb15593-fe: No such device Nov 28 05:08:01 localhost nova_compute[279673]: 2025-11-28 10:08:01.851 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:08:01 localhost journal[227875]: ethtool ioctl error on tapdfb15593-fe: No such device Nov 28 05:08:01 localhost journal[227875]: ethtool ioctl error on tapdfb15593-fe: No such device Nov 28 05:08:01 localhost journal[227875]: ethtool ioctl error on tapdfb15593-fe: No such device Nov 28 05:08:01 localhost journal[227875]: ethtool ioctl error on tapdfb15593-fe: No such device Nov 28 05:08:01 localhost journal[227875]: ethtool ioctl error on tapdfb15593-fe: No such device Nov 28 05:08:01 localhost nova_compute[279673]: 2025-11-28 10:08:01.892 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:08:01 localhost nova_compute[279673]: 2025-11-28 10:08:01.926 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:08:01 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:01.939 2 INFO neutron.agent.securitygroups_rpc [None req-22968046-714b-4dc6-9b18-f55692eae2e8 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['15f5d617-f52a-49d8-bbd0-19868b48ee91']#033[00m Nov 28 05:08:02 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:02.109 2 INFO neutron.agent.securitygroups_rpc [None req-1c9f5803-20cb-4e69-804a-0617a068dfc7 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['15f5d617-f52a-49d8-bbd0-19868b48ee91']#033[00m Nov 28 05:08:02 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:02.253 2 INFO neutron.agent.securitygroups_rpc [None req-2bab2459-ad99-4514-94d3-affbaffcf884 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['b758e87b-4de3-47dd-a896-167b766787f3']#033[00m Nov 28 05:08:02 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:02.782 2 INFO neutron.agent.securitygroups_rpc [None req-c50c53a2-fd90-43a9-8f31-8eb2d8fc3f23 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['b758e87b-4de3-47dd-a896-167b766787f3']#033[00m Nov 28 05:08:02 localhost podman[327852]: Nov 28 05:08:02 localhost podman[327852]: 2025-11-28 10:08:02.838231476 +0000 UTC m=+0.094030958 container create 7083d3dcda2dcbc1b280d612b29f695e76343b2129a3aab4802e3bcaf207ce36 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-39526589-27d4-41ad-9aef-f63f534ecbf0, 
org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Nov 28 05:08:02 localhost systemd[1]: Started libpod-conmon-7083d3dcda2dcbc1b280d612b29f695e76343b2129a3aab4802e3bcaf207ce36.scope. Nov 28 05:08:02 localhost podman[327852]: 2025-11-28 10:08:02.794102976 +0000 UTC m=+0.049902478 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:08:02 localhost systemd[1]: Started libcrun container. Nov 28 05:08:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4341b7420827c25c7624eca3707ed25a6232f868ac4d13c1195965a03f452de4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:08:02 localhost podman[327852]: 2025-11-28 10:08:02.918077436 +0000 UTC m=+0.173876928 container init 7083d3dcda2dcbc1b280d612b29f695e76343b2129a3aab4802e3bcaf207ce36 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-39526589-27d4-41ad-9aef-f63f534ecbf0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 28 05:08:02 localhost podman[327852]: 2025-11-28 10:08:02.92924013 +0000 UTC m=+0.185039612 container start 7083d3dcda2dcbc1b280d612b29f695e76343b2129a3aab4802e3bcaf207ce36 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-39526589-27d4-41ad-9aef-f63f534ecbf0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, 
org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 28 05:08:02 localhost dnsmasq[327872]: started, version 2.85 cachesize 150
Nov 28 05:08:02 localhost dnsmasq[327872]: DNS service limited to local subnets
Nov 28 05:08:02 localhost dnsmasq[327872]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 05:08:02 localhost dnsmasq[327872]: warning: no upstream servers configured
Nov 28 05:08:02 localhost dnsmasq-dhcp[327872]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 05:08:02 localhost dnsmasq[327872]: read /var/lib/neutron/dhcp/39526589-27d4-41ad-9aef-f63f534ecbf0/addn_hosts - 0 addresses
Nov 28 05:08:02 localhost dnsmasq-dhcp[327872]: read /var/lib/neutron/dhcp/39526589-27d4-41ad-9aef-f63f534ecbf0/host
Nov 28 05:08:02 localhost dnsmasq-dhcp[327872]: read /var/lib/neutron/dhcp/39526589-27d4-41ad-9aef-f63f534ecbf0/opts
Nov 28 05:08:02 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:02.982 2 INFO neutron.agent.securitygroups_rpc [None req-be6115b0-0729-4a7d-96a1-2edd3462b47c 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['15f5d617-f52a-49d8-bbd0-19868b48ee91']#033[00m
Nov 28 05:08:03 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:03.025 2 INFO neutron.agent.securitygroups_rpc [None req-bd38caac-e12b-4351-8e6f-97a983a9711e 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['b758e87b-4de3-47dd-a896-167b766787f3']#033[00m
Nov 28 05:08:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:08:03.160 261084 INFO neutron.agent.dhcp.agent [None
req-a7d3a5cf-ca51-413f-864d-c6bf6e800648 - - - - - -] DHCP configuration for ports {'5d2c2476-c1d0-4426-9803-949f4055c439'} is completed#033[00m Nov 28 05:08:03 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:03.179 2 INFO neutron.agent.securitygroups_rpc [None req-edb67cac-fd5d-4927-8f94-9c49e0c903d7 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['b758e87b-4de3-47dd-a896-167b766787f3']#033[00m Nov 28 05:08:03 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:03.331 2 INFO neutron.agent.securitygroups_rpc [None req-20312780-c455-466a-b7ab-f675d7eabde1 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['b758e87b-4de3-47dd-a896-167b766787f3']#033[00m Nov 28 05:08:03 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:03.422 2 INFO neutron.agent.securitygroups_rpc [None req-14a5f88b-dc6d-4539-ba50-a6898f3db0a1 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['15f5d617-f52a-49d8-bbd0-19868b48ee91']#033[00m Nov 28 05:08:03 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:03.476 2 INFO neutron.agent.securitygroups_rpc [None req-13772359-a1cd-4764-8f15-d95cc5fef900 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['b758e87b-4de3-47dd-a896-167b766787f3']#033[00m Nov 28 05:08:03 localhost systemd[1]: tmp-crun.zoUCyR.mount: Deactivated successfully. 
Nov 28 05:08:03 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:03.869 2 INFO neutron.agent.securitygroups_rpc [None req-6ebeaf50-19b9-453a-b8c6-1888da8279d8 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['b758e87b-4de3-47dd-a896-167b766787f3']
Nov 28 05:08:03 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:03.889 2 INFO neutron.agent.securitygroups_rpc [None req-13ec4ccc-0950-4875-9a0e-1e0334716892 f56d2237e5b74576a33d9840c9346817 9ce143270a4649669232b53b6a44e4ba - - default default] Security group member updated ['f3e50b86-f5a6-4339-897f-e9e754c264f3']
Nov 28 05:08:03 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:03.892 2 INFO neutron.agent.securitygroups_rpc [None req-a9fad0b5-d55e-463a-8ffe-8066b04b35f2 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['15f5d617-f52a-49d8-bbd0-19868b48ee91']
Nov 28 05:08:03 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:08:03.948 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:08:03Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=0edceee0-9905-42aa-b928-f37bc201af3e, ip_allocation=immediate, mac_address=fa:16:3e:3c:35:7f, name=tempest-TagsExtTest-478727359, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:07:59Z, description=, dns_domain=, id=39526589-27d4-41ad-9aef-f63f534ecbf0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TagsExtTest-test-network-1109491877, port_security_enabled=True, project_id=9ce143270a4649669232b53b6a44e4ba, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42885, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2638, status=ACTIVE, subnets=['9308db1a-3a7e-47d2-a1c6-556103f500ef'], tags=[], tenant_id=9ce143270a4649669232b53b6a44e4ba, updated_at=2025-11-28T10:08:00Z, vlan_transparent=None, network_id=39526589-27d4-41ad-9aef-f63f534ecbf0, port_security_enabled=True, project_id=9ce143270a4649669232b53b6a44e4ba, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['f3e50b86-f5a6-4339-897f-e9e754c264f3'], standard_attr_id=2669, status=DOWN, tags=[], tenant_id=9ce143270a4649669232b53b6a44e4ba, updated_at=2025-11-28T10:08:03Z on network 39526589-27d4-41ad-9aef-f63f534ecbf0
Nov 28 05:08:04 localhost systemd[1]: tmp-crun.Ffqw6h.mount: Deactivated successfully.
Nov 28 05:08:04 localhost dnsmasq[327872]: read /var/lib/neutron/dhcp/39526589-27d4-41ad-9aef-f63f534ecbf0/addn_hosts - 1 addresses
Nov 28 05:08:04 localhost dnsmasq-dhcp[327872]: read /var/lib/neutron/dhcp/39526589-27d4-41ad-9aef-f63f534ecbf0/host
Nov 28 05:08:04 localhost dnsmasq-dhcp[327872]: read /var/lib/neutron/dhcp/39526589-27d4-41ad-9aef-f63f534ecbf0/opts
Nov 28 05:08:04 localhost podman[327890]: 2025-11-28 10:08:04.191587497 +0000 UTC m=+0.077642934 container kill 7083d3dcda2dcbc1b280d612b29f695e76343b2129a3aab4802e3bcaf207ce36 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-39526589-27d4-41ad-9aef-f63f534ecbf0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 05:08:04 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:04.234 2 INFO neutron.agent.securitygroups_rpc [None req-179f04c6-9ef2-42e7-8385-4b9a01b4f84f 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['b758e87b-4de3-47dd-a896-167b766787f3']
Nov 28 05:08:04 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:04.272 2 INFO neutron.agent.securitygroups_rpc [None req-703ffdf6-c471-414a-b1d5-827cd1496408 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['15f5d617-f52a-49d8-bbd0-19868b48ee91']
Nov 28 05:08:04 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:08:04.409 261084 INFO neutron.agent.dhcp.agent [None req-d27c7e74-de34-4f03-9809-a4bbc65da21f - - - - - -] DHCP configuration for ports {'0edceee0-9905-42aa-b928-f37bc201af3e'} is completed
Nov 28 05:08:04 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e213 do_prune osdmap full prune enabled
Nov 28 05:08:04 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:04.651 2 INFO neutron.agent.securitygroups_rpc [None req-0ac9408a-42d9-4a9d-8ec9-d45f37e64efb 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['b758e87b-4de3-47dd-a896-167b766787f3']
Nov 28 05:08:04 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:04.675 2 INFO neutron.agent.securitygroups_rpc [None req-1da31546-dc0a-4e96-b9fc-5fecbaaa8ced 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['15f5d617-f52a-49d8-bbd0-19868b48ee91']
Nov 28 05:08:04 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e214 e214: 6 total, 6 up, 6 in
Nov 28 05:08:05 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Nov 28 05:08:05 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e214: 6 total, 6 up, 6 in
Nov 28 05:08:05 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:05.181 2 INFO neutron.agent.securitygroups_rpc [None req-7ed5809e-b041-4ea1-b265-703b22d08b78 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['b758e87b-4de3-47dd-a896-167b766787f3']
Nov 28 05:08:05 localhost nova_compute[279673]: 2025-11-28 10:08:05.905 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:08:06 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e214 do_prune osdmap full prune enabled
Nov 28 05:08:06 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e215 e215: 6 total, 6 up, 6 in
Nov 28 05:08:06 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e215: 6 total, 6 up, 6 in
Nov 28 05:08:06 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e215 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 05:08:06 localhost nova_compute[279673]: 2025-11-28 10:08:06.495 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:08:06 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:06.529 2 INFO neutron.agent.securitygroups_rpc [None req-c3c34998-5ba8-4c09-bfd4-af4362061351 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['77da4666-3c7e-4eb4-bd89-e0f6bc0cfb77']
Nov 28 05:08:06 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:06.832 2 INFO neutron.agent.securitygroups_rpc [None req-cf7500dc-e72d-4341-8bc2-ebaed58e7094 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['2521adb0-8644-4922-aaf5-9462c312df8d']
Nov 28 05:08:07 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e215 do_prune osdmap full prune enabled
Nov 28 05:08:07 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e216 e216: 6 total, 6 up, 6 in
Nov 28 05:08:07 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e216: 6 total, 6 up, 6 in
Nov 28 05:08:07 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:07.943 2 INFO neutron.agent.securitygroups_rpc [None req-410f4b75-8a53-43bd-aeb3-73827c1fb9d5 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['cc6c7909-68f3-4243-ad85-ca295b324967']
Nov 28 05:08:08 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:08.423 2 INFO neutron.agent.securitygroups_rpc [None req-9e0fb103-8bda-4ed9-896b-20548b225439 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['cc6c7909-68f3-4243-ad85-ca295b324967']
Nov 28 05:08:08 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:08.637 2 INFO neutron.agent.securitygroups_rpc [None req-5e0e412c-d85c-446d-af13-157bbc4d1b94 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['723b43a6-7c6c-4eb3-a519-023b34d9a2b5']
Nov 28 05:08:08 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:08.893 2 INFO neutron.agent.securitygroups_rpc [None req-aa17c84c-96b9-4f58-abde-f1675a957a10 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['723b43a6-7c6c-4eb3-a519-023b34d9a2b5']
Nov 28 05:08:09 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:09.707 2 INFO neutron.agent.securitygroups_rpc [None req-e693dfaa-acf9-4ef6-91f0-fa32111d34d5 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['89d85f46-caf0-4632-8f88-6aa2b20ffab5']
Nov 28 05:08:09 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:09.967 2 INFO neutron.agent.securitygroups_rpc [None req-40ac5e44-c334-4848-ad22-bdb0d1d393a9 f56d2237e5b74576a33d9840c9346817 9ce143270a4649669232b53b6a44e4ba - - default default] Security group member updated ['f3e50b86-f5a6-4339-897f-e9e754c264f3']
Nov 28 05:08:10 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:10.028 2 INFO neutron.agent.securitygroups_rpc [None req-aa8fcbbc-e15f-465d-a79b-54ba8e5d4dfa 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['89d85f46-caf0-4632-8f88-6aa2b20ffab5']
Nov 28 05:08:10 localhost podman[238687]: time="2025-11-28T10:08:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 05:08:10 localhost podman[238687]: @ - - [28/Nov/2025:10:08:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157512 "" "Go-http-client/1.1"
Nov 28 05:08:10 localhost podman[238687]: @ - - [28/Nov/2025:10:08:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19739 "" "Go-http-client/1.1"
Nov 28 05:08:10 localhost dnsmasq[327872]: read /var/lib/neutron/dhcp/39526589-27d4-41ad-9aef-f63f534ecbf0/addn_hosts - 0 addresses
Nov 28 05:08:10 localhost dnsmasq-dhcp[327872]: read /var/lib/neutron/dhcp/39526589-27d4-41ad-9aef-f63f534ecbf0/host
Nov 28 05:08:10 localhost dnsmasq-dhcp[327872]: read /var/lib/neutron/dhcp/39526589-27d4-41ad-9aef-f63f534ecbf0/opts
Nov 28 05:08:10 localhost podman[327930]: 2025-11-28 10:08:10.257114721 +0000 UTC m=+0.061998281 container kill 7083d3dcda2dcbc1b280d612b29f695e76343b2129a3aab4802e3bcaf207ce36 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-39526589-27d4-41ad-9aef-f63f534ecbf0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 05:08:10 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:10.758 2 INFO neutron.agent.securitygroups_rpc [None req-652590f9-d0df-4db1-90c9-4125d049cb03 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['527f33ea-6583-4033-be5c-a5d3ccd20912']
Nov 28 05:08:10 localhost nova_compute[279673]: 2025-11-28 10:08:10.907 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:08:11 localhost dnsmasq[327872]: exiting on receipt of SIGTERM
Nov 28 05:08:11 localhost podman[327966]: 2025-11-28 10:08:11.032674168 +0000 UTC m=+0.065880481 container kill 7083d3dcda2dcbc1b280d612b29f695e76343b2129a3aab4802e3bcaf207ce36 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-39526589-27d4-41ad-9aef-f63f534ecbf0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 05:08:11 localhost systemd[1]: tmp-crun.Rqf2pe.mount: Deactivated successfully.
Nov 28 05:08:11 localhost systemd[1]: libpod-7083d3dcda2dcbc1b280d612b29f695e76343b2129a3aab4802e3bcaf207ce36.scope: Deactivated successfully.
Nov 28 05:08:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.
Nov 28 05:08:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.
Nov 28 05:08:11 localhost ovn_controller[152322]: 2025-11-28T10:08:11Z|00467|binding|INFO|Removing iface tapdfb15593-fe ovn-installed in OVS
Nov 28 05:08:11 localhost ovn_controller[152322]: 2025-11-28T10:08:11Z|00468|binding|INFO|Removing lport dfb15593-fe1f-4aaf-af25-32ab66c1a780 ovn-installed in OVS
Nov 28 05:08:11 localhost nova_compute[279673]: 2025-11-28 10:08:11.118 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:08:11 localhost ovn_metadata_agent[158125]: 2025-11-28 10:08:11.121 158130 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port a7fb3ffc-8e40-4885-9fc0-3b779892c20d with type ""
Nov 28 05:08:11 localhost ovn_metadata_agent[158125]: 2025-11-28 10:08:11.122 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-39526589-27d4-41ad-9aef-f63f534ecbf0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-39526589-27d4-41ad-9aef-f63f534ecbf0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9ce143270a4649669232b53b6a44e4ba', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7ab6281c-5843-47df-b4b0-ac9f0fa87790, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=dfb15593-fe1f-4aaf-af25-32ab66c1a780) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 05:08:11 localhost ovn_metadata_agent[158125]: 2025-11-28 10:08:11.124 158130 INFO neutron.agent.ovn.metadata.agent [-] Port dfb15593-fe1f-4aaf-af25-32ab66c1a780 in datapath 39526589-27d4-41ad-9aef-f63f534ecbf0 unbound from our chassis
Nov 28 05:08:11 localhost ovn_metadata_agent[158125]: 2025-11-28 10:08:11.126 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 39526589-27d4-41ad-9aef-f63f534ecbf0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 05:08:11 localhost nova_compute[279673]: 2025-11-28 10:08:11.127 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:08:11 localhost ovn_metadata_agent[158125]: 2025-11-28 10:08:11.127 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[59296e7b-81f6-43e3-864c-174c03bff93c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 05:08:11 localhost podman[327980]: 2025-11-28 10:08:11.128768589 +0000 UTC m=+0.080029867 container died 7083d3dcda2dcbc1b280d612b29f695e76343b2129a3aab4802e3bcaf207ce36 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-39526589-27d4-41ad-9aef-f63f534ecbf0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 05:08:11 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e216 do_prune osdmap full prune enabled
Nov 28 05:08:11 localhost podman[327980]: 2025-11-28 10:08:11.160965121 +0000 UTC m=+0.112226369 container cleanup 7083d3dcda2dcbc1b280d612b29f695e76343b2129a3aab4802e3bcaf207ce36 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-39526589-27d4-41ad-9aef-f63f534ecbf0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 05:08:11 localhost systemd[1]: libpod-conmon-7083d3dcda2dcbc1b280d612b29f695e76343b2129a3aab4802e3bcaf207ce36.scope: Deactivated successfully.
Nov 28 05:08:11 localhost podman[327982]: 2025-11-28 10:08:11.195744462 +0000 UTC m=+0.137975512 container remove 7083d3dcda2dcbc1b280d612b29f695e76343b2129a3aab4802e3bcaf207ce36 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-39526589-27d4-41ad-9aef-f63f534ecbf0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 05:08:11 localhost nova_compute[279673]: 2025-11-28 10:08:11.207 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:08:11 localhost kernel: device tapdfb15593-fe left promiscuous mode
Nov 28 05:08:11 localhost nova_compute[279673]: 2025-11-28 10:08:11.222 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:08:11 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:08:11.252 261084 INFO neutron.agent.dhcp.agent [None req-d46cfb24-85e5-4d25-bf2b-4f53a1e873cc - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 05:08:11 localhost systemd[1]: var-lib-containers-storage-overlay-4341b7420827c25c7624eca3707ed25a6232f868ac4d13c1195965a03f452de4-merged.mount: Deactivated successfully.
Nov 28 05:08:11 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7083d3dcda2dcbc1b280d612b29f695e76343b2129a3aab4802e3bcaf207ce36-userdata-shm.mount: Deactivated successfully.
Nov 28 05:08:11 localhost systemd[1]: run-netns-qdhcp\x2d39526589\x2d27d4\x2d41ad\x2d9aef\x2df63f534ecbf0.mount: Deactivated successfully.
Nov 28 05:08:11 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:11.260 2 INFO neutron.agent.securitygroups_rpc [None req-8e5272cc-6346-4fdc-ba9b-b2622fce7146 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['527f33ea-6583-4033-be5c-a5d3ccd20912']
Nov 28 05:08:11 localhost podman[327988]: 2025-11-28 10:08:11.260686573 +0000 UTC m=+0.193193253 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 05:08:11 localhost podman[327988]: 2025-11-28 10:08:11.268200395 +0000 UTC m=+0.200707105 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 05:08:11 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully.
Nov 28 05:08:11 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e217 e217: 6 total, 6 up, 6 in
Nov 28 05:08:11 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e217: 6 total, 6 up, 6 in
Nov 28 05:08:11 localhost podman[327994]: 2025-11-28 10:08:11.315984887 +0000 UTC m=+0.245821435 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 05:08:11 localhost podman[327994]: 2025-11-28 10:08:11.353768882 +0000 UTC m=+0.283605450 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125)
Nov 28 05:08:11 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully.
Nov 28 05:08:11 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:08:11.375 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Nov 28 05:08:11 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e217 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 05:08:11 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e217 do_prune osdmap full prune enabled
Nov 28 05:08:11 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e218 e218: 6 total, 6 up, 6 in
Nov 28 05:08:11 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e218: 6 total, 6 up, 6 in
Nov 28 05:08:11 localhost nova_compute[279673]: 2025-11-28 10:08:11.497 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:08:11 localhost ovn_controller[152322]: 2025-11-28T10:08:11Z|00469|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0)
Nov 28 05:08:11 localhost nova_compute[279673]: 2025-11-28 10:08:11.555 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:08:11 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:11.737 2 INFO neutron.agent.securitygroups_rpc [None req-fc93531f-0f28-40b2-be64-5c8ae1a05f2f 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['f4a575b2-e757-4f17-8902-ce29566c2707']
Nov 28 05:08:12 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:12.425 2 INFO neutron.agent.securitygroups_rpc [None req-950a4a21-ae98-4893-86d1-a721665d75e4 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['f4a575b2-e757-4f17-8902-ce29566c2707']
Nov 28 05:08:12 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:12.616 2 INFO neutron.agent.securitygroups_rpc [None req-70e65ce9-39cb-464e-830b-530df9be0aa7 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['609542d6-94ac-4376-bfbb-29a5ac4d9008']
Nov 28 05:08:12 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:12.928 2 INFO neutron.agent.securitygroups_rpc [None req-75b9de2d-ffc0-4329-a7f5-226897331b9b 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['f4a575b2-e757-4f17-8902-ce29566c2707']
Nov 28 05:08:13 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:13.009 2 INFO neutron.agent.securitygroups_rpc [None req-d8139777-22d3-42a1-ac35-78ef0fdf0858 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['609542d6-94ac-4376-bfbb-29a5ac4d9008']
Nov 28 05:08:13 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:13.254 2 INFO neutron.agent.securitygroups_rpc [None req-08ef0569-f479-4e5d-8704-2ba0f20ecb11 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['609542d6-94ac-4376-bfbb-29a5ac4d9008']
Nov 28 05:08:13 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:13.388 2 INFO neutron.agent.securitygroups_rpc [None req-a98371b3-db2f-4475-8ff8-c8019a0287c5 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['f4a575b2-e757-4f17-8902-ce29566c2707']
Nov 28 05:08:13 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:13.575 2 INFO neutron.agent.securitygroups_rpc [None req-86361f15-5494-4722-ab9a-5dacf7fdc04d 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['609542d6-94ac-4376-bfbb-29a5ac4d9008']
Nov 28 05:08:13 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:13.689 2 INFO neutron.agent.securitygroups_rpc [None req-d46ea504-c7ea-4ca3-916c-fe85715f24c7 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['f4a575b2-e757-4f17-8902-ce29566c2707']
Nov 28 05:08:13 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:13.891 2 INFO neutron.agent.securitygroups_rpc [None req-865b6df2-96d7-4677-b689-d1af588b7477 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['609542d6-94ac-4376-bfbb-29a5ac4d9008']
Nov 28 05:08:14 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:14.008 2 INFO neutron.agent.securitygroups_rpc [None req-447348cf-3ac4-498a-af94-5fdba49716f9 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['f4a575b2-e757-4f17-8902-ce29566c2707']
Nov 28 05:08:14 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:14.121 2 INFO neutron.agent.securitygroups_rpc [None req-e5d09537-743d-43a9-bf9b-ecb2832cda1b 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['609542d6-94ac-4376-bfbb-29a5ac4d9008']
Nov 28 05:08:14 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:14.890 2 INFO neutron.agent.securitygroups_rpc [None req-185e5877-2263-42be-9a4e-22f963823be4 4946fab2fc3e4b6a824c5b97d5395e1f 6aa84c7049ec4634978c2f2583d7c1dd - - default default] Security group rule updated ['dae70bc2-83a0-4e05-bc5e-659aa86d0528']
Nov 28 05:08:14 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:14.913 2 INFO neutron.agent.securitygroups_rpc [None req-d7a70732-d57d-42ae-a12a-793909186bfd 98015f849ce5423d83599c846511f268 ed1e45c023a5404a97377641c4a6c580 - - default default] Security group rule updated ['d99d3754-6453-4c3f-8498-8ac20a4744c7']
Nov 28 05:08:15 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e218 do_prune osdmap full prune enabled
Nov 28 05:08:15 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e219 e219: 6 total, 6 up, 6 in
Nov 28 05:08:15 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e219: 6 total, 6 up, 6 in
Nov 28 05:08:15 localhost nova_compute[279673]: 2025-11-28 10:08:15.910 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:08:16 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 05:08:16 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e219 do_prune osdmap full prune enabled
Nov 28 05:08:16 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e220 e220: 6 total, 6 up, 6 in
Nov 28 05:08:16 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e220: 6 total, 6 up, 6 in
Nov 28 05:08:16 localhost nova_compute[279673]: 2025-11-28 10:08:16.500 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:08:18 localhost openstack_network_exporter[240658]: ERROR 10:08:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 05:08:18 localhost openstack_network_exporter[240658]: ERROR 10:08:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 05:08:18 localhost openstack_network_exporter[240658]: ERROR 10:08:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 05:08:18 localhost openstack_network_exporter[240658]: ERROR 10:08:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 05:08:18 localhost openstack_network_exporter[240658]:
Nov 28 05:08:18 localhost openstack_network_exporter[240658]: ERROR 10:08:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 05:08:18 localhost openstack_network_exporter[240658]:
Nov 28 05:08:20 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:20.216 2 INFO neutron.agent.securitygroups_rpc [None req-d053b12b-4cea-4da2-a120-7dbb5ec3bd14 b3ad92f082324bf2b498b6ec57fa1994 f4aa6a98849143efbe0d34d745657eb8 - - default default] Security group rule updated ['b905493a-8ebf-4d2f-8822-0b2d1ac4a85c']
Nov 28 05:08:20 localhost neutron_sriov_agent[254147]: 2025-11-28 10:08:20.622 2 INFO neutron.agent.securitygroups_rpc [None req-5d5ff3a5-db5c-4c22-9208-8bc209a22601 2d65c21983fa4a008a09c7a8bb7a6484 2603cf17f09846a397a42aba4be9d81b - - default default] Security group rule updated ['90aec1a6-5e99-47c4-8e4c-11b88cdc4ca9']
Nov 28 05:08:20 localhost nova_compute[279673]: 2025-11-28 10:08:20.912 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:08:21 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e220 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 05:08:21 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e220 do_prune osdmap full prune enabled
Nov 28 05:08:21 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e221 e221: 6 total, 6 up, 6 in
Nov 28 05:08:21 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e221: 6 total, 6 up, 6 in
Nov 28 05:08:21 localhost nova_compute[279673]: 2025-11-28 10:08:21.502 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:08:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.
Nov 28 05:08:21 localhost podman[328051]: 2025-11-28 10:08:21.845312825 +0000 UTC m=+0.082255546 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, architecture=x86_64, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.expose-services=)
Nov 28 05:08:21 localhost podman[328051]: 2025-11-28 10:08:21.862538126 +0000 UTC m=+0.099480777 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, version=9.6, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Nov 28 05:08:21 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully.
Nov 28 05:08:22 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 05:08:22 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1272958995' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 05:08:22 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 05:08:22 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1272958995' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 05:08:24 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e48: np0005538515.yfkzhl(active, since 10m), standbys: np0005538513.dsfdlx, np0005538514.djozup
Nov 28 05:08:25 localhost nova_compute[279673]: 2025-11-28 10:08:25.913 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:08:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.
Nov 28 05:08:26 localhost podman[328153]: 2025-11-28 10:08:26.070354978 +0000 UTC m=+0.084622928 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 05:08:26 localhost podman[328153]: 2025-11-28 10:08:26.155567053 +0000 UTC m=+0.169835003 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Nov 28 05:08:26 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully.
Nov 28 05:08:26 localhost podman[328204]: 2025-11-28 10:08:26.288393397 +0000 UTC m=+0.095083501 container exec bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, io.buildah.version=1.33.12, maintainer=Guillaume Abrioux , release=553, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/agreements, name=rhceph, distribution-scope=public, build-date=2025-09-24T08:57:55, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-type=git, ceph=True, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=rhceph-container, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vendor=Red Hat, Inc.)
Nov 28 05:08:26 localhost podman[328204]: 2025-11-28 10:08:26.431222167 +0000 UTC m=+0.237912252 container exec_died bd2bb19ca93973ccec5b61b058e1f23b302e96234190f51a14e8957382729491 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-2c5417c9-00eb-57d5-a565-ddecbc7995c1-crash-np0005538513, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-ref=cba612d428f1498c8ae5570dd75a670ccf94c03d, org.opencontainers.image.revision=0c20ee48321f5d64135f6208d1332c0b032df6c3, CEPH_POINT_RELEASE=, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/agreements, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, RELEASE=main, url=https://access.redhat.com/containers/#/registry.access.redhat.com/rhceph/images/7-553, build-date=2025-09-24T08:57:55, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.33.12, release=553, GIT_CLEAN=True, GIT_BRANCH=main, distribution-scope=public, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Nov 28 05:08:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e221 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 05:08:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e221 do_prune osdmap full prune enabled
Nov 28 05:08:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e222 e222: 6 total, 6 up, 6 in
Nov 28 05:08:26 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e222: 6 total, 6 up, 6 in
Nov 28 05:08:26 localhost nova_compute[279673]: 2025-11-28 10:08:26.505 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:08:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0)
Nov 28 05:08:26 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 05:08:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0)
Nov 28 05:08:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0)
Nov 28 05:08:26 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 05:08:26 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 05:08:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0)
Nov 28 05:08:27 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 05:08:27 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0)
Nov 28 05:08:27 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 05:08:27 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0)
Nov 28 05:08:27 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 05:08:27 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 05:08:27 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1911934516' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 05:08:27 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 05:08:27 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1911934516' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 05:08:27 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 05:08:27 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 05:08:27 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 05:08:27 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 05:08:27 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 05:08:27 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 05:08:27 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0)
Nov 28 05:08:27 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 05:08:27 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0)
Nov 28 05:08:27 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 05:08:27 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Nov 28 05:08:28 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0)
Nov 28 05:08:28 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 05:08:28 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0)
Nov 28 05:08:28 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 05:08:28 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0)
Nov 28 05:08:28 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 05:08:28 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0)
Nov 28 05:08:28 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 05:08:28 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Nov 28 05:08:28 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Nov 28 05:08:28 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Nov 28 05:08:28 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 05:08:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 05:08:28 localhost podman[328406]: 2025-11-28 10:08:28.275629358 +0000 UTC m=+0.083983539 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 05:08:28 localhost podman[328406]: 2025-11-28 10:08:28.310432981 +0000 UTC m=+0.118787152 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 05:08:28 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully.
Nov 28 05:08:28 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 05:08:28 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Nov 28 05:08:28 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 05:08:28 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Nov 28 05:08:28 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 05:08:28 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 05:08:28 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 05:08:28 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Nov 28 05:08:28 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 05:08:28 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Nov 28 05:08:28 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Nov 28 05:08:28 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Nov 28 05:08:28 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Nov 28 05:08:28 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 05:08:29 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 28 05:08:29 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/786658190' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Nov 28 05:08:29 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 28 05:08:29 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/786658190' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Nov 28 05:08:29 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e222 do_prune osdmap full prune enabled
Nov 28 05:08:29 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e223 e223: 6 total, 6 up, 6 in
Nov 28 05:08:29 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e223: 6 total, 6 up, 6 in
Nov 28 05:08:29 localhost ceph-mon[292954]: Adjusting osd_memory_target on np0005538514.localdomain to 836.6M
Nov 28 05:08:29 localhost ceph-mon[292954]: Unable to set osd_memory_target on np0005538514.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 05:08:29 localhost ceph-mon[292954]: Adjusting osd_memory_target on np0005538515.localdomain to 836.6M
Nov 28 05:08:29 localhost ceph-mon[292954]: Adjusting osd_memory_target on np0005538513.localdomain to 836.6M
Nov 28 05:08:29 localhost ceph-mon[292954]: Unable to set osd_memory_target on np0005538515.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 05:08:29 localhost ceph-mon[292954]: Unable to set osd_memory_target on np0005538513.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Nov 28 05:08:30 localhost nova_compute[279673]: 2025-11-28 10:08:30.915 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:08:30 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Nov 28 05:08:30 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 05:08:31 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e223 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 05:08:31 localhost nova_compute[279673]: 2025-11-28 10:08:31.510 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:08:31 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl'
Nov 28 05:08:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.
Nov 28 05:08:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.
Nov 28 05:08:31 localhost systemd[1]: tmp-crun.QCIAQV.mount: Deactivated successfully.
Nov 28 05:08:31 localhost podman[328427]: 2025-11-28 10:08:31.865299187 +0000 UTC m=+0.098817357 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 28 05:08:31 localhost podman[328428]: 2025-11-28 10:08:31.904924688 +0000 UTC m=+0.134954641 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 05:08:31 localhost podman[328427]: 2025-11-28 10:08:31.926348687 +0000 UTC m=+0.159866857 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 05:08:31 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully.
Nov 28 05:08:31 localhost podman[328428]: 2025-11-28 10:08:31.97154929 +0000 UTC m=+0.201579303 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 28 05:08:31 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully.
Nov 28 05:08:32 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e223 do_prune osdmap full prune enabled Nov 28 05:08:32 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e224 e224: 6 total, 6 up, 6 in Nov 28 05:08:32 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e224: 6 total, 6 up, 6 in Nov 28 05:08:33 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e224 do_prune osdmap full prune enabled Nov 28 05:08:33 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e225 e225: 6 total, 6 up, 6 in Nov 28 05:08:33 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e225: 6 total, 6 up, 6 in Nov 28 05:08:34 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e225 do_prune osdmap full prune enabled Nov 28 05:08:34 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e226 e226: 6 total, 6 up, 6 in Nov 28 05:08:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 28 05:08:34 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 8400.1 total, 600.0 interval#012Cumulative writes: 14K writes, 57K keys, 14K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.01 MB/s#012Cumulative WAL: 14K writes, 4566 syncs, 3.17 writes per sync, written: 0.04 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 9397 writes, 34K keys, 9397 commit groups, 1.0 writes per commit group, ingest: 26.84 MB, 0.04 MB/s#012Interval WAL: 9397 writes, 3883 syncs, 2.42 writes per sync, written: 0.03 GB, 0.04 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 28 05:08:34 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e226: 6 total, 6 up, 6 in Nov 28 05:08:35 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e226 do_prune osdmap full prune enabled Nov 28 05:08:35 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e227 e227: 6 total, 6 up, 6 in 
Nov 28 05:08:35 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e227: 6 total, 6 up, 6 in Nov 28 05:08:35 localhost nova_compute[279673]: 2025-11-28 10:08:35.916 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:08:36 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 28 05:08:36 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/594035084' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 28 05:08:36 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 28 05:08:36 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/594035084' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 28 05:08:36 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e227 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:08:36 localhost nova_compute[279673]: 2025-11-28 10:08:36.513 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:08:37 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e227 do_prune osdmap full prune enabled Nov 28 05:08:37 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e228 e228: 6 total, 6 up, 6 in Nov 28 05:08:37 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e228: 6 total, 6 up, 6 in Nov 28 05:08:38 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 28 05:08:38 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : 
from='client.? 172.18.0.32:0/83214699' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 28 05:08:38 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 28 05:08:38 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/83214699' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 28 05:08:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 28 05:08:39 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 8400.3 total, 600.0 interval#012Cumulative writes: 16K writes, 61K keys, 16K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.01 MB/s#012Cumulative WAL: 16K writes, 5217 syncs, 3.12 writes per sync, written: 0.04 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 10K writes, 36K keys, 10K commit groups, 1.0 writes per commit group, ingest: 23.73 MB, 0.04 MB/s#012Interval WAL: 10K writes, 4356 syncs, 2.39 writes per sync, written: 0.02 GB, 0.04 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Nov 28 05:08:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e228 do_prune osdmap full prune enabled Nov 28 05:08:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e229 e229: 6 total, 6 up, 6 in Nov 28 05:08:39 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e229: 6 total, 6 up, 6 in Nov 28 05:08:40 localhost podman[238687]: time="2025-11-28T10:08:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:08:40 localhost podman[238687]: @ - - [28/Nov/2025:10:08:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" 
"Go-http-client/1.1" Nov 28 05:08:40 localhost podman[238687]: @ - - [28/Nov/2025:10:08:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19272 "" "Go-http-client/1.1" Nov 28 05:08:40 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e229 do_prune osdmap full prune enabled Nov 28 05:08:40 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e230 e230: 6 total, 6 up, 6 in Nov 28 05:08:40 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e230: 6 total, 6 up, 6 in Nov 28 05:08:40 localhost nova_compute[279673]: 2025-11-28 10:08:40.920 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:08:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e230 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:08:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e230 do_prune osdmap full prune enabled Nov 28 05:08:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e231 e231: 6 total, 6 up, 6 in Nov 28 05:08:41 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e231: 6 total, 6 up, 6 in Nov 28 05:08:41 localhost nova_compute[279673]: 2025-11-28 10:08:41.515 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:08:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 05:08:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. 
Nov 28 05:08:41 localhost podman[328471]: 2025-11-28 10:08:41.864398121 +0000 UTC m=+0.082950297 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 05:08:41 localhost podman[328471]: 2025-11-28 10:08:41.877431462 +0000 UTC m=+0.095983698 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 05:08:41 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 05:08:41 localhost podman[328472]: 2025-11-28 10:08:41.969401056 +0000 UTC m=+0.186070704 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Nov 28 05:08:42 localhost podman[328472]: 2025-11-28 10:08:42.007650365 +0000 UTC m=+0.224320063 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Nov 28 05:08:42 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 05:08:42 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e231 do_prune osdmap full prune enabled Nov 28 05:08:42 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e232 e232: 6 total, 6 up, 6 in Nov 28 05:08:42 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e232: 6 total, 6 up, 6 in Nov 28 05:08:43 localhost ovn_controller[152322]: 2025-11-28T10:08:43Z|00470|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory Nov 28 05:08:43 localhost nova_compute[279673]: 2025-11-28 10:08:43.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:08:44 localhost nova_compute[279673]: 2025-11-28 10:08:44.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:08:44 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e232 do_prune osdmap full prune enabled Nov 28 05:08:44 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e233 e233: 6 total, 6 up, 6 in Nov 28 05:08:44 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e233: 6 total, 6 up, 6 in Nov 28 05:08:45 localhost nova_compute[279673]: 2025-11-28 10:08:45.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:08:45 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e233 do_prune osdmap full prune enabled Nov 28 05:08:45 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e234 
e234: 6 total, 6 up, 6 in Nov 28 05:08:45 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e234: 6 total, 6 up, 6 in Nov 28 05:08:45 localhost nova_compute[279673]: 2025-11-28 10:08:45.922 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:08:46 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e234 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:08:46 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e234 do_prune osdmap full prune enabled Nov 28 05:08:46 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e235 e235: 6 total, 6 up, 6 in Nov 28 05:08:46 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e235: 6 total, 6 up, 6 in Nov 28 05:08:46 localhost nova_compute[279673]: 2025-11-28 10:08:46.518 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:08:46 localhost nova_compute[279673]: 2025-11-28 10:08:46.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:08:46 localhost nova_compute[279673]: 2025-11-28 10:08:46.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 28 05:08:47 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e235 do_prune osdmap full prune enabled Nov 28 05:08:47 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e236 e236: 6 total, 6 up, 6 in Nov 28 05:08:47 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e236: 6 total, 6 up, 6 in Nov 28 05:08:47 localhost nova_compute[279673]: 2025-11-28 10:08:47.767 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:08:48 localhost openstack_network_exporter[240658]: ERROR 10:08:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:08:48 localhost openstack_network_exporter[240658]: ERROR 10:08:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:08:48 localhost openstack_network_exporter[240658]: ERROR 10:08:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 05:08:48 localhost openstack_network_exporter[240658]: ERROR 10:08:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 05:08:48 localhost openstack_network_exporter[240658]: Nov 28 05:08:48 localhost openstack_network_exporter[240658]: ERROR 10:08:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 05:08:48 localhost openstack_network_exporter[240658]: Nov 28 05:08:48 localhost nova_compute[279673]: 2025-11-28 10:08:48.405 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:08:48 localhost ovn_metadata_agent[158125]: 
2025-11-28 10:08:48.407 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:08:48 localhost ovn_metadata_agent[158125]: 2025-11-28 10:08:48.409 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 28 05:08:48 localhost nova_compute[279673]: 2025-11-28 10:08:48.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:08:48 localhost nova_compute[279673]: 2025-11-28 10:08:48.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:08:48 localhost nova_compute[279673]: 2025-11-28 10:08:48.796 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:08:48 localhost nova_compute[279673]: 2025-11-28 10:08:48.796 
279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:08:48 localhost nova_compute[279673]: 2025-11-28 10:08:48.797 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:08:48 localhost nova_compute[279673]: 2025-11-28 10:08:48.797 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 05:08:48 localhost nova_compute[279673]: 2025-11-28 10:08:48.798 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:08:49 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:08:49 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/1503239630' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:08:49 localhost nova_compute[279673]: 2025-11-28 10:08:49.242 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:08:49 localhost nova_compute[279673]: 2025-11-28 10:08:49.309 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 05:08:49 localhost nova_compute[279673]: 2025-11-28 10:08:49.310 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 05:08:49 localhost nova_compute[279673]: 2025-11-28 10:08:49.519 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 05:08:49 localhost nova_compute[279673]: 2025-11-28 10:08:49.520 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11092MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": 
"7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 05:08:49 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e236 do_prune osdmap full prune enabled Nov 28 05:08:49 localhost nova_compute[279673]: 2025-11-28 10:08:49.521 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:08:49 localhost nova_compute[279673]: 2025-11-28 10:08:49.521 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:08:49 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e237 e237: 6 total, 6 up, 6 in Nov 28 05:08:49 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e237: 6 total, 6 up, 6 in Nov 28 05:08:49 localhost nova_compute[279673]: 2025-11-28 10:08:49.617 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 05:08:49 localhost nova_compute[279673]: 2025-11-28 10:08:49.618 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 05:08:49 localhost nova_compute[279673]: 2025-11-28 10:08:49.619 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 05:08:49 localhost nova_compute[279673]: 2025-11-28 10:08:49.654 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:08:50 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:08:50 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/780788281' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:08:50 localhost nova_compute[279673]: 2025-11-28 10:08:50.109 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:08:50 localhost nova_compute[279673]: 2025-11-28 10:08:50.116 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 05:08:50 localhost nova_compute[279673]: 2025-11-28 10:08:50.132 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 05:08:50 localhost nova_compute[279673]: 2025-11-28 10:08:50.135 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 05:08:50 localhost nova_compute[279673]: 2025-11-28 10:08:50.135 279685 DEBUG 
oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.614s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:08:50 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 28 05:08:50 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1604987579' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 28 05:08:50 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 28 05:08:50 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1604987579' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 28 05:08:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:08:50.846 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:08:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:08:50.847 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:08:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:08:50.848 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:08:50 localhost nova_compute[279673]: 2025-11-28 10:08:50.923 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:08:51 localhost nova_compute[279673]: 2025-11-28 10:08:51.137 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:08:51 localhost nova_compute[279673]: 2025-11-28 10:08:51.138 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 28 05:08:51 localhost nova_compute[279673]: 2025-11-28 10:08:51.138 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 28 05:08:51 localhost nova_compute[279673]: 2025-11-28 10:08:51.222 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 05:08:51 localhost nova_compute[279673]: 2025-11-28 10:08:51.222 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 05:08:51 localhost nova_compute[279673]: 2025-11-28 10:08:51.223 279685 DEBUG nova.network.neutron [None 
req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 28 05:08:51 localhost nova_compute[279673]: 2025-11-28 10:08:51.223 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 05:08:51 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e237 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:08:51 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e237 do_prune osdmap full prune enabled Nov 28 05:08:51 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e238 e238: 6 total, 6 up, 6 in Nov 28 05:08:51 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e238: 6 total, 6 up, 6 in Nov 28 05:08:51 localhost nova_compute[279673]: 2025-11-28 10:08:51.522 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:08:51 localhost nova_compute[279673]: 2025-11-28 10:08:51.634 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": 
"192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 05:08:51 localhost nova_compute[279673]: 2025-11-28 10:08:51.662 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 05:08:51 localhost nova_compute[279673]: 2025-11-28 10:08:51.663 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 28 05:08:51 localhost nova_compute[279673]: 2025-11-28 10:08:51.663 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:08:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. 
Nov 28 05:08:52 localhost podman[328558]: 2025-11-28 10:08:52.868633489 +0000 UTC m=+0.103418158 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, architecture=x86_64, version=9.6, com.redhat.component=ubi9-minimal-container, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.buildah.version=1.33.7) Nov 28 05:08:52 localhost podman[328558]: 2025-11-28 10:08:52.885621862 +0000 UTC m=+0.120406491 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers) Nov 28 05:08:52 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. Nov 28 05:08:55 localhost nova_compute[279673]: 2025-11-28 10:08:55.928 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:08:56 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:08:56 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e238 do_prune osdmap full prune enabled Nov 28 05:08:56 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e239 e239: 6 total, 6 up, 6 in Nov 28 05:08:56 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e239: 6 total, 6 up, 6 in Nov 28 05:08:56 localhost nova_compute[279673]: 2025-11-28 10:08:56.524 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:08:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. 
Nov 28 05:08:56 localhost podman[328578]: 2025-11-28 10:08:56.849109858 +0000 UTC m=+0.087486196 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 28 05:08:56 localhost podman[328578]: 2025-11-28 10:08:56.863995097 +0000 UTC m=+0.102371465 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 28 05:08:56 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 05:08:57 localhost ovn_metadata_agent[158125]: 2025-11-28 10:08:57.410 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:08:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 05:08:58 localhost podman[328602]: 2025-11-28 10:08:58.857476802 +0000 UTC m=+0.087615080 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 28 05:08:58 localhost podman[328602]: 2025-11-28 10:08:58.89082317 +0000 UTC m=+0.120961428 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent) Nov 28 05:08:58 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 05:09:00 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e239 do_prune osdmap full prune enabled Nov 28 05:09:00 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e240 e240: 6 total, 6 up, 6 in Nov 28 05:09:00 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e240: 6 total, 6 up, 6 in Nov 28 05:09:00 localhost nova_compute[279673]: 2025-11-28 10:09:00.966 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:09:01 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:09:01 localhost nova_compute[279673]: 2025-11-28 10:09:01.527 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:09:02 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e240 do_prune osdmap full prune enabled Nov 28 05:09:02 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e241 e241: 6 total, 6 up, 6 in 
Nov 28 05:09:02 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e241: 6 total, 6 up, 6 in Nov 28 05:09:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 05:09:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 05:09:02 localhost podman[328620]: 2025-11-28 10:09:02.875843569 +0000 UTC m=+0.104097199 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Nov 28 05:09:02 localhost podman[328620]: 2025-11-28 10:09:02.918038249 +0000 UTC m=+0.146291869 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:09:02 localhost systemd[1]: tmp-crun.yEX5Bd.mount: Deactivated successfully. Nov 28 05:09:02 localhost podman[328621]: 2025-11-28 10:09:02.929290796 +0000 UTC m=+0.150829799 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 28 05:09:02 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 05:09:03 localhost podman[328621]: 2025-11-28 10:09:03.015696098 +0000 UTC m=+0.237235061 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Nov 28 05:09:03 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 05:09:03 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e241 do_prune osdmap full prune enabled Nov 28 05:09:03 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e242 e242: 6 total, 6 up, 6 in Nov 28 05:09:03 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e242: 6 total, 6 up, 6 in Nov 28 05:09:04 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 28 05:09:04 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/754623008' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 28 05:09:04 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 28 05:09:04 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/754623008' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 28 05:09:05 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e242 do_prune osdmap full prune enabled Nov 28 05:09:05 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e243 e243: 6 total, 6 up, 6 in Nov 28 05:09:05 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e243: 6 total, 6 up, 6 in Nov 28 05:09:06 localhost nova_compute[279673]: 2025-11-28 10:09:06.013 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:09:06 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:09:06 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e243 do_prune osdmap full prune enabled Nov 28 05:09:06 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e244 e244: 6 total, 6 
up, 6 in Nov 28 05:09:06 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e244: 6 total, 6 up, 6 in Nov 28 05:09:06 localhost nova_compute[279673]: 2025-11-28 10:09:06.531 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:09:07 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e244 do_prune osdmap full prune enabled Nov 28 05:09:07 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e245 e245: 6 total, 6 up, 6 in Nov 28 05:09:07 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e245: 6 total, 6 up, 6 in Nov 28 05:09:10 localhost podman[238687]: time="2025-11-28T10:09:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:09:10 localhost podman[238687]: @ - - [28/Nov/2025:10:09:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1" Nov 28 05:09:10 localhost podman[238687]: @ - - [28/Nov/2025:10:09:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19268 "" "Go-http-client/1.1" Nov 28 05:09:11 localhost nova_compute[279673]: 2025-11-28 10:09:11.042 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:09:11 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e245 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:09:11 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e245 do_prune osdmap full prune enabled Nov 28 05:09:11 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e246 e246: 6 total, 6 up, 6 in Nov 28 05:09:11 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e246: 6 total, 6 up, 6 in Nov 28 05:09:11 localhost 
nova_compute[279673]: 2025-11-28 10:09:11.534 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:09:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 05:09:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 05:09:12 localhost podman[328663]: 2025-11-28 10:09:12.871317505 +0000 UTC m=+0.097626009 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , 
managed_by=edpm_ansible) Nov 28 05:09:12 localhost podman[328664]: 2025-11-28 10:09:12.908825001 +0000 UTC m=+0.133487984 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Nov 28 05:09:12 localhost podman[328664]: 2025-11-28 10:09:12.919547141 +0000 UTC m=+0.144210144 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:09:12 localhost podman[328663]: 2025-11-28 10:09:12.931541411 +0000 UTC m=+0.157849865 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, 
name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 05:09:12 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. Nov 28 05:09:12 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. Nov 28 05:09:13 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 28 05:09:13 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/3079034859' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 28 05:09:13 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 28 05:09:13 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3079034859' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 28 05:09:14 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e246 do_prune osdmap full prune enabled Nov 28 05:09:14 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e247 e247: 6 total, 6 up, 6 in Nov 28 05:09:14 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e247: 6 total, 6 up, 6 in Nov 28 05:09:15 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e247 do_prune osdmap full prune enabled Nov 28 05:09:15 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e248 e248: 6 total, 6 up, 6 in Nov 28 05:09:15 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e248: 6 total, 6 up, 6 in Nov 28 05:09:16 localhost nova_compute[279673]: 2025-11-28 10:09:16.070 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:09:16 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:09:16 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e248 do_prune osdmap full prune enabled Nov 28 05:09:16 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e249 e249: 6 total, 6 up, 6 in Nov 28 05:09:16 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e249: 6 total, 6 up, 6 in Nov 28 05:09:16 localhost nova_compute[279673]: 2025-11-28 10:09:16.536 279685 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:09:18 localhost openstack_network_exporter[240658]: ERROR 10:09:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:09:18 localhost openstack_network_exporter[240658]: ERROR 10:09:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:09:18 localhost openstack_network_exporter[240658]: ERROR 10:09:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 05:09:18 localhost openstack_network_exporter[240658]: ERROR 10:09:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 05:09:18 localhost openstack_network_exporter[240658]: Nov 28 05:09:18 localhost openstack_network_exporter[240658]: ERROR 10:09:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 05:09:18 localhost openstack_network_exporter[240658]: Nov 28 05:09:18 localhost nova_compute[279673]: 2025-11-28 10:09:18.673 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:09:19 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e249 do_prune osdmap full prune enabled Nov 28 05:09:19 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e250 e250: 6 total, 6 up, 6 in Nov 28 05:09:19 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e250: 6 total, 6 up, 6 in Nov 28 05:09:20 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e250 do_prune osdmap full prune enabled Nov 28 05:09:20 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e251 e251: 6 total, 6 up, 6 in Nov 28 05:09:20 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e251: 6 total, 6 up, 6 in Nov 28 05:09:21 localhost 
nova_compute[279673]: 2025-11-28 10:09:21.107 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:09:21 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e251 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:09:21 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e251 do_prune osdmap full prune enabled Nov 28 05:09:21 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e252 e252: 6 total, 6 up, 6 in Nov 28 05:09:21 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e252: 6 total, 6 up, 6 in Nov 28 05:09:21 localhost nova_compute[279673]: 2025-11-28 10:09:21.540 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0. 
Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.751964) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55 Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324561752101, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 2643, "num_deletes": 281, "total_data_size": 3176462, "memory_usage": 3242064, "flush_reason": "Manual Compaction"} Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324561775665, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 3108429, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 30678, "largest_seqno": 33320, "table_properties": {"data_size": 3096625, "index_size": 7669, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3077, "raw_key_size": 27914, "raw_average_key_size": 22, "raw_value_size": 3072092, "raw_average_value_size": 2520, "num_data_blocks": 320, "num_entries": 1219, "num_filter_entries": 1219, "num_deletions": 281, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324437, "oldest_key_time": 1764324437, "file_creation_time": 1764324561, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}} Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 23781 microseconds, and 10562 cpu microseconds. Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.775755) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 3108429 bytes OK Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.775797) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.777991) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.778047) EVENT_LOG_v1 {"time_micros": 1764324561778038, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.778113) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 3164760, prev total WAL file 
size 3164760, number of live WAL files 2. Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.779367) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132323939' seq:72057594037927935, type:22 .. '7061786F73003132353531' seq:0, type:0; will stop at (end) Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(3035KB)], [54(17MB)] Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324561779440, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 20953640, "oldest_snapshot_seqno": -1} Nov 28 05:09:21 localhost nova_compute[279673]: 2025-11-28 10:09:21.789 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 13408 keys, 19616465 bytes, temperature: kUnknown Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324561896486, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 19616465, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19535722, "index_size": 
46146, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 33541, "raw_key_size": 357739, "raw_average_key_size": 26, "raw_value_size": 19303510, "raw_average_value_size": 1439, "num_data_blocks": 1754, "num_entries": 13408, "num_filter_entries": 13408, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764324561, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}} Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.896968) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 19616465 bytes Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.899330) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 178.8 rd, 167.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.0, 17.0 +0.0 blob) out(18.7 +0.0 blob), read-write-amplify(13.1) write-amplify(6.3) OK, records in: 13974, records dropped: 566 output_compression: NoCompression Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.899361) EVENT_LOG_v1 {"time_micros": 1764324561899346, "job": 32, "event": "compaction_finished", "compaction_time_micros": 117175, "compaction_time_cpu_micros": 51179, "output_level": 6, "num_output_files": 1, "total_output_size": 19616465, "num_input_records": 13974, "num_output_records": 13408, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324561900252, "job": 32, "event": "table_file_deletion", "file_number": 56} Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324561903113, 
"job": 32, "event": "table_file_deletion", "file_number": 54} Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.779278) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.903210) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.903219) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.903222) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.903226) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:09:21 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:21.903228) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:09:22 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e252 do_prune osdmap full prune enabled Nov 28 05:09:22 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e253 e253: 6 total, 6 up, 6 in Nov 28 05:09:22 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e253: 6 total, 6 up, 6 in Nov 28 05:09:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. 
Nov 28 05:09:23 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e253 do_prune osdmap full prune enabled Nov 28 05:09:23 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e254 e254: 6 total, 6 up, 6 in Nov 28 05:09:23 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e254: 6 total, 6 up, 6 in Nov 28 05:09:23 localhost podman[328708]: 2025-11-28 10:09:23.84847391 +0000 UTC m=+0.084910158 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, container_name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, summary=Provides 
the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Nov 28 05:09:23 localhost podman[328708]: 2025-11-28 10:09:23.866426063 +0000 UTC m=+0.102862331 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, vendor=Red Hat, Inc., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 
'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, distribution-scope=public, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6) Nov 28 05:09:23 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 05:09:26 localhost nova_compute[279673]: 2025-11-28 10:09:26.139 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:09:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e254 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:09:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e254 do_prune osdmap full prune enabled Nov 28 05:09:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e255 e255: 6 total, 6 up, 6 in Nov 28 05:09:26 localhost nova_compute[279673]: 2025-11-28 10:09:26.542 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:09:26 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e255: 6 total, 6 up, 6 in Nov 28 05:09:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. 
Nov 28 05:09:27 localhost podman[328729]: 2025-11-28 10:09:27.865805985 +0000 UTC m=+0.103731867 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 28 05:09:27 localhost podman[328729]: 2025-11-28 10:09:27.878346851 +0000 UTC m=+0.116272743 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 28 05:09:27 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 05:09:29 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Nov 28 05:09:29 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:09:29 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 05:09:29 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:09:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. 
Nov 28 05:09:29 localhost podman[328836]: 2025-11-28 10:09:29.762839317 +0000 UTC m=+0.094947816 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true) Nov 28 05:09:29 localhost podman[328836]: 2025-11-28 10:09:29.799568839 +0000 UTC 
m=+0.131677288 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 28 05:09:29 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 05:09:30 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 28 05:09:30 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/444864674' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 28 05:09:30 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 28 05:09:30 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/444864674' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 28 05:09:30 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Nov 28 05:09:31 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:09:31 localhost nova_compute[279673]: 2025-11-28 10:09:31.172 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:09:31 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:09:31 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e255 do_prune osdmap full prune enabled Nov 28 05:09:31 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e256 e256: 6 total, 6 up, 6 in Nov 28 05:09:31 localhost nova_compute[279673]: 2025-11-28 10:09:31.546 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:09:31 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e256: 6 total, 6 up, 6 in Nov 28 05:09:31 localhost ceph-mon[292954]: 
from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:09:32 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 28 05:09:32 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3188050676' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 28 05:09:32 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 28 05:09:32 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3188050676' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 28 05:09:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 05:09:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. 
Nov 28 05:09:33 localhost podman[328858]: 2025-11-28 10:09:33.924629674 +0000 UTC m=+0.158019841 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3) Nov 28 05:09:33 localhost podman[328857]: 2025-11-28 10:09:33.877497841 +0000 UTC m=+0.116301535 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 
'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0) Nov 28 05:09:33 localhost podman[328857]: 2025-11-28 10:09:33.960436456 +0000 UTC m=+0.199240090 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_id=edpm) Nov 28 05:09:33 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 05:09:33 localhost podman[328858]: 2025-11-28 10:09:33.989345478 +0000 UTC m=+0.222735655 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Nov 28 05:09:33 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 05:09:35 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e256 do_prune osdmap full prune enabled Nov 28 05:09:35 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e257 e257: 6 total, 6 up, 6 in Nov 28 05:09:35 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e257: 6 total, 6 up, 6 in Nov 28 05:09:36 localhost nova_compute[279673]: 2025-11-28 10:09:36.212 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:09:36 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:09:36 localhost nova_compute[279673]: 2025-11-28 10:09:36.549 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:09:36 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e49: np0005538515.yfkzhl(active, since 11m), standbys: np0005538513.dsfdlx, np0005538514.djozup Nov 28 05:09:40 localhost podman[238687]: time="2025-11-28T10:09:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:09:40 localhost podman[238687]: @ - - [28/Nov/2025:10:09:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1" Nov 28 05:09:40 localhost podman[238687]: @ - - [28/Nov/2025:10:09:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19269 "" "Go-http-client/1.1" Nov 28 05:09:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e257 do_prune osdmap full prune enabled Nov 28 05:09:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e258 e258: 6 total, 6 up, 6 in Nov 28 05:09:41 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap 
e258: 6 total, 6 up, 6 in Nov 28 05:09:41 localhost nova_compute[279673]: 2025-11-28 10:09:41.248 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:09:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e258 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:09:41 localhost nova_compute[279673]: 2025-11-28 10:09:41.556 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:09:42 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 28 05:09:42 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2049122593' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 28 05:09:42 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 28 05:09:42 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2049122593' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 28 05:09:42 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e258 do_prune osdmap full prune enabled Nov 28 05:09:42 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e259 e259: 6 total, 6 up, 6 in Nov 28 05:09:42 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e259: 6 total, 6 up, 6 in Nov 28 05:09:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 05:09:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. 
Nov 28 05:09:43 localhost systemd[1]: tmp-crun.Ou3pfO.mount: Deactivated successfully. Nov 28 05:09:43 localhost podman[328900]: 2025-11-28 10:09:43.838753504 +0000 UTC m=+0.079404339 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 05:09:43 localhost podman[328900]: 2025-11-28 10:09:43.851788395 +0000 UTC m=+0.092439260 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 05:09:43 localhost systemd[1]: tmp-crun.zLpI3u.mount: Deactivated successfully. Nov 28 05:09:43 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 05:09:43 localhost podman[328901]: 2025-11-28 10:09:43.861918957 +0000 UTC m=+0.094980428 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 28 05:09:43 localhost podman[328901]: 2025-11-28 10:09:43.942374327 +0000 UTC m=+0.175435838 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3) Nov 28 05:09:43 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 05:09:44 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e259 do_prune osdmap full prune enabled Nov 28 05:09:44 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e260 e260: 6 total, 6 up, 6 in Nov 28 05:09:44 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e260: 6 total, 6 up, 6 in Nov 28 05:09:44 localhost nova_compute[279673]: 2025-11-28 10:09:44.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:09:45 localhost nova_compute[279673]: 2025-11-28 10:09:45.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:09:46 localhost nova_compute[279673]: 2025-11-28 10:09:46.283 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:09:46 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e260 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:09:46 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e260 do_prune osdmap full prune enabled Nov 28 05:09:46 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e261 e261: 6 total, 6 up, 6 in Nov 28 05:09:46 localhost nova_compute[279673]: 2025-11-28 10:09:46.559 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:09:46 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e261: 6 total, 6 up, 6 in Nov 28 05:09:46 localhost nova_compute[279673]: 2025-11-28 10:09:46.770 279685 
DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:09:46 localhost nova_compute[279673]: 2025-11-28 10:09:46.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 28 05:09:47 localhost nova_compute[279673]: 2025-11-28 10:09:47.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:09:48 localhost openstack_network_exporter[240658]: ERROR 10:09:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:09:48 localhost openstack_network_exporter[240658]: ERROR 10:09:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:09:48 localhost openstack_network_exporter[240658]: ERROR 10:09:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 05:09:48 localhost openstack_network_exporter[240658]: ERROR 10:09:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 05:09:48 localhost openstack_network_exporter[240658]: Nov 28 05:09:48 localhost openstack_network_exporter[240658]: ERROR 10:09:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 05:09:48 localhost openstack_network_exporter[240658]: Nov 28 05:09:48 localhost nova_compute[279673]: 2025-11-28 10:09:48.695 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:09:48 localhost ovn_metadata_agent[158125]: 2025-11-28 10:09:48.696 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:09:48 localhost ovn_metadata_agent[158125]: 2025-11-28 10:09:48.697 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 28 05:09:49 localhost nova_compute[279673]: 2025-11-28 10:09:49.766 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:09:50 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e261 do_prune osdmap full prune enabled Nov 28 05:09:50 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e262 e262: 6 total, 6 up, 6 in Nov 28 05:09:50 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e262: 6 total, 6 up, 6 in Nov 28 05:09:50 localhost nova_compute[279673]: 2025-11-28 10:09:50.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:09:50 localhost nova_compute[279673]: 2025-11-28 10:09:50.770 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 28 05:09:50 localhost nova_compute[279673]: 2025-11-28 10:09:50.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 28 05:09:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:09:50.847 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:09:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:09:50.847 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:09:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:09:50.848 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:09:50 localhost nova_compute[279673]: 2025-11-28 10:09:50.850 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 
28 05:09:50 localhost nova_compute[279673]: 2025-11-28 10:09:50.851 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 05:09:50 localhost nova_compute[279673]: 2025-11-28 10:09:50.851 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 28 05:09:50 localhost nova_compute[279673]: 2025-11-28 10:09:50.852 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 05:09:51 localhost nova_compute[279673]: 2025-11-28 10:09:51.319 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:09:51 localhost nova_compute[279673]: 2025-11-28 10:09:51.323 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 05:09:51 localhost nova_compute[279673]: 2025-11-28 10:09:51.345 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 05:09:51 localhost nova_compute[279673]: 2025-11-28 10:09:51.346 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 28 05:09:51 localhost nova_compute[279673]: 2025-11-28 10:09:51.347 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:09:51 localhost nova_compute[279673]: 2025-11-28 10:09:51.347 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:09:51 localhost nova_compute[279673]: 2025-11-28 10:09:51.366 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:09:51 localhost nova_compute[279673]: 2025-11-28 10:09:51.367 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:09:51 localhost nova_compute[279673]: 2025-11-28 10:09:51.367 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:09:51 localhost nova_compute[279673]: 2025-11-28 10:09:51.368 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 05:09:51 localhost nova_compute[279673]: 2025-11-28 10:09:51.368 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:09:51 localhost ceph-mon[292954]: 
mon.np0005538513@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:09:51 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e262 do_prune osdmap full prune enabled Nov 28 05:09:51 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e263 e263: 6 total, 6 up, 6 in Nov 28 05:09:51 localhost nova_compute[279673]: 2025-11-28 10:09:51.565 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:09:51 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e263: 6 total, 6 up, 6 in Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0. Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.585043) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58 Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324591585095, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 756, "num_deletes": 255, "total_data_size": 900264, "memory_usage": 914792, "flush_reason": "Manual Compaction"} Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324591592538, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 812869, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 
33321, "largest_seqno": 34076, "table_properties": {"data_size": 809169, "index_size": 1427, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 10137, "raw_average_key_size": 21, "raw_value_size": 801282, "raw_average_value_size": 1734, "num_data_blocks": 62, "num_entries": 462, "num_filter_entries": 462, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324562, "oldest_key_time": 1764324562, "file_creation_time": 1764324591, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}} Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 7544 microseconds, and 3065 cpu microseconds. Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.592587) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 812869 bytes OK Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.592616) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.595498) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.595535) EVENT_LOG_v1 {"time_micros": 1764324591595526, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.595561) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 896217, prev total WAL file size 896217, number of live WAL files 2. Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.596259) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740034303131' seq:72057594037927935, type:22 .. 
'6D6772737461740034323632' seq:0, type:0; will stop at (end) Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(793KB)], [57(18MB)] Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324591596307, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 20429334, "oldest_snapshot_seqno": -1} Nov 28 05:09:51 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 28 05:09:51 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1803177013' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 28 05:09:51 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 28 05:09:51 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1803177013' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 13349 keys, 18386709 bytes, temperature: kUnknown Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324591692737, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 18386709, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18310381, "index_size": 41859, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 33413, "raw_key_size": 357091, "raw_average_key_size": 26, "raw_value_size": 18083325, "raw_average_value_size": 1354, "num_data_blocks": 1573, "num_entries": 13349, "num_filter_entries": 13349, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764324591, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}} Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. 
max_bytes_for_level_multiplier may not be guaranteed. Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.693087) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 18386709 bytes Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.695624) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 211.7 rd, 190.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 18.7 +0.0 blob) out(17.5 +0.0 blob), read-write-amplify(47.8) write-amplify(22.6) OK, records in: 13870, records dropped: 521 output_compression: NoCompression Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.695653) EVENT_LOG_v1 {"time_micros": 1764324591695640, "job": 34, "event": "compaction_finished", "compaction_time_micros": 96522, "compaction_time_cpu_micros": 49995, "output_level": 6, "num_output_files": 1, "total_output_size": 18386709, "num_input_records": 13870, "num_output_records": 13349, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324591695903, "job": 34, "event": "table_file_deletion", "file_number": 59} Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:09:51 localhost ceph-mon[292954]: 
rocksdb: EVENT_LOG_v1 {"time_micros": 1764324591698383, "job": 34, "event": "table_file_deletion", "file_number": 57} Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.596179) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.698472) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.698480) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.698484) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.698487) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:09:51 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:09:51.698490) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:09:51 localhost ovn_controller[152322]: 2025-11-28T10:09:51Z|00471|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory Nov 28 05:09:51 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:09:51 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/1135745932' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:09:51 localhost nova_compute[279673]: 2025-11-28 10:09:51.837 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:09:51 localhost nova_compute[279673]: 2025-11-28 10:09:51.908 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 05:09:51 localhost nova_compute[279673]: 2025-11-28 10:09:51.909 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 05:09:51 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 28 05:09:51 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1402041267' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 28 05:09:51 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 28 05:09:51 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1402041267' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 28 05:09:52 localhost nova_compute[279673]: 2025-11-28 10:09:52.105 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 05:09:52 localhost nova_compute[279673]: 2025-11-28 10:09:52.107 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11060MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": 
"0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 05:09:52 localhost nova_compute[279673]: 2025-11-28 10:09:52.107 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:09:52 localhost nova_compute[279673]: 2025-11-28 10:09:52.108 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:09:52 localhost nova_compute[279673]: 2025-11-28 10:09:52.184 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 05:09:52 localhost nova_compute[279673]: 2025-11-28 10:09:52.185 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 05:09:52 localhost nova_compute[279673]: 2025-11-28 10:09:52.185 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 05:09:52 localhost nova_compute[279673]: 2025-11-28 10:09:52.231 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:09:52 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:09:52 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/2713521814' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:09:52 localhost nova_compute[279673]: 2025-11-28 10:09:52.711 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:09:52 localhost nova_compute[279673]: 2025-11-28 10:09:52.716 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 05:09:52 localhost nova_compute[279673]: 2025-11-28 10:09:52.775 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 05:09:52 localhost nova_compute[279673]: 2025-11-28 10:09:52.778 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 05:09:52 localhost nova_compute[279673]: 2025-11-28 10:09:52.778 279685 DEBUG 
oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.670s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:09:53 localhost nova_compute[279673]: 2025-11-28 10:09:53.203 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:09:53 localhost ovn_metadata_agent[158125]: 2025-11-28 10:09:53.699 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:09:54 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e263 do_prune osdmap full prune enabled Nov 28 05:09:54 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e264 e264: 6 total, 6 up, 6 in Nov 28 05:09:54 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e264: 6 total, 6 up, 6 in Nov 28 05:09:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. 
Nov 28 05:09:54 localhost podman[328983]: 2025-11-28 10:09:54.845548463 +0000 UTC m=+0.080624605 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, vcs-type=git, io.openshift.expose-services=, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Nov 28 05:09:54 localhost podman[328983]: 2025-11-28 10:09:54.855379415 +0000 UTC m=+0.090455557 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': 
'/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, maintainer=Red Hat, Inc., config_id=edpm, managed_by=edpm_ansible) Nov 28 05:09:54 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 05:09:56 localhost nova_compute[279673]: 2025-11-28 10:09:56.355 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:09:56 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e264 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:09:56 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e264 do_prune osdmap full prune enabled Nov 28 05:09:56 localhost nova_compute[279673]: 2025-11-28 10:09:56.568 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:09:56 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e265 e265: 6 total, 6 up, 6 in Nov 28 05:09:56 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e265: 6 total, 6 up, 6 in Nov 28 05:09:58 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e265 do_prune osdmap full prune enabled Nov 28 05:09:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. 
Nov 28 05:09:58 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e266 e266: 6 total, 6 up, 6 in Nov 28 05:09:58 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e50: np0005538515.yfkzhl(active, since 11m), standbys: np0005538513.dsfdlx, np0005538514.djozup Nov 28 05:09:58 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e266: 6 total, 6 up, 6 in Nov 28 05:09:58 localhost podman[329005]: 2025-11-28 10:09:58.864682213 +0000 UTC m=+0.102930692 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 05:09:58 localhost podman[329005]: 2025-11-28 10:09:58.876582569 +0000 UTC m=+0.114831258 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 28 05:09:58 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 05:10:00 localhost ceph-mon[292954]: log_channel(cluster) log [INF] : overall HEALTH_OK Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.676 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.677 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.707 12 DEBUG 
ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 1124743927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.707 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 22218411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd9011f07-e60e-48ba-a89d-d0d3cefc1418', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1124743927, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:10:00.677590', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 
'vda'}, 'message_id': '6671016c-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.849485514, 'message_signature': '419097692cfe9929165749bced74accdced2c764cd24e8791e70c16958622380'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22218411, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:10:00.677590', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6671136e-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.849485514, 'message_signature': 'e928876c2606c778b5821f627a7a12b8fbd41aab2cff6afc153872d2de06bdd8'}]}, 'timestamp': '2025-11-28 10:10:00.708248', '_unique_id': '23bd52a48ae6476cbd03af838a69e9c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:10:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.709 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.710 12 INFO ceilometer.polling.manager [-] Polling pollster 
network.incoming.packets.error in the context of pollsters Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.715 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '31e0b36c-1b06-459d-871e-51d7bcb25937', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:10:00.711115', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 
'6672335c-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.883017807, 'message_signature': '414aae5bf3e94d71023af86f035cd9637d055194d9308aee010259272304c28b'}]}, 'timestamp': '2025-11-28 10:10:00.715590', '_unique_id': '0d5485849759441b9af2dcc52889ee55'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:10:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] 
Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in 
_reraise_as_library_errors Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.716 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.717 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.717 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ab2e34fe-8232-47cd-b0c6-cefd900a0286', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:10:00.717793', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '66729c8e-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.883017807, 'message_signature': '291fddca9e8bb110ac9adc6da21fd96c26126059670d3b300fdadc2e2ca2060c'}]}, 'timestamp': '2025-11-28 10:10:00.718278', '_unique_id': 'fd8fe9e7fb6e4666bde5bf47d3daaf0b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:10:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.719 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.720 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.730 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.731 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '87b92e0a-9ca6-4aea-a70b-461fae010db9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:10:00.720342', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '667498b8-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.892235642, 'message_signature': '812e6acf199b674541656ce843427d983ad6f963eeaf181333cea20ebe8c0896'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:10:00.720342', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6674a970-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.892235642, 'message_signature': '0c484510df273cb92152fcae5e8f946b4271d40219422462cba26c63498f43eb'}]}, 'timestamp': '2025-11-28 10:10:00.731679', '_unique_id': '09e1cd8e8d72452fab3a99abf0c4d4c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:10:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:10:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.732 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.733 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.733 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '032e7307-0ab7-4412-a5e6-a84d7fdb5da9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:10:00.733885', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '6675120c-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.883017807, 'message_signature': '02a92161f158a493d69ebfeb1895bc07024ccef606f2613b03eaeccfc3073bc6'}]}, 'timestamp': '2025-11-28 10:10:00.734389', '_unique_id': '9784fc3e437547f0ad7519899f3f6be5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:10:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:10:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:10:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.735 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.736 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.736 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.736 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7d62d606-e032-4a16-9b12-9d43b14fcd05', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:10:00.736449', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6675744a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.892235642, 'message_signature': '2a81607ff4610d68136e0f810d12263da45310b7a34c67f7efd9e50310aadd17'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:10:00.736449', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 
'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '667584c6-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.892235642, 'message_signature': 'c767daa6284b97d7d6ac1cfde8f248720233c11a58edff41515dbb6361517f8f'}]}, 'timestamp': '2025-11-28 10:10:00.737295', '_unique_id': 'd7896ec17aca450e8bda61e76c7163bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:10:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:10:00.738 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:10:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:10:00.738 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.738 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.739 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.739 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.739 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:10:00.741 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f272c2ff-7ab4-4890-8bdd-d5cb46ff8d7c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:10:00.739407', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6675e7ea-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.849485514, 'message_signature': '448fe4127945c544adb0ed46b158ffb061c9aa61776feebf12e6ebb1c9e04594'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:10:00.739407', 
'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6675f884-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.849485514, 'message_signature': '5323c46a0cd66a7192a6f44e40b5bcf9bd867970cbd084c3dcced739d50abba1'}]}, 'timestamp': '2025-11-28 10:10:00.740261', '_unique_id': '9da7d8fbc385468bae7fc37f7820b8ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:10:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 
134, in _send_notification Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.741 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.742 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.742 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e488da15-f85f-42d8-81e4-9c30bffb0c65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:10:00.742540', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '6676629c-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.883017807, 'message_signature': 'f7d32d2e76725980e937c723e6b11aff75993f570766e3c5f7fe66f8a52cdf52'}]}, 'timestamp': '2025-11-28 10:10:00.743006', '_unique_id': '5215f171cfaf4a7996f742902de4cdba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:10:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:10:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.744 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.745 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.745 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef787469-c1d4-42c4-a691-60d8912cff2f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:10:00.745382', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '6676d1a0-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.883017807, 'message_signature': 'c7231e62b4500d3082e887218d929b16f3a8bb79f38c6a9a4f6a1fc35b426b1b'}]}, 'timestamp': '2025-11-28 10:10:00.745846', '_unique_id': '992c64f31472426c9ba1d9a417d7ca2c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:10:00.746 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:10:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:10:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.746 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.747 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.747 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1b334287-72c5-4ea1-a2de-c0642b95ce3c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:10:00.747887', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '66773474-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.883017807, 'message_signature': '0f39c9b338c3f566e267bbce5a0a4d61e1e951ead1ffacadb995674e56f80332'}]}, 'timestamp': '2025-11-28 10:10:00.748374', '_unique_id': '54c8cae4467840588db6288bc8658a18'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:10:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:10:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.749 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.750 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.750 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.750 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '80e3f09b-fd19-4b8f-87bc-f17ffac5d061', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:10:00.750429', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6677964e-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.849485514, 'message_signature': '2bf398d9c42f88d7fce2453d34860c58c8e14eb0c7b5163f72901d37d95647ac'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:10:00.750429', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6677a724-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.849485514, 'message_signature': 'df28661111fab49d3a2664e794afcc62926dcb6fceae553abf30e71c6d28ea47'}]}, 'timestamp': '2025-11-28 10:10:00.751285', '_unique_id': '5a7f0d3df162436a95dd4d8bbf17e183'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:10:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:10:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.752 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.753 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.753 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f1a0ec6-cea2-46b5-8056-3c092ead95c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:10:00.753341', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '667809bc-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.883017807, 'message_signature': 'ccbc1f3127744f5b3e53986455e78230632e7713a82b688a23fdfdeadc3cb9ac'}]}, 'timestamp': '2025-11-28 10:10:00.753833', '_unique_id': '490804a11b8545b0a5b07a3bcb42fce5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:10:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.754 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.755 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.755 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.756 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:10:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e47a9297-520e-45a6-baab-263d54af9f15', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:10:00.756008', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '667871f4-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.883017807, 'message_signature': '33d7553ad018ce0577d0f270abe9175b2303d4ae062424c45aa211eff2800115'}]}, 'timestamp': '2025-11-28 10:10:00.756504', '_unique_id': '8564cb43bd4c4b1e8185c5f955ab2e83'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.757 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.758 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.758 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.778 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 18250000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '212563e3-bcd1-42ae-8868-add7e1ee6507', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18250000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:10:00.758762', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '667be37a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.94998646, 'message_signature': 'f9988f713a291de4e21eba916626e6e8e4c82de9e932030912a2dc37b313b7a3'}]}, 'timestamp': '2025-11-28 10:10:00.779178', '_unique_id': 'd3936f4c667c4997971f031668ead1cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.780 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.781 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.782 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 51.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '471a4ec1-85d0-48d7-8143-9bc999534ca3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.671875, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:10:00.782089', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '667c747a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.94998646, 'message_signature': 'c5bb31dd042057416d1c26f214e840a183853b021265d03c2d236d44cd0097d6'}]}, 'timestamp': '2025-11-28 10:10:00.782879', '_unique_id': '0ce3e1dd6c6b4dbe8c3ee71474dd3928'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28
05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:10:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.784 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.785 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Nov 28 05:10:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.785 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a53c25f-1fdf-4a4c-8151-a2680133bd65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:10:00.785730', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '667cfb16-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.883017807, 
'message_signature': '8beb4c39a40b844d527f87844225ae45c312691579523e4618a10dba6a8faac8'}]}, 'timestamp': '2025-11-28 10:10:00.786304', '_unique_id': '5311457da1f14f9386d91942a4f44ac1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:10:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:10:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.787 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.788 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.788 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.789 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8e9eaff2-e92d-4d4d-909a-b0e9e313feb5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:10:00.788656', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '667d6c04-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.849485514, 'message_signature': 'f629602d9d76724c2e5c2e854dfaa06e6ffd4a887d5c2ea8080b035aea50001d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:10:00.788656', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '667d7d7a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.849485514, 'message_signature': 'cebde52f43fbc56cd0a6f783ec6f008e70253ad8ef0309d239da3346cefb2f09'}]}, 'timestamp': '2025-11-28 10:10:00.789602', '_unique_id': 'b8be020b9d17447cb55acf2f96870478'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:10:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.790 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.791 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.791 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1439344424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:10:00 localhost ceph-mon[292954]: overall HEALTH_OK
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.792 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 63315481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:10:00 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b78d2b2e-95b6-4f7a-9148-20eac74d8ba6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1439344424, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:10:00.791872', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '667deb02-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.849485514, 'message_signature': '4aadbd540b8d05dfc6f812ffbd72b48a9d74ef44a0395bf59d3bba8af4e0d603'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63315481, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:10:00.791872', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '667dfb1a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.849485514, 'message_signature': '34e48aeadaf76fa298916d760a258a3824cfe36e0c954a126c3c5df7fa450817'}]}, 'timestamp': '2025-11-28 10:10:00.792890', '_unique_id': '66be5a99b3f346e6afff06ad90d52889'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.793 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.795 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.795 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.796 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '54162c7b-fff6-439f-9cd9-0ddc1310ec5c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:10:00.795667', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '667e7e50-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.849485514, 'message_signature': 'ba7981e45ebbe1014637552dd5d93909a8495b2d9f1acb22500cfdabf92f3b31'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:10:00.795667', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '667e8fd0-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.849485514, 'message_signature': '9d7900444cdb7c7ace59be438a6a9213765bddcaa7f222288fc80d1383bfc190'}]}, 'timestamp': '2025-11-28 10:10:00.796623', '_unique_id': 'c0425cb057f94b8f94f6aa75c6d469f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.797 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.798 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.799 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.799 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f5048e3-2042-46c6-ae4c-f8446b44e9c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:10:00.799298', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '667f0c08-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.883017807, 'message_signature': '645dc2b0304ca37a6a5d06bdef41aac8e66ee71f133f86bd81728ff8a336ceaa'}]}, 'timestamp': '2025-11-28 10:10:00.799771', '_unique_id': '31976e4151c548f3aac64274407223df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:10:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.800 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.801 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.802 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.802 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '81460575-9d64-4db6-8538-d9e9aba3ef18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:10:00.802062', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '667f75f8-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.892235642, 'message_signature': '3adec2fc4bd23868e243f67f12278feed3f5692fefe01b89a1a3c78ede1abcb5'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:10:00.802062', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 
'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '667f8264-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12434.892235642, 'message_signature': '7e87dfdb6d455e1b9ee8e0cf9de03bf494762c388b081878b1226f0d3822eb38'}]}, 'timestamp': '2025-11-28 10:10:00.802696', '_unique_id': 'ed6afc7bf9da46cc9ccd2b23dbb85f6d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:10:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:10:00.803 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:10:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:10:00.803 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 ERROR oslo_messaging.notify.messaging Nov 28 05:10:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:10:00.803 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 05:10:00 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:10:00 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth 
get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:10:00 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:10:00 localhost podman[329028]: 2025-11-28 10:10:00.856179745 +0000 UTC m=+0.088500307 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent) Nov 28 05:10:00 localhost podman[329028]: 2025-11-28 10:10:00.886451698 +0000 UTC m=+0.118772290 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:10:00 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 05:10:01 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 28 05:10:01 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2619852505' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 28 05:10:01 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 28 05:10:01 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/2619852505' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 28 05:10:01 localhost nova_compute[279673]: 2025-11-28 10:10:01.398 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:10:01 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:10:01 localhost nova_compute[279673]: 2025-11-28 10:10:01.570 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:10:01 localhost nova_compute[279673]: 2025-11-28 10:10:01.767 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:10:01 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:10:01 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:10:01 localhost ceph-mon[292954]: 
from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:10:03 localhost ovn_controller[152322]: 2025-11-28T10:10:03Z|00472|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:10:03 localhost nova_compute[279673]: 2025-11-28 10:10:03.602 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:10:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 05:10:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. 
Nov 28 05:10:04 localhost podman[329046]: 2025-11-28 10:10:04.842842214 +0000 UTC m=+0.082844853 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:10:04 localhost podman[329046]: 2025-11-28 10:10:04.853761011 +0000 UTC m=+0.093763640 container exec_died 
1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute) Nov 28 05:10:04 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 05:10:04 localhost podman[329047]: 2025-11-28 10:10:04.917186625 +0000 UTC m=+0.149673193 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true) Nov 28 05:10:04 localhost podman[329047]: 2025-11-28 10:10:04.989551455 +0000 UTC m=+0.222038023 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': 
['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Nov 28 05:10:05 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 05:10:05 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:10:05 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:10:05 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:10:05 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch Nov 28 05:10:05 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:10:05 localhost 
ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:10:05 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:10:06 localhost nova_compute[279673]: 2025-11-28 10:10:06.429 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:10:06 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:10:06 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e266 do_prune osdmap full prune enabled Nov 28 05:10:06 localhost nova_compute[279673]: 2025-11-28 10:10:06.575 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:10:06 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e267 e267: 6 total, 6 up, 6 in Nov 28 05:10:06 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e267: 6 total, 6 up, 6 in Nov 28 05:10:07 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.eve48"} v 0) Nov 28 05:10:07 localhost ceph-mon[292954]: 
log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch Nov 28 05:10:07 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.eve48"}]': finished Nov 28 05:10:07 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch Nov 28 05:10:07 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch Nov 28 05:10:07 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch Nov 28 05:10:07 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.eve48"}]': finished Nov 28 05:10:10 localhost podman[238687]: time="2025-11-28T10:10:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:10:10 localhost podman[238687]: @ - - [28/Nov/2025:10:10:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1" Nov 28 05:10:10 localhost podman[238687]: @ - - [28/Nov/2025:10:10:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19275 "" "Go-http-client/1.1" Nov 28 05:10:10 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", 
"allow r"], "format": "json"} v 0) Nov 28 05:10:10 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:10:10 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:10:10 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch Nov 28 05:10:10 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:10:10 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], 
"format": "json"} : dispatch Nov 28 05:10:10 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/aee96a4c-0a14-47e6-b8e5-ce0279118ec9/42613c17-5566-4a72-88af-f970b5fd2509", "osd", "allow rw pool=manila_data namespace=fsvolumens_aee96a4c-0a14-47e6-b8e5-ce0279118ec9", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:10:11 localhost nova_compute[279673]: 2025-11-28 10:10:11.468 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:10:11 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:10:11 localhost nova_compute[279673]: 2025-11-28 10:10:11.577 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:10:13 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Nov 28 05:10:13 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2493387610' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Nov 28 05:10:13 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Nov 28 05:10:13 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/2493387610' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Nov 28 05:10:14 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.eve47"} v 0) Nov 28 05:10:14 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch Nov 28 05:10:14 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.eve47"}]': finished Nov 28 05:10:14 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch Nov 28 05:10:14 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch Nov 28 05:10:14 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch Nov 28 05:10:14 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.eve47"}]': finished Nov 28 05:10:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 05:10:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. 
Nov 28 05:10:14 localhost podman[329090]: 2025-11-28 10:10:14.855246981 +0000 UTC m=+0.089796328 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 05:10:14 localhost podman[329090]: 2025-11-28 10:10:14.866406885 +0000 UTC m=+0.100956192 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 05:10:14 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 05:10:14 localhost podman[329091]: 2025-11-28 10:10:14.950064012 +0000 UTC m=+0.178789920 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:10:14 localhost podman[329091]: 2025-11-28 10:10:14.960551586 +0000 UTC m=+0.189277464 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Nov 28 05:10:14 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 05:10:15 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e267 do_prune osdmap full prune enabled Nov 28 05:10:15 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e268 e268: 6 total, 6 up, 6 in Nov 28 05:10:15 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e268: 6 total, 6 up, 6 in Nov 28 05:10:16 localhost nova_compute[279673]: 2025-11-28 10:10:16.499 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:10:16 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:10:16 localhost nova_compute[279673]: 2025-11-28 10:10:16.580 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:10:18 localhost openstack_network_exporter[240658]: ERROR 10:10:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:10:18 localhost openstack_network_exporter[240658]: ERROR 10:10:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 05:10:18 localhost openstack_network_exporter[240658]: ERROR 10:10:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:10:18 localhost openstack_network_exporter[240658]: ERROR 10:10:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 05:10:18 localhost openstack_network_exporter[240658]: Nov 28 05:10:18 localhost openstack_network_exporter[240658]: ERROR 10:10:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 05:10:18 localhost openstack_network_exporter[240658]: Nov 28 05:10:18 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command 
mon_command({"prefix": "auth rm", "entity": "client.eve49"} v 0) Nov 28 05:10:18 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch Nov 28 05:10:18 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.eve49"}]': finished Nov 28 05:10:18 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch Nov 28 05:10:18 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch Nov 28 05:10:18 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch Nov 28 05:10:18 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.eve49"}]': finished Nov 28 05:10:21 localhost nova_compute[279673]: 2025-11-28 10:10:21.526 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:10:21 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:10:21 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e268 do_prune osdmap full prune enabled Nov 28 05:10:21 localhost nova_compute[279673]: 2025-11-28 10:10:21.584 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:10:21 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e269 e269: 6 total, 6 up, 6 in Nov 28 05:10:21 
localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e269: 6 total, 6 up, 6 in Nov 28 05:10:22 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e269 do_prune osdmap full prune enabled Nov 28 05:10:22 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e270 e270: 6 total, 6 up, 6 in Nov 28 05:10:22 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e270: 6 total, 6 up, 6 in Nov 28 05:10:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 05:10:25 localhost podman[329132]: 2025-11-28 10:10:25.841639459 +0000 UTC m=+0.078835360 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=edpm, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, build-date=2025-08-20T13:12:41) Nov 28 05:10:25 localhost podman[329132]: 2025-11-28 10:10:25.862368827 +0000 UTC m=+0.099564728 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public, config_id=edpm, maintainer=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git) Nov 28 05:10:25 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 05:10:26 localhost nova_compute[279673]: 2025-11-28 10:10:26.561 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:10:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e270 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:10:26 localhost nova_compute[279673]: 2025-11-28 10:10:26.587 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:10:28 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:10:28 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:10:28 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:10:29 localhost ceph-mon[292954]: 
from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 28 05:10:29 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:10:29 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:10:29 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:10:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. 
Nov 28 05:10:29 localhost podman[329152]: 2025-11-28 10:10:29.85109607 +0000 UTC m=+0.088087845 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 05:10:29 localhost podman[329152]: 2025-11-28 10:10:29.862408149 +0000 UTC m=+0.099399934 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 05:10:29 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 05:10:30 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Nov 28 05:10:30 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:10:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 05:10:31 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Nov 28 05:10:31 localhost podman[329261]: 2025-11-28 10:10:31.031970306 +0000 UTC m=+0.078667454 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Nov 28 05:10:31 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:10:31 localhost podman[329261]: 2025-11-28 10:10:31.0664978 +0000 UTC m=+0.113195018 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 28 05:10:31 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 05:10:31 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e270 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:10:31 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e270 do_prune osdmap full prune enabled Nov 28 05:10:31 localhost nova_compute[279673]: 2025-11-28 10:10:31.588 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:10:31 localhost nova_compute[279673]: 2025-11-28 10:10:31.590 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:10:31 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e271 e271: 6 total, 6 up, 6 in Nov 28 05:10:31 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e271: 6 total, 6 up, 6 in Nov 28 05:10:31 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 05:10:31 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:10:31 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:10:31 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-254686751", "caps": ["mds", "allow rw path=/volumes/_nogroup/3c672c69-0cef-413e-aa2f-cebe487d9fad/4ce11f7d-42a8-42fc-8022-08399d2c2f15", "osd", "allow rw pool=manila_data namespace=fsvolumens_3c672c69-0cef-413e-aa2f-cebe487d9fad", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:10:31 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-254686751", "caps": ["mds", "allow rw 
path=/volumes/_nogroup/3c672c69-0cef-413e-aa2f-cebe487d9fad/4ce11f7d-42a8-42fc-8022-08399d2c2f15", "osd", "allow rw pool=manila_data namespace=fsvolumens_3c672c69-0cef-413e-aa2f-cebe487d9fad", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:10:31 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-254686751", "caps": ["mds", "allow rw path=/volumes/_nogroup/3c672c69-0cef-413e-aa2f-cebe487d9fad/4ce11f7d-42a8-42fc-8022-08399d2c2f15", "osd", "allow rw pool=manila_data namespace=fsvolumens_3c672c69-0cef-413e-aa2f-cebe487d9fad", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:10:32 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-254686751", "format": "json"} : dispatch Nov 28 05:10:32 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-254686751", "caps": ["mds", "allow rw path=/volumes/_nogroup/3c672c69-0cef-413e-aa2f-cebe487d9fad/4ce11f7d-42a8-42fc-8022-08399d2c2f15", "osd", "allow rw pool=manila_data namespace=fsvolumens_3c672c69-0cef-413e-aa2f-cebe487d9fad", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:10:32 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-254686751", "caps": ["mds", "allow rw path=/volumes/_nogroup/3c672c69-0cef-413e-aa2f-cebe487d9fad/4ce11f7d-42a8-42fc-8022-08399d2c2f15", "osd", "allow rw pool=manila_data namespace=fsvolumens_3c672c69-0cef-413e-aa2f-cebe487d9fad", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:10:32 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", 
"entity": "client.tempest-cephx-id-254686751", "caps": ["mds", "allow rw path=/volumes/_nogroup/3c672c69-0cef-413e-aa2f-cebe487d9fad/4ce11f7d-42a8-42fc-8022-08399d2c2f15", "osd", "allow rw pool=manila_data namespace=fsvolumens_3c672c69-0cef-413e-aa2f-cebe487d9fad", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:10:34 localhost ovn_controller[152322]: 2025-11-28T10:10:34Z|00473|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory Nov 28 05:10:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 05:10:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 05:10:35 localhost podman[329280]: 2025-11-28 10:10:35.50007341 +0000 UTC m=+0.093980067 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:10:35 localhost podman[329280]: 2025-11-28 10:10:35.510512132 +0000 UTC m=+0.104418749 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Nov 28 05:10:35 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. Nov 28 05:10:35 localhost podman[329281]: 2025-11-28 10:10:35.600889597 +0000 UTC m=+0.192350048 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, 
org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 28 05:10:35 localhost podman[329281]: 2025-11-28 10:10:35.707629626 +0000 UTC m=+0.299090067 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:10:35 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 05:10:35 localhost nova_compute[279673]: 2025-11-28 10:10:35.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:10:35 localhost nova_compute[279673]: 2025-11-28 10:10:35.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Nov 28 05:10:35 localhost nova_compute[279673]: 2025-11-28 10:10:35.791 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Nov 28 05:10:36 localhost nova_compute[279673]: 2025-11-28 10:10:36.591 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:10:36 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:10:36 localhost nova_compute[279673]: 2025-11-28 10:10:36.593 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:10:40 localhost podman[238687]: time="2025-11-28T10:10:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:10:40 localhost podman[238687]: @ - - [28/Nov/2025:10:10:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1" Nov 28 05:10:40 localhost podman[238687]: @ - - [28/Nov/2025:10:10:40 +0000] "GET 
/v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19283 "" "Go-http-client/1.1" Nov 28 05:10:40 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0) Nov 28 05:10:40 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 28 05:10:40 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 28 05:10:40 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 28 05:10:40 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 28 05:10:40 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 28 05:10:40 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 28 05:10:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-254686751"} v 0) Nov 28 05:10:41 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-254686751"} : dispatch Nov 28 05:10:41 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-254686751"}]': finished Nov 28 05:10:41 localhost nova_compute[279673]: 
2025-11-28 10:10:41.594 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:10:41 localhost nova_compute[279673]: 2025-11-28 10:10:41.621 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:10:41 localhost nova_compute[279673]: 2025-11-28 10:10:41.622 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5028 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 28 05:10:41 localhost nova_compute[279673]: 2025-11-28 10:10:41.622 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:10:41 localhost nova_compute[279673]: 2025-11-28 10:10:41.623 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:10:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:10:41 localhost nova_compute[279673]: 2025-11-28 10:10:41.623 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:10:41 localhost nova_compute[279673]: 2025-11-28 10:10:41.625 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:10:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data 
namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:10:41 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:10:41 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:10:41 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-254686751"} : dispatch Nov 28 05:10:41 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-254686751", "format": "json"} : dispatch Nov 28 05:10:41 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-254686751"} : dispatch Nov 28 05:10:41 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-254686751"}]': finished Nov 28 05:10:41 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : 
dispatch Nov 28 05:10:41 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:10:41 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:10:42 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:10:44 localhost sshd[329325]: main: sshd: ssh-rsa algorithm is disabled Nov 28 05:10:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 05:10:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. 
Nov 28 05:10:45 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0) Nov 28 05:10:45 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 28 05:10:45 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 28 05:10:45 localhost podman[329327]: 2025-11-28 10:10:45.124663909 +0000 UTC m=+0.089133356 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, 
container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 05:10:45 localhost podman[329327]: 2025-11-28 10:10:45.16589892 +0000 UTC m=+0.130368407 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 05:10:45 localhost podman[329328]: 2025-11-28 10:10:45.176098335 +0000 UTC m=+0.136684193 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3) Nov 28 05:10:45 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 05:10:45 localhost podman[329328]: 2025-11-28 10:10:45.191629233 +0000 UTC m=+0.152215101 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:10:45 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 05:10:45 localhost nova_compute[279673]: 2025-11-28 10:10:45.790 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:10:46 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 28 05:10:46 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 28 05:10:46 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 28 05:10:46 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 28 05:10:46 localhost nova_compute[279673]: 2025-11-28 10:10:46.626 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:10:46 localhost nova_compute[279673]: 2025-11-28 10:10:46.628 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:10:46 localhost nova_compute[279673]: 2025-11-28 10:10:46.629 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 28 05:10:46 localhost nova_compute[279673]: 2025-11-28 10:10:46.629 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:10:46 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e271 
_set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:10:46 localhost nova_compute[279673]: 2025-11-28 10:10:46.658 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:10:46 localhost nova_compute[279673]: 2025-11-28 10:10:46.659 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:10:46 localhost nova_compute[279673]: 2025-11-28 10:10:46.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:10:46 localhost nova_compute[279673]: 2025-11-28 10:10:46.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 28 05:10:47 localhost ceph-osd[31557]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2. 
Nov 28 05:10:47 localhost nova_compute[279673]: 2025-11-28 10:10:47.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:10:48 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:10:48 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:10:48 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:10:48 localhost openstack_network_exporter[240658]: ERROR 10:10:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 05:10:48 localhost openstack_network_exporter[240658]: ERROR 10:10:48 appctl.go:144: Failed to get PID for ovn-northd: no control 
socket files found for ovn-northd Nov 28 05:10:48 localhost openstack_network_exporter[240658]: ERROR 10:10:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:10:48 localhost openstack_network_exporter[240658]: ERROR 10:10:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 05:10:48 localhost openstack_network_exporter[240658]: Nov 28 05:10:48 localhost openstack_network_exporter[240658]: ERROR 10:10:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 05:10:48 localhost openstack_network_exporter[240658]: Nov 28 05:10:48 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 28 05:10:48 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:10:48 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:10:48 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw 
path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:10:49 localhost ovn_metadata_agent[158125]: 2025-11-28 10:10:49.587 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:10:49 localhost ovn_metadata_agent[158125]: 2025-11-28 10:10:49.589 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 28 05:10:49 localhost nova_compute[279673]: 2025-11-28 10:10:49.590 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:10:49 localhost nova_compute[279673]: 2025-11-28 10:10:49.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:10:50 localhost nova_compute[279673]: 2025-11-28 10:10:50.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:10:50 localhost nova_compute[279673]: 2025-11-28 10:10:50.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:10:50 localhost nova_compute[279673]: 2025-11-28 10:10:50.789 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:10:50 localhost nova_compute[279673]: 2025-11-28 10:10:50.790 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:10:50 localhost nova_compute[279673]: 2025-11-28 10:10:50.790 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:10:50 localhost nova_compute[279673]: 2025-11-28 10:10:50.791 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 05:10:50 localhost nova_compute[279673]: 2025-11-28 10:10:50.791 279685 DEBUG 
oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:10:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:10:50.847 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:10:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:10:50.848 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:10:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:10:50.848 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:10:51 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:10:51 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/1097277396' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:10:51 localhost nova_compute[279673]: 2025-11-28 10:10:51.299 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:10:51 localhost nova_compute[279673]: 2025-11-28 10:10:51.360 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 05:10:51 localhost nova_compute[279673]: 2025-11-28 10:10:51.360 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 05:10:51 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0) Nov 28 05:10:51 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 28 05:10:51 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Nov 28 05:10:51 localhost nova_compute[279673]: 2025-11-28 10:10:51.597 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 05:10:51 localhost nova_compute[279673]: 2025-11-28 10:10:51.599 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11049MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": 
"7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 05:10:51 localhost nova_compute[279673]: 2025-11-28 10:10:51.599 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:10:51 localhost nova_compute[279673]: 2025-11-28 10:10:51.600 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:10:51 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:10:51 localhost nova_compute[279673]: 2025-11-28 10:10:51.688 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:10:51 localhost nova_compute[279673]: 2025-11-28 10:10:51.815 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 05:10:51 localhost nova_compute[279673]: 2025-11-28 10:10:51.815 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 05:10:51 localhost nova_compute[279673]: 2025-11-28 10:10:51.816 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 05:10:51 localhost nova_compute[279673]: 2025-11-28 10:10:51.887 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing inventories for resource provider 35fead26-0bad-4950-b646-987079d58a17 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Nov 28 05:10:51 localhost nova_compute[279673]: 2025-11-28 10:10:51.938 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Updating ProviderTree inventory for provider 35fead26-0bad-4950-b646-987079d58a17 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Nov 28 
05:10:51 localhost nova_compute[279673]: 2025-11-28 10:10:51.938 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Updating inventory in ProviderTree for provider 35fead26-0bad-4950-b646-987079d58a17 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 28 05:10:51 localhost nova_compute[279673]: 2025-11-28 10:10:51.952 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing aggregate associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Nov 28 05:10:51 localhost nova_compute[279673]: 2025-11-28 10:10:51.971 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing trait associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, traits: 
COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_FMA3,HW_CPU_X86_BMI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_ABM,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AMD_SVM,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_CLMUL,HW_CPU_X86_BMI2,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Nov 28 05:10:52 localhost nova_compute[279673]: 2025-11-28 10:10:52.017 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:10:52 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", 
"entity": "client.alice_bob"} : dispatch Nov 28 05:10:52 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 28 05:10:52 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 28 05:10:52 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Nov 28 05:10:52 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:10:52 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4265101259' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:10:52 localhost nova_compute[279673]: 2025-11-28 10:10:52.486 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:10:52 localhost nova_compute[279673]: 2025-11-28 10:10:52.492 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 05:10:52 localhost nova_compute[279673]: 2025-11-28 10:10:52.513 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 
1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 05:10:52 localhost nova_compute[279673]: 2025-11-28 10:10:52.516 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 05:10:52 localhost nova_compute[279673]: 2025-11-28 10:10:52.516 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.916s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:10:53 localhost nova_compute[279673]: 2025-11-28 10:10:53.513 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:10:53 localhost nova_compute[279673]: 2025-11-28 10:10:53.514 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:10:53 localhost nova_compute[279673]: 2025-11-28 10:10:53.514 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache 
_heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 28 05:10:53 localhost nova_compute[279673]: 2025-11-28 10:10:53.515 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 28 05:10:53 localhost nova_compute[279673]: 2025-11-28 10:10:53.637 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 05:10:53 localhost nova_compute[279673]: 2025-11-28 10:10:53.637 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 05:10:53 localhost nova_compute[279673]: 2025-11-28 10:10:53.638 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 28 05:10:53 localhost nova_compute[279673]: 2025-11-28 10:10:53.638 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 05:10:54 localhost nova_compute[279673]: 2025-11-28 10:10:54.253 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: 
[{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 05:10:54 localhost nova_compute[279673]: 2025-11-28 10:10:54.269 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 05:10:54 localhost nova_compute[279673]: 2025-11-28 10:10:54.269 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 28 05:10:54 localhost nova_compute[279673]: 2025-11-28 10:10:54.269 279685 DEBUG oslo_service.periodic_task [None 
req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:10:55 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:10:55 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:10:55 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:10:55 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 28 05:10:55 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r 
path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:10:55 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:10:55 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:10:55 localhost ovn_metadata_agent[158125]: 2025-11-28 10:10:55.592 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:10:56 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:10:56 localhost nova_compute[279673]: 2025-11-28 10:10:56.714 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:10:56 localhost 
nova_compute[279673]: 2025-11-28 10:10:56.715 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:10:56 localhost nova_compute[279673]: 2025-11-28 10:10:56.716 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5026 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 28 05:10:56 localhost nova_compute[279673]: 2025-11-28 10:10:56.716 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:10:56 localhost nova_compute[279673]: 2025-11-28 10:10:56.717 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:10:56 localhost nova_compute[279673]: 2025-11-28 10:10:56.719 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:10:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 05:10:56 localhost podman[329413]: 2025-11-28 10:10:56.846103828 +0000 UTC m=+0.079457620 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.expose-services=, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Nov 28 05:10:56 localhost podman[329413]: 2025-11-28 10:10:56.885634636 +0000 UTC m=+0.118988448 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package 
manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, config_id=edpm, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Nov 28 05:10:56 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 05:10:58 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0) Nov 28 05:10:58 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 28 05:10:58 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Nov 28 05:10:59 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 28 05:10:59 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 28 05:10:59 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 28 05:10:59 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Nov 28 05:11:00 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e271 do_prune osdmap full prune enabled Nov 28 05:11:00 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e272 e272: 6 total, 6 up, 6 in Nov 28 05:11:00 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e272: 6 total, 6 up, 6 in Nov 28 05:11:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. 
Nov 28 05:11:00 localhost podman[329434]: 2025-11-28 10:11:00.842186898 +0000 UTC m=+0.078846640 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 28 05:11:00 localhost podman[329434]: 2025-11-28 10:11:00.850126023 +0000 UTC m=+0.086785725 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 05:11:00 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 05:11:01 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:11:01 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:01 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:11:01 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:11:01 localhost nova_compute[279673]: 2025-11-28 10:11:01.720 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:11:01 localhost nova_compute[279673]: 2025-11-28 10:11:01.722 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:11:01 localhost nova_compute[279673]: 2025-11-28 10:11:01.722 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 28 05:11:01 localhost nova_compute[279673]: 2025-11-28 10:11:01.723 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:11:01 localhost nova_compute[279673]: 2025-11-28 10:11:01.758 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:11:01 localhost nova_compute[279673]: 2025-11-28 10:11:01.759 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:11:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. 
Nov 28 05:11:01 localhost podman[329456]: 2025-11-28 10:11:01.847774583 +0000 UTC m=+0.072102793 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true) Nov 28 05:11:01 localhost podman[329456]: 2025-11-28 10:11:01.857504812 +0000 UTC 
m=+0.081833052 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125) Nov 28 05:11:01 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 05:11:02 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/9342293c-12c1-4a10-bd1e-8eba9e15cd79/6ecb968c-6419-4825-9a25-21daec62c92e", "osd", "allow rw pool=manila_data namespace=fsvolumens_9342293c-12c1-4a10-bd1e-8eba9e15cd79", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:11:02 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/9342293c-12c1-4a10-bd1e-8eba9e15cd79/6ecb968c-6419-4825-9a25-21daec62c92e", "osd", "allow rw pool=manila_data namespace=fsvolumens_9342293c-12c1-4a10-bd1e-8eba9e15cd79", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:02 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/9342293c-12c1-4a10-bd1e-8eba9e15cd79/6ecb968c-6419-4825-9a25-21daec62c92e", "osd", "allow rw pool=manila_data namespace=fsvolumens_9342293c-12c1-4a10-bd1e-8eba9e15cd79", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:11:02 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 28 05:11:02 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", 
"allow r"], "format": "json"} : dispatch Nov 28 05:11:02 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:02 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:11:02 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch Nov 28 05:11:02 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/9342293c-12c1-4a10-bd1e-8eba9e15cd79/6ecb968c-6419-4825-9a25-21daec62c92e", "osd", "allow rw pool=manila_data namespace=fsvolumens_9342293c-12c1-4a10-bd1e-8eba9e15cd79", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:02 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/9342293c-12c1-4a10-bd1e-8eba9e15cd79/6ecb968c-6419-4825-9a25-21daec62c92e", "osd", "allow rw pool=manila_data 
namespace=fsvolumens_9342293c-12c1-4a10-bd1e-8eba9e15cd79", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:02 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/9342293c-12c1-4a10-bd1e-8eba9e15cd79/6ecb968c-6419-4825-9a25-21daec62c92e", "osd", "allow rw pool=manila_data namespace=fsvolumens_9342293c-12c1-4a10-bd1e-8eba9e15cd79", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:11:04 localhost nova_compute[279673]: 2025-11-28 10:11:04.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:11:05 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0) Nov 28 05:11:05 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 28 05:11:05 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Nov 28 05:11:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 05:11:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. 
Nov 28 05:11:05 localhost podman[329475]: 2025-11-28 10:11:05.860254808 +0000 UTC m=+0.093639746 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:11:05 localhost podman[329475]: 2025-11-28 10:11:05.870531835 +0000 UTC m=+0.103916773 container exec_died 
1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Nov 28 05:11:05 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 05:11:05 localhost nova_compute[279673]: 2025-11-28 10:11:05.944 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:11:05 localhost nova_compute[279673]: 2025-11-28 10:11:05.945 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Nov 28 05:11:05 localhost systemd[1]: tmp-crun.7l67Fa.mount: Deactivated successfully. Nov 28 05:11:05 localhost podman[329476]: 2025-11-28 10:11:05.958453423 +0000 UTC m=+0.188823698 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller) Nov 28 05:11:06 localhost podman[329476]: 2025-11-28 10:11:06.027510262 +0000 UTC m=+0.257880557 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, tcib_managed=true) Nov 28 05:11:06 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 05:11:06 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 28 05:11:06 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 28 05:11:06 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 28 05:11:06 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Nov 28 05:11:06 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:11:06 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e272 do_prune osdmap full prune enabled Nov 28 05:11:06 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e273 e273: 6 total, 6 up, 6 in Nov 28 05:11:06 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e273: 6 total, 6 up, 6 in Nov 28 05:11:06 localhost nova_compute[279673]: 2025-11-28 10:11:06.762 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:11:07 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} v 0) Nov 28 05:11:07 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch Nov 28 05:11:07 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": 
"client.tempest-cephx-id-459400664"}]': finished Nov 28 05:11:07 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch Nov 28 05:11:07 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch Nov 28 05:11:07 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch Nov 28 05:11:07 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished Nov 28 05:11:08 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:11:08 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:08 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r 
path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:11:09 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 28 05:11:09 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:09 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:09 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:11:10 localhost podman[238687]: time="2025-11-28T10:11:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:11:10 localhost podman[238687]: @ - - [28/Nov/2025:10:11:10 +0000] "GET 
/v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1" Nov 28 05:11:10 localhost podman[238687]: @ - - [28/Nov/2025:10:11:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19273 "" "Go-http-client/1.1" Nov 28 05:11:11 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:11:11 localhost nova_compute[279673]: 2025-11-28 10:11:11.763 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:11:11 localhost nova_compute[279673]: 2025-11-28 10:11:11.766 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:11:11 localhost nova_compute[279673]: 2025-11-28 10:11:11.767 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 28 05:11:11 localhost nova_compute[279673]: 2025-11-28 10:11:11.767 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:11:11 localhost nova_compute[279673]: 2025-11-28 10:11:11.803 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:11:11 localhost nova_compute[279673]: 2025-11-28 10:11:11.804 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:11:12 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": 
"client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1d0b6ffa-5038-4546-af4f-2ad9a9443222/d4c11bd3-8d93-41b4-9ea4-9848a75f8c7c", "osd", "allow rw pool=manila_data namespace=fsvolumens_1d0b6ffa-5038-4546-af4f-2ad9a9443222", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:11:12 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1d0b6ffa-5038-4546-af4f-2ad9a9443222/d4c11bd3-8d93-41b4-9ea4-9848a75f8c7c", "osd", "allow rw pool=manila_data namespace=fsvolumens_1d0b6ffa-5038-4546-af4f-2ad9a9443222", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:12 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1d0b6ffa-5038-4546-af4f-2ad9a9443222/d4c11bd3-8d93-41b4-9ea4-9848a75f8c7c", "osd", "allow rw pool=manila_data namespace=fsvolumens_1d0b6ffa-5038-4546-af4f-2ad9a9443222", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:11:12 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0) Nov 28 05:11:12 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 28 05:11:12 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Nov 28 05:11:12 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", 
"format": "json"} : dispatch Nov 28 05:11:12 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1d0b6ffa-5038-4546-af4f-2ad9a9443222/d4c11bd3-8d93-41b4-9ea4-9848a75f8c7c", "osd", "allow rw pool=manila_data namespace=fsvolumens_1d0b6ffa-5038-4546-af4f-2ad9a9443222", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:12 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1d0b6ffa-5038-4546-af4f-2ad9a9443222/d4c11bd3-8d93-41b4-9ea4-9848a75f8c7c", "osd", "allow rw pool=manila_data namespace=fsvolumens_1d0b6ffa-5038-4546-af4f-2ad9a9443222", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:12 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1d0b6ffa-5038-4546-af4f-2ad9a9443222/d4c11bd3-8d93-41b4-9ea4-9848a75f8c7c", "osd", "allow rw pool=manila_data namespace=fsvolumens_1d0b6ffa-5038-4546-af4f-2ad9a9443222", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:11:12 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 28 05:11:12 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 28 05:11:12 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 28 05:11:12 localhost ceph-mon[292954]: 
from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Nov 28 05:11:15 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:11:15 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:15 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:11:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 05:11:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. 
Nov 28 05:11:15 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 28 05:11:15 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:15 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:15 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:11:15 localhost podman[329519]: 2025-11-28 10:11:15.852906018 +0000 UTC m=+0.088862799 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Nov 28 05:11:15 localhost podman[329519]: 2025-11-28 10:11:15.860628886 +0000 UTC m=+0.096585647 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', 
'--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 05:11:15 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. Nov 28 05:11:15 localhost podman[329520]: 2025-11-28 10:11:15.951800626 +0000 UTC m=+0.183951730 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 28 05:11:15 localhost podman[329520]: 2025-11-28 10:11:15.965370963 +0000 UTC m=+0.197522067 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Nov 28 05:11:15 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. Nov 28 05:11:16 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} v 0) Nov 28 05:11:16 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch Nov 28 05:11:16 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished Nov 28 05:11:16 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:11:16 localhost nova_compute[279673]: 2025-11-28 10:11:16.805 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:11:16 localhost nova_compute[279673]: 2025-11-28 10:11:16.835 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:11:16 localhost nova_compute[279673]: 2025-11-28 10:11:16.835 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5030 ms, sending inactivity probe run 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 28 05:11:16 localhost nova_compute[279673]: 2025-11-28 10:11:16.835 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:11:16 localhost nova_compute[279673]: 2025-11-28 10:11:16.837 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:11:16 localhost nova_compute[279673]: 2025-11-28 10:11:16.838 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:11:16 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch Nov 28 05:11:16 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch Nov 28 05:11:16 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch Nov 28 05:11:16 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished Nov 28 05:11:18 localhost openstack_network_exporter[240658]: ERROR 10:11:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:11:18 localhost openstack_network_exporter[240658]: ERROR 10:11:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:11:18 localhost openstack_network_exporter[240658]: ERROR 10:11:18 appctl.go:131: Failed to prepare call to ovsdb-server: no 
control socket files found for the ovs db server Nov 28 05:11:18 localhost openstack_network_exporter[240658]: ERROR 10:11:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 05:11:18 localhost openstack_network_exporter[240658]: Nov 28 05:11:18 localhost openstack_network_exporter[240658]: ERROR 10:11:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 05:11:18 localhost openstack_network_exporter[240658]: Nov 28 05:11:18 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0) Nov 28 05:11:18 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 28 05:11:18 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 28 05:11:19 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 28 05:11:19 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 28 05:11:19 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 28 05:11:19 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 28 05:11:21 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:11:21 localhost nova_compute[279673]: 2025-11-28 10:11:21.838 279685 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:11:21 localhost nova_compute[279673]: 2025-11-28 10:11:21.840 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:11:21 localhost nova_compute[279673]: 2025-11-28 10:11:21.840 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 28 05:11:21 localhost nova_compute[279673]: 2025-11-28 10:11:21.841 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:11:21 localhost nova_compute[279673]: 2025-11-28 10:11:21.860 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:11:21 localhost nova_compute[279673]: 2025-11-28 10:11:21.861 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:11:22 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1daffb16-bcf6-4808-941d-7da7540d99dc/48e67baa-974d-4f75-aae2-5962f27bbd3a", "osd", "allow rw pool=manila_data namespace=fsvolumens_1daffb16-bcf6-4808-941d-7da7540d99dc", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:11:22 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw 
path=/volumes/_nogroup/1daffb16-bcf6-4808-941d-7da7540d99dc/48e67baa-974d-4f75-aae2-5962f27bbd3a", "osd", "allow rw pool=manila_data namespace=fsvolumens_1daffb16-bcf6-4808-941d-7da7540d99dc", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:22 localhost sshd[329563]: main: sshd: ssh-rsa algorithm is disabled Nov 28 05:11:22 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1daffb16-bcf6-4808-941d-7da7540d99dc/48e67baa-974d-4f75-aae2-5962f27bbd3a", "osd", "allow rw pool=manila_data namespace=fsvolumens_1daffb16-bcf6-4808-941d-7da7540d99dc", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:11:22 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:11:22 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:22 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", 
"allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:11:23 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch Nov 28 05:11:23 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1daffb16-bcf6-4808-941d-7da7540d99dc/48e67baa-974d-4f75-aae2-5962f27bbd3a", "osd", "allow rw pool=manila_data namespace=fsvolumens_1daffb16-bcf6-4808-941d-7da7540d99dc", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:23 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1daffb16-bcf6-4808-941d-7da7540d99dc/48e67baa-974d-4f75-aae2-5962f27bbd3a", "osd", "allow rw pool=manila_data namespace=fsvolumens_1daffb16-bcf6-4808-941d-7da7540d99dc", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:23 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/1daffb16-bcf6-4808-941d-7da7540d99dc/48e67baa-974d-4f75-aae2-5962f27bbd3a", "osd", "allow rw pool=manila_data namespace=fsvolumens_1daffb16-bcf6-4808-941d-7da7540d99dc", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:11:23 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 28 05:11:23 localhost ceph-mon[292954]: from='mgr.34481 ' 
entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:23 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:23 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:11:25 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0) Nov 28 05:11:25 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 28 05:11:25 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 28 05:11:26 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 28 05:11:26 localhost ceph-mon[292954]: from='mgr.34481 
172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 28 05:11:26 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 28 05:11:26 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 28 05:11:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} v 0) Nov 28 05:11:26 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch Nov 28 05:11:26 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished Nov 28 05:11:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:11:26 localhost nova_compute[279673]: 2025-11-28 10:11:26.862 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:11:26 localhost nova_compute[279673]: 2025-11-28 10:11:26.864 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:11:26 localhost nova_compute[279673]: 2025-11-28 10:11:26.864 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 28 05:11:26 localhost nova_compute[279673]: 2025-11-28 
10:11:26.864 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:11:26 localhost nova_compute[279673]: 2025-11-28 10:11:26.886 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:11:26 localhost nova_compute[279673]: 2025-11-28 10:11:26.887 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:11:27 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch Nov 28 05:11:27 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch Nov 28 05:11:27 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch Nov 28 05:11:27 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished Nov 28 05:11:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. 
Nov 28 05:11:27 localhost podman[329565]: 2025-11-28 10:11:27.718949062 +0000 UTC m=+0.080839452 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, container_name=openstack_network_exporter, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': 
['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Nov 28 05:11:27 localhost podman[329565]: 2025-11-28 10:11:27.731291173 +0000 UTC m=+0.093181563 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., name=ubi9-minimal, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter) Nov 28 05:11:27 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 05:11:29 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 28 05:11:29 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:11:29 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:29 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:11:29 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} v 0) 
Nov 28 05:11:29 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:29 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:11:30 localhost ovn_metadata_agent[158125]: 2025-11-28 10:11:30.064 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:11:30 localhost nova_compute[279673]: 2025-11-28 10:11:30.065 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:11:30 localhost ovn_metadata_agent[158125]: 2025-11-28 10:11:30.066 158130 DEBUG neutron.agent.ovn.metadata.agent [-] 
Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 28 05:11:30 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:30 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:30 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:11:30 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch Nov 28 05:11:30 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw 
pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:30 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:30 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0. 
Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.222447) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61 Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324690222787, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2161, "num_deletes": 255, "total_data_size": 2093514, "memory_usage": 2159360, "flush_reason": "Manual Compaction"} Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324690237534, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 2054274, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 34077, "largest_seqno": 36237, "table_properties": {"data_size": 2044761, "index_size": 5574, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2885, "raw_key_size": 25919, "raw_average_key_size": 22, "raw_value_size": 2023509, "raw_average_value_size": 1764, "num_data_blocks": 240, "num_entries": 1147, "num_filter_entries": 1147, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324591, "oldest_key_time": 1764324591, "file_creation_time": 1764324690, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}} Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 14878 microseconds, and 6426 cpu microseconds. Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.237589) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 2054274 bytes OK Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.237614) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.239943) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.239968) EVENT_LOG_v1 {"time_micros": 1764324690239961, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.239992) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 2083220, prev total WAL file 
size 2083220, number of live WAL files 2. Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.241145) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132353530' seq:72057594037927935, type:22 .. '7061786F73003132383032' seq:0, type:0; will stop at (end) Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(2006KB)], [60(17MB)] Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324690241200, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 20440983, "oldest_snapshot_seqno": -1} Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 13959 keys, 18920248 bytes, temperature: kUnknown Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324690352819, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 18920248, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18839354, "index_size": 44889, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 34949, "raw_key_size": 372388, "raw_average_key_size": 26, "raw_value_size": 
18601214, "raw_average_value_size": 1332, "num_data_blocks": 1690, "num_entries": 13959, "num_filter_entries": 13959, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764324690, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}} Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.353217) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 18920248 bytes Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.355220) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 182.9 rd, 169.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 17.5 +0.0 blob) out(18.0 +0.0 blob), read-write-amplify(19.2) write-amplify(9.2) OK, records in: 14496, records dropped: 537 output_compression: NoCompression Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.355250) EVENT_LOG_v1 {"time_micros": 1764324690355236, "job": 36, "event": "compaction_finished", "compaction_time_micros": 111759, "compaction_time_cpu_micros": 53091, "output_level": 6, "num_output_files": 1, "total_output_size": 18920248, "num_input_records": 14496, "num_output_records": 13959, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324690355662, "job": 36, "event": "table_file_deletion", "file_number": 62} Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324690358241, 
"job": 36, "event": "table_file_deletion", "file_number": 60} Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.241000) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.358317) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.358323) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.358326) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.358329) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:11:30 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:30.358333) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:11:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. 
Nov 28 05:11:31 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e273 do_prune osdmap full prune enabled Nov 28 05:11:31 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e274 e274: 6 total, 6 up, 6 in Nov 28 05:11:31 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e274: 6 total, 6 up, 6 in Nov 28 05:11:31 localhost podman[329601]: 2025-11-28 10:11:31.280002979 +0000 UTC m=+0.064916121 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 05:11:31 localhost podman[329601]: 2025-11-28 10:11:31.287874792 +0000 UTC m=+0.072787934 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 
'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 05:11:31 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 05:11:31 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:11:31 localhost nova_compute[279673]: 2025-11-28 10:11:31.924 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:11:32 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Nov 28 05:11:32 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:11:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. 
Nov 28 05:11:32 localhost podman[329693]: 2025-11-28 10:11:32.298947785 +0000 UTC m=+0.083339099 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:11:32 localhost podman[329693]: 2025-11-28 10:11:32.309407978 +0000 UTC 
m=+0.093799252 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:11:32 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 05:11:32 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 05:11:32 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:11:32 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} v 0) Nov 28 05:11:32 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch Nov 28 05:11:32 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished Nov 28 05:11:32 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0) Nov 28 05:11:32 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 28 05:11:32 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Nov 28 05:11:33 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch Nov 28 05:11:33 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch Nov 28 05:11:33 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": 
"client.tempest-cephx-id-459400664"} : dispatch Nov 28 05:11:33 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished Nov 28 05:11:33 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 28 05:11:33 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 28 05:11:33 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 28 05:11:33 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Nov 28 05:11:35 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:11:35 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:35 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:11:35 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:11:35 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:35 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:11:36 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Nov 28 05:11:36 localhost ovn_metadata_agent[158125]: 2025-11-28 10:11:36.067 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running 
txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:11:36 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:11:36 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch Nov 28 05:11:36 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:36 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:36 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": 
"json"}]': finished Nov 28 05:11:36 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 28 05:11:36 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:36 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:36 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:11:36 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:11:36 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:11:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. 
Nov 28 05:11:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0. Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.763087) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64 Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324696763133, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 425, "num_deletes": 259, "total_data_size": 187094, "memory_usage": 196712, "flush_reason": "Manual Compaction"} Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324696766850, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 185011, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36238, "largest_seqno": 36662, "table_properties": {"data_size": 182545, "index_size": 513, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 6728, "raw_average_key_size": 18, "raw_value_size": 177115, "raw_average_value_size": 498, "num_data_blocks": 23, "num_entries": 355, "num_filter_entries": 355, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": 
"bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324690, "oldest_key_time": 1764324690, "file_creation_time": 1764324696, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}} Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 3817 microseconds, and 1397 cpu microseconds. Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.766901) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 185011 bytes OK Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.766926) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.770518) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.770547) EVENT_LOG_v1 {"time_micros": 1764324696770539, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.770568) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 184316, prev total WAL file size 193115, number of live WAL files 2. Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.771242) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034323734' seq:72057594037927935, type:22 .. 
'6C6F676D0034353239' seq:0, type:0; will stop at (end) Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(180KB)], [63(18MB)] Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324696771309, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 19105259, "oldest_snapshot_seqno": -1} Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 13773 keys, 18715383 bytes, temperature: kUnknown Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324696873585, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 18715383, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18635881, "index_size": 43945, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 34501, "raw_key_size": 369629, "raw_average_key_size": 26, "raw_value_size": 18401113, "raw_average_value_size": 1336, "num_data_blocks": 1643, "num_entries": 13773, "num_filter_entries": 13773, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764324696, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}} Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.873955) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 18715383 bytes Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.876122) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 186.6 rd, 182.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 18.0 +0.0 blob) out(17.8 +0.0 blob), read-write-amplify(204.4) write-amplify(101.2) OK, records in: 14314, records dropped: 541 output_compression: NoCompression Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.876151) EVENT_LOG_v1 {"time_micros": 1764324696876139, "job": 38, "event": "compaction_finished", "compaction_time_micros": 102371, "compaction_time_cpu_micros": 53981, "output_level": 6, "num_output_files": 1, "total_output_size": 18715383, "num_input_records": 14314, "num_output_records": 13773, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005538513/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324696876318, "job": 38, "event": "table_file_deletion", "file_number": 65} Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324696879174, "job": 38, "event": "table_file_deletion", "file_number": 63} Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.771010) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.879330) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.879339) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.879342) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.879346) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:11:36 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:11:36.879349) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:11:36 localhost podman[329711]: 2025-11-28 10:11:36.901040296 +0000 UTC m=+0.132494074 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 
(image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Nov 28 05:11:36 localhost podman[329711]: 2025-11-28 10:11:36.915401928 +0000 UTC m=+0.146855686 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible) Nov 28 05:11:36 localhost nova_compute[279673]: 2025-11-28 10:11:36.927 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:11:36 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 05:11:36 localhost nova_compute[279673]: 2025-11-28 10:11:36.929 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:11:36 localhost nova_compute[279673]: 2025-11-28 10:11:36.930 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 28 05:11:36 localhost nova_compute[279673]: 2025-11-28 10:11:36.930 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:11:36 localhost nova_compute[279673]: 2025-11-28 10:11:36.967 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:11:36 localhost nova_compute[279673]: 2025-11-28 10:11:36.968 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:11:36 localhost systemd[1]: tmp-crun.FErQAK.mount: Deactivated successfully. 
Nov 28 05:11:37 localhost podman[329712]: 2025-11-28 10:11:37.00177468 +0000 UTC m=+0.230324658 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:11:37 localhost podman[329712]: 2025-11-28 10:11:37.040417251 +0000 UTC m=+0.268967189 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller) Nov 28 05:11:37 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 05:11:37 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e51: np0005538515.yfkzhl(active, since 13m), standbys: np0005538513.dsfdlx, np0005538514.djozup Nov 28 05:11:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} v 0) Nov 28 05:11:39 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch Nov 28 05:11:39 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished Nov 28 05:11:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0) Nov 28 05:11:39 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 28 05:11:39 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Nov 28 05:11:39 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch Nov 28 05:11:39 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch Nov 28 05:11:39 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch Nov 28 05:11:39 localhost ceph-mon[292954]: from='mgr.34481 ' 
entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished Nov 28 05:11:39 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 28 05:11:39 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 28 05:11:39 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 28 05:11:39 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Nov 28 05:11:40 localhost podman[238687]: time="2025-11-28T10:11:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:11:40 localhost podman[238687]: @ - - [28/Nov/2025:10:11:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1" Nov 28 05:11:40 localhost podman[238687]: @ - - [28/Nov/2025:10:11:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19273 "" "Go-http-client/1.1" Nov 28 05:11:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:11:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e274 do_prune osdmap full prune enabled Nov 28 05:11:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e275 e275: 6 total, 6 up, 6 in Nov 28 05:11:41 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e275: 6 total, 6 up, 6 in Nov 28 05:11:41 localhost nova_compute[279673]: 2025-11-28 10:11:41.970 
279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:11:41 localhost nova_compute[279673]: 2025-11-28 10:11:41.972 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:11:41 localhost nova_compute[279673]: 2025-11-28 10:11:41.972 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 28 05:11:41 localhost nova_compute[279673]: 2025-11-28 10:11:41.972 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:11:41 localhost nova_compute[279673]: 2025-11-28 10:11:41.992 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:11:41 localhost nova_compute[279673]: 2025-11-28 10:11:41.993 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:11:42 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:11:42 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw 
path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:42 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:11:42 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 28 05:11:42 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:42 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:42 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw 
path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:11:42 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:11:42 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:42 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:11:43 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch Nov 28 05:11:43 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", 
"entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:43 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:43 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:11:45 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e275 do_prune osdmap full prune enabled Nov 28 05:11:45 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e276 e276: 6 total, 6 up, 6 in Nov 28 05:11:45 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e276: 6 total, 6 up, 6 in Nov 28 05:11:45 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0) Nov 28 05:11:45 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 28 05:11:45 localhost ceph-mon[292954]: log_channel(audit) log [INF] : 
from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Nov 28 05:11:46 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} v 0) Nov 28 05:11:46 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch Nov 28 05:11:46 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished Nov 28 05:11:46 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 28 05:11:46 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 28 05:11:46 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 28 05:11:46 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Nov 28 05:11:46 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch Nov 28 05:11:46 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch Nov 28 05:11:46 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": 
"client.tempest-cephx-id-459400664"} : dispatch Nov 28 05:11:46 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished Nov 28 05:11:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 05:11:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 05:11:46 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e276 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:11:46 localhost podman[329755]: 2025-11-28 10:11:46.851319531 +0000 UTC m=+0.088920941 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': 
'/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 05:11:46 localhost podman[329755]: 2025-11-28 10:11:46.888383463 +0000 UTC m=+0.125984853 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 05:11:46 localhost podman[329756]: 2025-11-28 10:11:46.903115646 +0000 UTC m=+0.134821775 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 
(image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd) Nov 28 05:11:46 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 05:11:46 localhost podman[329756]: 2025-11-28 10:11:46.918441889 +0000 UTC m=+0.150148008 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Nov 28 05:11:46 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 05:11:46 localhost nova_compute[279673]: 2025-11-28 10:11:46.994 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:11:46 localhost nova_compute[279673]: 2025-11-28 10:11:46.996 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:11:46 localhost nova_compute[279673]: 2025-11-28 10:11:46.997 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 28 05:11:46 localhost nova_compute[279673]: 2025-11-28 10:11:46.997 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:11:47 localhost nova_compute[279673]: 2025-11-28 10:11:47.039 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:11:47 localhost nova_compute[279673]: 2025-11-28 10:11:47.040 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:11:48 localhost openstack_network_exporter[240658]: ERROR 10:11:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 05:11:48 localhost openstack_network_exporter[240658]: ERROR 10:11:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:11:48 localhost openstack_network_exporter[240658]: ERROR 10:11:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:11:48 localhost openstack_network_exporter[240658]: ERROR 10:11:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify 
an existing datapath Nov 28 05:11:48 localhost openstack_network_exporter[240658]: Nov 28 05:11:48 localhost openstack_network_exporter[240658]: ERROR 10:11:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 05:11:48 localhost openstack_network_exporter[240658]: Nov 28 05:11:48 localhost nova_compute[279673]: 2025-11-28 10:11:48.414 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:11:48 localhost nova_compute[279673]: 2025-11-28 10:11:48.414 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:11:48 localhost nova_compute[279673]: 2025-11-28 10:11:48.415 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:11:48 localhost nova_compute[279673]: 2025-11-28 10:11:48.415 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 28 05:11:49 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:11:49 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:49 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:11:49 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:11:49 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' 
entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:49 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:11:49 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 28 05:11:49 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:49 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:49 localhost ceph-mon[292954]: 
from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:11:49 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch Nov 28 05:11:49 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:49 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:49 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-459400664", "caps": ["mds", "allow rw path=/volumes/_nogroup/3fe15641-5409-4db6-8856-5687ded3c0e8/de8c16b9-d700-4f01-bec8-e5a6a7967ad5", "osd", "allow rw pool=manila_data namespace=fsvolumens_3fe15641-5409-4db6-8856-5687ded3c0e8", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:11:50 
localhost nova_compute[279673]: 2025-11-28 10:11:50.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:11:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:11:50.848 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:11:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:11:50.849 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:11:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:11:50.849 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:11:51 localhost nova_compute[279673]: 2025-11-28 10:11:51.500 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:11:51 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e276 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:11:51 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e276 do_prune osdmap full prune enabled Nov 28 05:11:51 localhost nova_compute[279673]: 2025-11-28 10:11:51.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] 
Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:11:51 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e277 e277: 6 total, 6 up, 6 in Nov 28 05:11:51 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e277: 6 total, 6 up, 6 in Nov 28 05:11:52 localhost nova_compute[279673]: 2025-11-28 10:11:52.041 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:11:52 localhost nova_compute[279673]: 2025-11-28 10:11:52.043 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:11:52 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0) Nov 28 05:11:52 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 28 05:11:52 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Nov 28 05:11:52 localhost nova_compute[279673]: 2025-11-28 10:11:52.766 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:11:52 localhost nova_compute[279673]: 2025-11-28 10:11:52.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:11:52 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 28 05:11:52 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 28 05:11:52 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 28 05:11:52 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Nov 28 05:11:52 localhost nova_compute[279673]: 2025-11-28 10:11:52.793 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:11:52 localhost nova_compute[279673]: 2025-11-28 10:11:52.794 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:11:52 localhost nova_compute[279673]: 2025-11-28 10:11:52.794 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:11:52 localhost nova_compute[279673]: 2025-11-28 10:11:52.794 279685 
DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 05:11:52 localhost nova_compute[279673]: 2025-11-28 10:11:52.795 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:11:52 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} v 0) Nov 28 05:11:52 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch Nov 28 05:11:52 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"}]': finished Nov 28 05:11:53 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:11:53 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/2162333142' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:11:53 localhost nova_compute[279673]: 2025-11-28 10:11:53.244 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:11:53 localhost nova_compute[279673]: 2025-11-28 10:11:53.317 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 05:11:53 localhost nova_compute[279673]: 2025-11-28 10:11:53.317 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 05:11:53 localhost nova_compute[279673]: 2025-11-28 10:11:53.544 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 05:11:53 localhost nova_compute[279673]: 2025-11-28 10:11:53.546 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11036MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": 
"7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 05:11:53 localhost nova_compute[279673]: 2025-11-28 10:11:53.546 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:11:53 localhost nova_compute[279673]: 2025-11-28 10:11:53.547 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:11:53 localhost nova_compute[279673]: 2025-11-28 10:11:53.624 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 05:11:53 localhost nova_compute[279673]: 2025-11-28 10:11:53.625 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 05:11:53 localhost nova_compute[279673]: 2025-11-28 10:11:53.625 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 05:11:53 localhost nova_compute[279673]: 2025-11-28 10:11:53.703 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:11:53 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch Nov 28 05:11:53 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-459400664", "format": "json"} : dispatch Nov 28 05:11:53 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-459400664"} : dispatch Nov 28 05:11:53 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": 
"client.tempest-cephx-id-459400664"}]': finished Nov 28 05:11:54 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:11:54 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2922934843' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:11:54 localhost nova_compute[279673]: 2025-11-28 10:11:54.186 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:11:54 localhost nova_compute[279673]: 2025-11-28 10:11:54.193 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 05:11:54 localhost nova_compute[279673]: 2025-11-28 10:11:54.210 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 05:11:54 localhost nova_compute[279673]: 2025-11-28 10:11:54.213 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] 
Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 05:11:54 localhost nova_compute[279673]: 2025-11-28 10:11:54.214 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.667s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:11:54 localhost nova_compute[279673]: 2025-11-28 10:11:54.304 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:11:55 localhost nova_compute[279673]: 2025-11-28 10:11:55.215 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:11:55 localhost nova_compute[279673]: 2025-11-28 10:11:55.216 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 28 05:11:55 localhost nova_compute[279673]: 2025-11-28 10:11:55.216 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 28 05:11:56 localhost nova_compute[279673]: 2025-11-28 10:11:56.161 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 05:11:56 localhost nova_compute[279673]: 2025-11-28 10:11:56.161 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 05:11:56 localhost nova_compute[279673]: 2025-11-28 10:11:56.162 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 28 05:11:56 localhost nova_compute[279673]: 2025-11-28 10:11:56.162 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 05:11:56 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:11:56 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:56 localhost 
ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:11:56 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:11:57 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 28 05:11:57 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:57 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:11:57 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw 
pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:11:57 localhost nova_compute[279673]: 2025-11-28 10:11:57.071 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 05:11:57 localhost nova_compute[279673]: 2025-11-28 10:11:57.088 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:11:57 localhost nova_compute[279673]: 2025-11-28 10:11:57.102 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock 
"refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 05:11:57 localhost nova_compute[279673]: 2025-11-28 10:11:57.103 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 28 05:11:57 localhost nova_compute[279673]: 2025-11-28 10:11:57.104 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:11:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 05:11:58 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:11:58.770 261084 INFO neutron.agent.linux.ip_lib [None req-c4fa7785-c232-40f8-a7a8-52415563ca7c - - - - - -] Device tapf3135b0f-56 cannot be used as it has no MAC address#033[00m Nov 28 05:11:58 localhost nova_compute[279673]: 2025-11-28 10:11:58.827 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:11:58 localhost kernel: device tapf3135b0f-56 entered promiscuous mode Nov 28 05:11:58 localhost NetworkManager[5967]: [1764324718.8404] manager: (tapf3135b0f-56): new Generic device (/org/freedesktop/NetworkManager/Devices/75) Nov 28 05:11:58 localhost ovn_controller[152322]: 2025-11-28T10:11:58Z|00474|binding|INFO|Claiming lport f3135b0f-56fc-476c-b4e5-c9e5a120aa9d for this chassis. 
Nov 28 05:11:58 localhost ovn_controller[152322]: 2025-11-28T10:11:58Z|00475|binding|INFO|f3135b0f-56fc-476c-b4e5-c9e5a120aa9d: Claiming unknown Nov 28 05:11:58 localhost podman[329844]: 2025-11-28 10:11:58.843514026 +0000 UTC m=+0.139385035 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, io.openshift.expose-services=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Nov 28 05:11:58 localhost nova_compute[279673]: 2025-11-28 10:11:58.848 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:11:58 localhost systemd-udevd[329870]: Network interface NamePolicy= disabled on kernel command line. 
Nov 28 05:11:58 localhost podman[329844]: 2025-11-28 10:11:58.86470125 +0000 UTC m=+0.160572289 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, managed_by=edpm_ansible, name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that 
uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, build-date=2025-08-20T13:12:41) Nov 28 05:11:58 localhost ovn_controller[152322]: 2025-11-28T10:11:58Z|00476|binding|INFO|Setting lport f3135b0f-56fc-476c-b4e5-c9e5a120aa9d ovn-installed in OVS Nov 28 05:11:58 localhost ovn_controller[152322]: 2025-11-28T10:11:58Z|00477|binding|INFO|Setting lport f3135b0f-56fc-476c-b4e5-c9e5a120aa9d up in Southbound Nov 28 05:11:58 localhost ovn_metadata_agent[158125]: 2025-11-28 10:11:58.868 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-1bac4260-cc00-4d23-940f-68536ef7d308', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1bac4260-cc00-4d23-940f-68536ef7d308', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '729a918c0a8248ff9fef91d8e41e340a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 
'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8a10c622-42c8-4119-8cc0-4b51720ab1bd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f3135b0f-56fc-476c-b4e5-c9e5a120aa9d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:11:58 localhost ovn_metadata_agent[158125]: 2025-11-28 10:11:58.871 158130 INFO neutron.agent.ovn.metadata.agent [-] Port f3135b0f-56fc-476c-b4e5-c9e5a120aa9d in datapath 1bac4260-cc00-4d23-940f-68536ef7d308 bound to our chassis#033[00m Nov 28 05:11:58 localhost ovn_metadata_agent[158125]: 2025-11-28 10:11:58.873 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port 875e6dcc-a188-4fba-a15d-1fa76107968e IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 28 05:11:58 localhost ovn_metadata_agent[158125]: 2025-11-28 10:11:58.873 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1bac4260-cc00-4d23-940f-68536ef7d308, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:11:58 localhost ovn_metadata_agent[158125]: 2025-11-28 10:11:58.875 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[9ee0c22e-bb81-485e-ba6e-24f1a03aa398]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:11:58 localhost nova_compute[279673]: 2025-11-28 10:11:58.877 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:11:58 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 05:11:58 localhost nova_compute[279673]: 2025-11-28 10:11:58.926 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:11:58 localhost nova_compute[279673]: 2025-11-28 10:11:58.944 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:11:59 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0) Nov 28 05:11:59 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 28 05:11:59 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 28 05:11:59 localhost podman[329924]: Nov 28 05:11:59 localhost podman[329924]: 2025-11-28 10:11:59.812567486 +0000 UTC m=+0.093217633 container create c9d4933ed87cdd76f16ea4eea03dac1244f206b74b19e2ac9d80afc2ce1bb998 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1bac4260-cc00-4d23-940f-68536ef7d308, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:11:59 localhost systemd[1]: Started libpod-conmon-c9d4933ed87cdd76f16ea4eea03dac1244f206b74b19e2ac9d80afc2ce1bb998.scope. 
Nov 28 05:11:59 localhost podman[329924]: 2025-11-28 10:11:59.767976822 +0000 UTC m=+0.048626999 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:11:59 localhost systemd[1]: tmp-crun.bB7JiY.mount: Deactivated successfully. Nov 28 05:11:59 localhost systemd[1]: Started libcrun container. Nov 28 05:11:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53803e0e67ab6fc96360e1bca7f403c3373d728f0094543b1385d739dc6ce18a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:11:59 localhost podman[329924]: 2025-11-28 10:11:59.891166978 +0000 UTC m=+0.171817125 container init c9d4933ed87cdd76f16ea4eea03dac1244f206b74b19e2ac9d80afc2ce1bb998 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1bac4260-cc00-4d23-940f-68536ef7d308, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2) Nov 28 05:11:59 localhost podman[329924]: 2025-11-28 10:11:59.900284809 +0000 UTC m=+0.180934956 container start c9d4933ed87cdd76f16ea4eea03dac1244f206b74b19e2ac9d80afc2ce1bb998 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1bac4260-cc00-4d23-940f-68536ef7d308, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:11:59 localhost dnsmasq[329943]: started, version 2.85 cachesize 150 Nov 28 05:11:59 
localhost dnsmasq[329943]: DNS service limited to local subnets Nov 28 05:11:59 localhost dnsmasq[329943]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 05:11:59 localhost dnsmasq[329943]: warning: no upstream servers configured Nov 28 05:11:59 localhost dnsmasq-dhcp[329943]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 28 05:11:59 localhost dnsmasq[329943]: read /var/lib/neutron/dhcp/1bac4260-cc00-4d23-940f-68536ef7d308/addn_hosts - 0 addresses Nov 28 05:11:59 localhost dnsmasq-dhcp[329943]: read /var/lib/neutron/dhcp/1bac4260-cc00-4d23-940f-68536ef7d308/host Nov 28 05:11:59 localhost dnsmasq-dhcp[329943]: read /var/lib/neutron/dhcp/1bac4260-cc00-4d23-940f-68536ef7d308/opts Nov 28 05:11:59 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:11:59.965 261084 INFO neutron.agent.dhcp.agent [None req-8f135ad8-bb8b-4fce-a1d1-2ae3b356bcc8 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:11:59Z, description=, device_id=c2d60d6f-83fc-4648-964a-020aeb44c54e, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=bcda3fd1-8481-4078-8919-c9d7f3e1c8a6, ip_allocation=immediate, mac_address=fa:16:3e:a4:e8:4a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:11:56Z, description=, dns_domain=, id=1bac4260-cc00-4d23-940f-68536ef7d308, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PrometheusGabbiTest-1059272471-network, port_security_enabled=True, project_id=729a918c0a8248ff9fef91d8e41e340a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=40620, qos_policy_id=None, 
revision_number=2, router:external=False, shared=False, standard_attr_id=3688, status=ACTIVE, subnets=['a3e77d57-25ec-4c92-b11d-6f3e73fc2f1e'], tags=[], tenant_id=729a918c0a8248ff9fef91d8e41e340a, updated_at=2025-11-28T10:11:57Z, vlan_transparent=None, network_id=1bac4260-cc00-4d23-940f-68536ef7d308, port_security_enabled=False, project_id=729a918c0a8248ff9fef91d8e41e340a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3696, status=DOWN, tags=[], tenant_id=729a918c0a8248ff9fef91d8e41e340a, updated_at=2025-11-28T10:11:59Z on network 1bac4260-cc00-4d23-940f-68536ef7d308#033[00m Nov 28 05:12:00 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 28 05:12:00 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 28 05:12:00 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 28 05:12:00 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 28 05:12:00 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:12:00.141 261084 INFO neutron.agent.dhcp.agent [None req-7f83153c-7f78-4107-98a3-b8ea12f1b2bb - - - - - -] DHCP configuration for ports {'d81fcc1a-8a98-4599-b518-924ca188dfd4'} is completed#033[00m Nov 28 05:12:00 localhost dnsmasq[329943]: read /var/lib/neutron/dhcp/1bac4260-cc00-4d23-940f-68536ef7d308/addn_hosts - 1 addresses Nov 28 05:12:00 localhost dnsmasq-dhcp[329943]: read /var/lib/neutron/dhcp/1bac4260-cc00-4d23-940f-68536ef7d308/host Nov 28 05:12:00 localhost dnsmasq-dhcp[329943]: read /var/lib/neutron/dhcp/1bac4260-cc00-4d23-940f-68536ef7d308/opts 
Nov 28 05:12:00 localhost podman[329961]: 2025-11-28 10:12:00.321468807 +0000 UTC m=+0.063237860 container kill c9d4933ed87cdd76f16ea4eea03dac1244f206b74b19e2ac9d80afc2ce1bb998 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1bac4260-cc00-4d23-940f-68536ef7d308, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS) Nov 28 05:12:00 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:12:00.603 261084 INFO neutron.agent.dhcp.agent [None req-a33d74a0-428d-4911-848e-e73f0d9734f7 - - - - - -] DHCP configuration for ports {'bcda3fd1-8481-4078-8919-c9d7f3e1c8a6'} is completed#033[00m Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.676 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.677 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of 
pollsters Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.705 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.705 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c8222651-4b3f-4f2d-9ce0-38e10022865c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:12:00.677867', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 
'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'adf74582-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.84976793, 'message_signature': 'eb539f1025d60ed666c71b324c9066b67e062c8b67322c5a1f56fbe778e5134f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:12:00.677867', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'adf75d88-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.84976793, 'message_signature': 'cea35c50aba4eae65689a9dd0d36142a335f389c146cdde2dbb86a143ed1e898'}]}, 'timestamp': '2025-11-28 10:12:00.706520', '_unique_id': 'bdb0e93d7bc7434886ac353dd079bb58'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in 
establish_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:12:00.708 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.708 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.709 12 INFO 
ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Nov 28 05:12:00 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:12:00.721 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:11:59Z, description=, device_id=c2d60d6f-83fc-4648-964a-020aeb44c54e, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=bcda3fd1-8481-4078-8919-c9d7f3e1c8a6, ip_allocation=immediate, mac_address=fa:16:3e:a4:e8:4a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:11:56Z, description=, dns_domain=, id=1bac4260-cc00-4d23-940f-68536ef7d308, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PrometheusGabbiTest-1059272471-network, port_security_enabled=True, project_id=729a918c0a8248ff9fef91d8e41e340a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=40620, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3688, status=ACTIVE, subnets=['a3e77d57-25ec-4c92-b11d-6f3e73fc2f1e'], tags=[], tenant_id=729a918c0a8248ff9fef91d8e41e340a, updated_at=2025-11-28T10:11:57Z, vlan_transparent=None, network_id=1bac4260-cc00-4d23-940f-68536ef7d308, port_security_enabled=False, project_id=729a918c0a8248ff9fef91d8e41e340a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3696, status=DOWN, tags=[], tenant_id=729a918c0a8248ff9fef91d8e41e340a, updated_at=2025-11-28T10:11:59Z on network 1bac4260-cc00-4d23-940f-68536ef7d308#033[00m Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.728 12 DEBUG ceilometer.compute.pollsters [-] 
c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 18870000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a2cffca0-664d-4aff-8554-a281e8f39ff3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18870000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:12:00.710100', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'adfae2d2-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.900646917, 'message_signature': 'a723426fe68d7ea7f0f3229e9877c5d517c184e52713a96c68be562f158c6a12'}]}, 'timestamp': '2025-11-28 10:12:00.729632', '_unique_id': '0ca08b7e0e24427390c2e2216371b462'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.730 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.732 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.732 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.735 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.737 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '908811aa-4974-4c23-b4bf-ee21e6f8c6d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:12:00.732450', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'adfbd8f4-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.904353422, 'message_signature': 'f3c1b7ed1afa3c39e0ee274293a30e3f120e32b6a5a920162c07b841c761efa4'}]}, 'timestamp': '2025-11-28 10:12:00.735916', '_unique_id': 'ed664a15af2d4286a5557f69a71e01c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:12:00 localhost
ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.738 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.738 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.740 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b72240d-7ab5-416b-8750-0886ecf43113', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:12:00.738623', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'adfc55c2-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.904353422, 'message_signature': '770e5e609bba2d4ceea668c4490198db0c5046e6aa500ad898a86def9490bd07'}]}, 'timestamp': '2025-11-28 10:12:00.739161', '_unique_id': '8b2ddb0ec78144c493108e992f7b2177'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.741 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.741 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'c6865841-1b5d-42ef-83e2-12589a12557a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:12:00.741833', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'adfcd560-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.904353422, 'message_signature': 'ad92057203af800587dbe9ff4842236dabd12bec5bf6c5dcdd88d7f808164208'}]}, 'timestamp': '2025-11-28 10:12:00.742374', '_unique_id': '34a41996dd2c4d84b3a474d3904cb626'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:12:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.743 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.744 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.745 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19fad7fe-1959-40c0-9c23-30175d69097e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:12:00.745179', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'adfd5620-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.904353422, 'message_signature': 'e053d5f75d7c52e9ae395fd65f78a75ed8c77f852c1ce63228036f6d64ce2668'}]}, 'timestamp': '2025-11-28 10:12:00.745751', '_unique_id': '6aed1976ed504acb9871fb8fbb0f8360'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:12:00.746 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:12:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:12:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.746 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.747 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.748 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.759 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.760 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e52de5b8-65b4-4355-b3c2-d938222eb514', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:12:00.748256', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'adff86ca-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.920153228, 'message_signature': '6819390b05057e84ec68e6a99a7a6aec8cc63bba761177fdefe842357788a680'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:12:00.748256', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 
'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'adff9a3e-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.920153228, 'message_signature': '0c2f590b69b7dd9aef77d7c3882a6fb95d218118ee1b9a45258073095ce20ec4'}]}, 'timestamp': '2025-11-28 10:12:00.760560', '_unique_id': 'f81dedf2d806441892e04ab8b367b1bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:12:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:12:00.761 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:12:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:12:00.761 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.761 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.762 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.763 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c9ca7edd-dc05-448a-9e6c-9b8c1dcc1ba2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:12:00.763119', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'ae0012d4-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.904353422, 'message_signature': '5a8e5d77748387904ab037fca8c15e251077cc65982aac4de44123f35049a602'}]}, 'timestamp': '2025-11-28 10:12:00.763601', '_unique_id': '0fe693c0af6742b6aab4983be1471de3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:12:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:12:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.765 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.766 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.766 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.767 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd7f075a7-8e6d-4790-bebd-1d6ef905c41a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:12:00.766513', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ae009704-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.84976793, 'message_signature': '4a9372c0e7ca3a6fa293e144c2c0a60ddbfe7447fab835b4acc463d20ea9af5f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:12:00.766513', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ae00ad70-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.84976793, 'message_signature': '833103ade7dcb0f7b56882d5114e30750029b0381979faff5532becd131517fd'}]}, 'timestamp': '2025-11-28 10:12:00.767528', '_unique_id': '6113028193ae42ee8bfb76a1086e48d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:12:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.768 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.769 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.770 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.770 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44df7095-d35f-4425-ba63-73b5651f6a5c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:12:00.769983', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ae01211a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.920153228, 'message_signature': '175e85b6958518ceda7ebd1c3d0aad207adb8275b892ef9d3f67564aef8615c8'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:12:00.769983', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ae0131c8-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.920153228, 'message_signature': '7ca87da8c69be1ed33ae93ebeaf5b8176a04de428e6d69d70406ead10344a1de'}]}, 'timestamp': '2025-11-28 10:12:00.770913', '_unique_id': '63f9b24eb39d42efbe7f74f7eddd6d80'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.771 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.773 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.773 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5d4391ef-266a-4a2a-bfaf-840d0a7c48e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:12:00.773216', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'ae019cee-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.904353422, 'message_signature': 'd7425e312d571964a8459f68514aa947c3997d96ebb1b34c2185cf7b2bc30778'}]}, 'timestamp': '2025-11-28 10:12:00.773687', '_unique_id': 'e0e483c4774848aa85d644dd36d45381'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.774 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.775 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.775 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '01867545-9820-49fd-a7d9-2732570c5ba5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:12:00.775850', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'ae0204c2-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.904353422, 'message_signature': 'ba4fece42357cd5c1ad7a1582e500764ca13d814b4d29debb2dc40ef0645b9c9'}]}, 'timestamp': '2025-11-28 10:12:00.776347', '_unique_id': '9e5579a56a7b49059323d058ba85520d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.777 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.778 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.778 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '97a21ad8-b72b-4fa4-b10d-2da2e12df742', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:12:00.778532', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'ae026c96-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.904353422, 'message_signature': 'b80d03137edfb7685398709824c5742fafb9d6bdad06ae01ea1ea9480bf8162f'}]}, 'timestamp': '2025-11-28 10:12:00.779004', '_unique_id': 'a513b694b35b4d83bb8a36732df1c733'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:12:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:12:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.779 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.781 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.781 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.781 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c1d1024b-f0b2-4a8c-a291-f091c2050b32', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:12:00.781169', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ae02d014-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.84976793, 'message_signature': '870355561135e7353bc43eb0388d26374cc75fa26cccbe4cee8fa991b76d6741'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:12:00.781169', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ae02da5a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.84976793, 'message_signature': 'c18c27627142a5bfbea8ce7d3cb54be6fed76406afdc30bb7cb89df4d2a81c0f'}]}, 'timestamp': '2025-11-28 10:12:00.781708', '_unique_id': '295e12d31b9c42ac942265c94c02fef8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:12:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:12:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.782 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.783 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.783 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 51.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'eed5f2b9-7982-4b53-9a8e-4fccad2fbd9f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.671875, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:12:00.783190', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'ae031eca-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.900646917, 'message_signature': 'f7f28cff1e07dbaebc6059d369911bd781e8ed4f6fd77f100550d5dc17d2ed2c'}]}, 'timestamp': '2025-11-28 10:12:00.783467', '_unique_id': 'edd3c134ad7748d0b73f51351421b028'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 
05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:12:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 28 05:12:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.784 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4bc3a9bf-4d20-41bc-b6de-669f73717003', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:12:00.784785', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'ae035d5e-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.904353422, 
'message_signature': '4bf457d53e828a66f29f0b8166efe5ac5d5811685d282befdfc88d074774c7af'}]}, 'timestamp': '2025-11-28 10:12:00.785105', '_unique_id': '3f5cd9ecba4b4f15b46322ed10507431'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:12:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:12:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.785 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.786 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.786 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.786 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5cd9783d-bc38-4036-b293-844f48de0f5a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:12:00.786476', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ae039f44-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.920153228, 'message_signature': 'de96cfc28c470a39d418e8c093bb1ca438b0e72c38cf8b63e0b357f9649f5937'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:12:00.786476', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 
'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ae03a93a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.920153228, 'message_signature': '67b7c4ee0230fbee1444e5a98fb835df35b8964edad5f0c3a2f291c939284c32'}]}, 'timestamp': '2025-11-28 10:12:00.787003', '_unique_id': 'c2fc3bda007f41f493d12dde39edcdfd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.787 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.788 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.788 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da204656-c739-4793-8d2b-4f1488f509c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:12:00.788404', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'ae03eabc-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.904353422, 'message_signature': '1882b6e646512a401a622c0915963f7d4fb80afe9cc74465028c0b8d0d2f1fa2'}]}, 'timestamp': '2025-11-28 10:12:00.788718', '_unique_id': '296395c5f14c43d4bb5ecc5d17666c86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.789 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.790 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.790 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 1124743927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.790 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 22218411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '37f4adf5-f5c7-4cfa-b29a-c657102ae708', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1124743927, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:12:00.790218', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ae043198-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.84976793, 'message_signature': 'a7a4816e418c86249a6a877e29334beda793c046b0a7dac405a77202ae974bae'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22218411, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:12:00.790218', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ae043ba2-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.84976793, 'message_signature': 'cb96d6492e1a8bb875417a89ec150933b1cda85252e735dd4319b8357ef782c4'}]}, 'timestamp': '2025-11-28 10:12:00.790751', '_unique_id': '25ca6f66f0a349de8e8eefb87d49bd4f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.791 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.792 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.792 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.792 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59d952e6-6bd2-4071-b113-f08d4688765b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:12:00.792214', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ae047f9a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.84976793, 'message_signature': 'eca015218ef70e69391b5ec50d36875fc7f3dbbd8091655cba3aa1c821c621fb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:12:00.792214', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ae048c6a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.84976793, 'message_signature': 'd8bb8ea6f0f772059aa1280094d6d606c6b585715b20f25b8caf840463c58f97'}]}, 'timestamp': '2025-11-28 10:12:00.792824', '_unique_id': '0e15d9c9ec8b4c3d803f06ae863d8177'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:12:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:12:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.793 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.794 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.794 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1439344424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.794 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 63315481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:12:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a6c3683-c6a0-4ec0-bc67-305b185c68a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1439344424, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:12:00.794269', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ae04cfae-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.84976793, 'message_signature': '0c04899b68a716087af260afd8f88d6e6900957b64b637afe8cf02bac02db875'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63315481, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 
'timestamp': '2025-11-28T10:12:00.794269', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ae04da26-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12554.84976793, 'message_signature': '019c15abb402b855fdd9fa74f28eb7121edcc069c7741a4cb888be0e79c56bc4'}]}, 'timestamp': '2025-11-28 10:12:00.794809', '_unique_id': '85fe3616559b40c1a3c21cd2297dacbe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging return 
retry_over_time( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.795 12 ERROR oslo_messaging.notify.messaging Nov 28 05:12:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:12:00.796 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 05:12:00 localhost dnsmasq[329943]: read /var/lib/neutron/dhcp/1bac4260-cc00-4d23-940f-68536ef7d308/addn_hosts - 1 addresses Nov 28 05:12:00 localhost dnsmasq-dhcp[329943]: read /var/lib/neutron/dhcp/1bac4260-cc00-4d23-940f-68536ef7d308/host Nov 28 05:12:00 localhost dnsmasq-dhcp[329943]: read /var/lib/neutron/dhcp/1bac4260-cc00-4d23-940f-68536ef7d308/opts Nov 28 05:12:00 localhost podman[329998]: 2025-11-28 
10:12:00.969157934 +0000 UTC m=+0.063303291 container kill c9d4933ed87cdd76f16ea4eea03dac1244f206b74b19e2ac9d80afc2ce1bb998 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1bac4260-cc00-4d23-940f-68536ef7d308, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Nov 28 05:12:01 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e277 do_prune osdmap full prune enabled Nov 28 05:12:01 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e278 e278: 6 total, 6 up, 6 in Nov 28 05:12:01 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e278: 6 total, 6 up, 6 in Nov 28 05:12:01 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:12:01.245 261084 INFO neutron.agent.dhcp.agent [None req-9c7016f0-fd36-48f0-bfe7-253cb23f517c - - - - - -] DHCP configuration for ports {'bcda3fd1-8481-4078-8919-c9d7f3e1c8a6'} is completed#033[00m Nov 28 05:12:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 05:12:01 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:12:01 localhost systemd[1]: tmp-crun.UOkQTy.mount: Deactivated successfully. 
Nov 28 05:12:01 localhost podman[330019]: 2025-11-28 10:12:01.822322953 +0000 UTC m=+0.058069160 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 05:12:01 localhost podman[330019]: 2025-11-28 10:12:01.856482375 +0000 UTC m=+0.092228572 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 28 05:12:01 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 05:12:02 localhost nova_compute[279673]: 2025-11-28 10:12:02.092 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:12:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 05:12:02 localhost podman[330042]: 2025-11-28 10:12:02.852194395 +0000 UTC m=+0.088601540 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', 
'/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125) Nov 28 05:12:02 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:12:02 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:12:02 localhost podman[330042]: 2025-11-28 10:12:02.886590496 +0000 UTC m=+0.122997631 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Nov 28 05:12:02 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:12:02 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 05:12:03 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 28 05:12:03 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:12:03 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:12:03 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:12:03 localhost nova_compute[279673]: 2025-11-28 10:12:03.655 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:12:05 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e278 do_prune osdmap full prune enabled Nov 28 05:12:05 localhost ceph-mon[292954]: 
mon.np0005538513@0(leader).osd e279 e279: 6 total, 6 up, 6 in Nov 28 05:12:05 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e279: 6 total, 6 up, 6 in Nov 28 05:12:06 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0) Nov 28 05:12:06 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 28 05:12:06 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 28 05:12:06 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e279 do_prune osdmap full prune enabled Nov 28 05:12:06 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 28 05:12:06 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 28 05:12:06 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 28 05:12:06 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 28 05:12:06 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e280 e280: 6 total, 6 up, 6 in Nov 28 05:12:06 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e280: 6 total, 6 up, 6 in Nov 28 05:12:06 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e280 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:12:07 localhost nova_compute[279673]: 2025-11-28 
10:12:07.095 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:12:07 localhost nova_compute[279673]: 2025-11-28 10:12:07.097 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:12:07 localhost nova_compute[279673]: 2025-11-28 10:12:07.097 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 28 05:12:07 localhost nova_compute[279673]: 2025-11-28 10:12:07.097 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:12:07 localhost nova_compute[279673]: 2025-11-28 10:12:07.140 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:12:07 localhost nova_compute[279673]: 2025-11-28 10:12:07.141 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:12:07 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : mgrmap e52: np0005538515.yfkzhl(active, since 14m), standbys: np0005538513.dsfdlx, np0005538514.djozup Nov 28 05:12:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 05:12:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. 
Nov 28 05:12:07 localhost podman[330061]: 2025-11-28 10:12:07.844540134 +0000 UTC m=+0.080234994 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Nov 28 05:12:07 localhost podman[330061]: 2025-11-28 10:12:07.8603255 +0000 UTC m=+0.096020390 container exec_died 
1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute) Nov 28 05:12:07 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 05:12:07 localhost podman[330062]: 2025-11-28 10:12:07.951310774 +0000 UTC m=+0.181191424 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Nov 28 05:12:08 localhost podman[330062]: 2025-11-28 10:12:08.018620518 +0000 UTC m=+0.248501158 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3) Nov 28 05:12:08 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 05:12:09 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:12:09 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:12:09 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:12:09 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 28 05:12:09 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:12:09 
localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:12:09 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:12:10 localhost podman[238687]: time="2025-11-28T10:12:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:12:10 localhost podman[238687]: @ - - [28/Nov/2025:10:12:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157511 "" "Go-http-client/1.1" Nov 28 05:12:10 localhost podman[238687]: @ - - [28/Nov/2025:10:12:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19755 "" "Go-http-client/1.1" Nov 28 05:12:10 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e280 do_prune osdmap full prune enabled Nov 28 05:12:10 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e281 e281: 6 total, 6 up, 6 in Nov 28 05:12:10 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e281: 6 total, 6 up, 6 in Nov 28 05:12:11 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:12:11 localhost 
ceph-mon[292954]: mon.np0005538513@0(leader).osd e281 do_prune osdmap full prune enabled Nov 28 05:12:11 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e282 e282: 6 total, 6 up, 6 in Nov 28 05:12:11 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e282: 6 total, 6 up, 6 in Nov 28 05:12:12 localhost ovn_controller[152322]: 2025-11-28T10:12:12Z|00478|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:12:12 localhost nova_compute[279673]: 2025-11-28 10:12:12.127 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:12:12 localhost nova_compute[279673]: 2025-11-28 10:12:12.140 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:12:12 localhost nova_compute[279673]: 2025-11-28 10:12:12.143 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:12:12 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0) Nov 28 05:12:12 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 28 05:12:12 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Nov 28 05:12:13 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 28 05:12:13 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": 
"client.alice_bob"} : dispatch Nov 28 05:12:13 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 28 05:12:13 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Nov 28 05:12:13 localhost dnsmasq[329943]: read /var/lib/neutron/dhcp/1bac4260-cc00-4d23-940f-68536ef7d308/addn_hosts - 0 addresses Nov 28 05:12:13 localhost podman[330123]: 2025-11-28 10:12:13.906238678 +0000 UTC m=+0.065817618 container kill c9d4933ed87cdd76f16ea4eea03dac1244f206b74b19e2ac9d80afc2ce1bb998 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1bac4260-cc00-4d23-940f-68536ef7d308, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Nov 28 05:12:13 localhost dnsmasq-dhcp[329943]: read /var/lib/neutron/dhcp/1bac4260-cc00-4d23-940f-68536ef7d308/host Nov 28 05:12:13 localhost dnsmasq-dhcp[329943]: read /var/lib/neutron/dhcp/1bac4260-cc00-4d23-940f-68536ef7d308/opts Nov 28 05:12:14 localhost kernel: device tapf3135b0f-56 left promiscuous mode Nov 28 05:12:14 localhost ovn_controller[152322]: 2025-11-28T10:12:14Z|00479|binding|INFO|Releasing lport f3135b0f-56fc-476c-b4e5-c9e5a120aa9d from this chassis (sb_readonly=0) Nov 28 05:12:14 localhost ovn_controller[152322]: 2025-11-28T10:12:14Z|00480|binding|INFO|Setting lport f3135b0f-56fc-476c-b4e5-c9e5a120aa9d down in Southbound Nov 28 05:12:14 localhost nova_compute[279673]: 2025-11-28 10:12:14.241 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:12:14 localhost nova_compute[279673]: 2025-11-28 10:12:14.260 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:12:14 localhost ovn_metadata_agent[158125]: 2025-11-28 10:12:14.268 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-1bac4260-cc00-4d23-940f-68536ef7d308', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1bac4260-cc00-4d23-940f-68536ef7d308', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '729a918c0a8248ff9fef91d8e41e340a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8a10c622-42c8-4119-8cc0-4b51720ab1bd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f3135b0f-56fc-476c-b4e5-c9e5a120aa9d) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:12:14 localhost ovn_metadata_agent[158125]: 2025-11-28 10:12:14.269 158130 INFO neutron.agent.ovn.metadata.agent [-] Port f3135b0f-56fc-476c-b4e5-c9e5a120aa9d in datapath 1bac4260-cc00-4d23-940f-68536ef7d308 unbound from our 
chassis#033[00m Nov 28 05:12:14 localhost ovn_metadata_agent[158125]: 2025-11-28 10:12:14.271 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1bac4260-cc00-4d23-940f-68536ef7d308, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:12:14 localhost ovn_metadata_agent[158125]: 2025-11-28 10:12:14.271 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[6cfbc432-3b9c-4d99-818a-652a8ce2cb69]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:12:15 localhost ovn_controller[152322]: 2025-11-28T10:12:15Z|00481|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:12:15 localhost nova_compute[279673]: 2025-11-28 10:12:15.500 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:12:15 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e282 do_prune osdmap full prune enabled Nov 28 05:12:15 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e283 e283: 6 total, 6 up, 6 in Nov 28 05:12:15 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e283: 6 total, 6 up, 6 in Nov 28 05:12:15 localhost systemd[1]: tmp-crun.XCF4Gz.mount: Deactivated successfully. 
Nov 28 05:12:15 localhost dnsmasq[329943]: exiting on receipt of SIGTERM Nov 28 05:12:15 localhost podman[330161]: 2025-11-28 10:12:15.932183514 +0000 UTC m=+0.084276089 container kill c9d4933ed87cdd76f16ea4eea03dac1244f206b74b19e2ac9d80afc2ce1bb998 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1bac4260-cc00-4d23-940f-68536ef7d308, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:12:15 localhost systemd[1]: libpod-c9d4933ed87cdd76f16ea4eea03dac1244f206b74b19e2ac9d80afc2ce1bb998.scope: Deactivated successfully. Nov 28 05:12:16 localhost podman[330175]: 2025-11-28 10:12:16.01452102 +0000 UTC m=+0.068097049 container died c9d4933ed87cdd76f16ea4eea03dac1244f206b74b19e2ac9d80afc2ce1bb998 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1bac4260-cc00-4d23-940f-68536ef7d308, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 28 05:12:16 localhost podman[330175]: 2025-11-28 10:12:16.057272677 +0000 UTC m=+0.110848676 container cleanup c9d4933ed87cdd76f16ea4eea03dac1244f206b74b19e2ac9d80afc2ce1bb998 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1bac4260-cc00-4d23-940f-68536ef7d308, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true) Nov 28 05:12:16 localhost systemd[1]: libpod-conmon-c9d4933ed87cdd76f16ea4eea03dac1244f206b74b19e2ac9d80afc2ce1bb998.scope: Deactivated successfully. Nov 28 05:12:16 localhost podman[330177]: 2025-11-28 10:12:16.10601147 +0000 UTC m=+0.150204540 container remove c9d4933ed87cdd76f16ea4eea03dac1244f206b74b19e2ac9d80afc2ce1bb998 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1bac4260-cc00-4d23-940f-68536ef7d308, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3) Nov 28 05:12:16 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:12:16.136 261084 INFO neutron.agent.dhcp.agent [None req-f2b95ac2-9a28-41e2-ba92-89f709e8016d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:12:16 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:12:16.139 261084 INFO neutron.agent.dhcp.agent [None req-f2b95ac2-9a28-41e2-ba92-89f709e8016d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:12:16 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:12:16 localhost ceph-mon[292954]: 
log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:12:16 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:12:16 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e283 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:12:16 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e283 do_prune osdmap full prune enabled Nov 28 05:12:16 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e284 e284: 6 total, 6 up, 6 in Nov 28 05:12:16 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e284: 6 total, 6 up, 6 in Nov 28 05:12:16 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 28 05:12:16 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], 
"format": "json"} : dispatch Nov 28 05:12:16 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:12:16 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:12:16 localhost systemd[1]: var-lib-containers-storage-overlay-53803e0e67ab6fc96360e1bca7f403c3373d728f0094543b1385d739dc6ce18a-merged.mount: Deactivated successfully. Nov 28 05:12:16 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c9d4933ed87cdd76f16ea4eea03dac1244f206b74b19e2ac9d80afc2ce1bb998-userdata-shm.mount: Deactivated successfully. Nov 28 05:12:16 localhost systemd[1]: run-netns-qdhcp\x2d1bac4260\x2dcc00\x2d4d23\x2d940f\x2d68536ef7d308.mount: Deactivated successfully. Nov 28 05:12:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 05:12:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. 
Nov 28 05:12:17 localhost podman[330203]: 2025-11-28 10:12:17.050693188 +0000 UTC m=+0.102781238 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 05:12:17 localhost podman[330203]: 2025-11-28 10:12:17.063392139 +0000 UTC m=+0.115480239 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 05:12:17 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 05:12:17 localhost podman[330204]: 2025-11-28 10:12:17.17767497 +0000 UTC m=+0.222444044 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Nov 28 05:12:17 localhost nova_compute[279673]: 2025-11-28 10:12:17.179 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:12:17 localhost podman[330204]: 2025-11-28 10:12:17.192325422 +0000 UTC m=+0.237094446 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 28 05:12:17 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated 
successfully. Nov 28 05:12:18 localhost openstack_network_exporter[240658]: ERROR 10:12:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 05:12:18 localhost openstack_network_exporter[240658]: ERROR 10:12:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:12:18 localhost openstack_network_exporter[240658]: ERROR 10:12:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:12:18 localhost openstack_network_exporter[240658]: ERROR 10:12:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 05:12:18 localhost openstack_network_exporter[240658]: Nov 28 05:12:18 localhost openstack_network_exporter[240658]: ERROR 10:12:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 05:12:18 localhost openstack_network_exporter[240658]: Nov 28 05:12:19 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0) Nov 28 05:12:19 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 28 05:12:19 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Nov 28 05:12:20 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 28 05:12:20 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 28 05:12:20 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' 
entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 28 05:12:20 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Nov 28 05:12:21 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:12:21 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e284 do_prune osdmap full prune enabled Nov 28 05:12:21 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e285 e285: 6 total, 6 up, 6 in Nov 28 05:12:21 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e285: 6 total, 6 up, 6 in Nov 28 05:12:22 localhost nova_compute[279673]: 2025-11-28 10:12:22.183 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:12:22 localhost nova_compute[279673]: 2025-11-28 10:12:22.186 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:12:22 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:12:22.266 261084 INFO neutron.agent.linux.ip_lib [None req-8117c4f7-4e76-4baf-9275-077c0fb7af16 - - - - - -] Device tapdf88dd7c-43 cannot be used as it has no MAC address#033[00m Nov 28 05:12:22 localhost nova_compute[279673]: 2025-11-28 10:12:22.294 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:12:22 localhost kernel: device tapdf88dd7c-43 entered promiscuous mode Nov 28 05:12:22 localhost ovn_controller[152322]: 2025-11-28T10:12:22Z|00482|binding|INFO|Claiming lport df88dd7c-4397-4078-b7e1-7fbf48d503b7 for this chassis. 
Nov 28 05:12:22 localhost NetworkManager[5967]: [1764324742.3045] manager: (tapdf88dd7c-43): new Generic device (/org/freedesktop/NetworkManager/Devices/76) Nov 28 05:12:22 localhost ovn_controller[152322]: 2025-11-28T10:12:22Z|00483|binding|INFO|df88dd7c-4397-4078-b7e1-7fbf48d503b7: Claiming unknown Nov 28 05:12:22 localhost nova_compute[279673]: 2025-11-28 10:12:22.302 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:12:22 localhost systemd-udevd[330254]: Network interface NamePolicy= disabled on kernel command line. Nov 28 05:12:22 localhost ovn_metadata_agent[158125]: 2025-11-28 10:12:22.319 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-36d3914b-0866-44d3-8d61-e3876a797a40', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-36d3914b-0866-44d3-8d61-e3876a797a40', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a5eaeee8da34a54a7b5240b18a0f9b2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb560543-1290-4f49-91f1-518ecd990906, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=df88dd7c-4397-4078-b7e1-7fbf48d503b7) old=Port_Binding(chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:12:22 localhost ovn_metadata_agent[158125]: 2025-11-28 10:12:22.321 158130 INFO neutron.agent.ovn.metadata.agent [-] Port df88dd7c-4397-4078-b7e1-7fbf48d503b7 in datapath 36d3914b-0866-44d3-8d61-e3876a797a40 bound to our chassis#033[00m Nov 28 05:12:22 localhost ovn_metadata_agent[158125]: 2025-11-28 10:12:22.323 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Port bd3edd05-3839-4526-8e12-85548b39b0b0 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Nov 28 05:12:22 localhost ovn_metadata_agent[158125]: 2025-11-28 10:12:22.323 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 36d3914b-0866-44d3-8d61-e3876a797a40, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:12:22 localhost ovn_metadata_agent[158125]: 2025-11-28 10:12:22.324 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[1f4a7576-9528-4c42-aa0a-893bb05c7376]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:12:22 localhost journal[227875]: ethtool ioctl error on tapdf88dd7c-43: No such device Nov 28 05:12:22 localhost ovn_controller[152322]: 2025-11-28T10:12:22Z|00484|binding|INFO|Setting lport df88dd7c-4397-4078-b7e1-7fbf48d503b7 ovn-installed in OVS Nov 28 05:12:22 localhost ovn_controller[152322]: 2025-11-28T10:12:22Z|00485|binding|INFO|Setting lport df88dd7c-4397-4078-b7e1-7fbf48d503b7 up in Southbound Nov 28 05:12:22 localhost journal[227875]: ethtool ioctl error on tapdf88dd7c-43: No such device Nov 28 05:12:22 localhost nova_compute[279673]: 2025-11-28 10:12:22.341 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m 
Nov 28 05:12:22 localhost journal[227875]: ethtool ioctl error on tapdf88dd7c-43: No such device Nov 28 05:12:22 localhost journal[227875]: ethtool ioctl error on tapdf88dd7c-43: No such device Nov 28 05:12:22 localhost journal[227875]: ethtool ioctl error on tapdf88dd7c-43: No such device Nov 28 05:12:22 localhost journal[227875]: ethtool ioctl error on tapdf88dd7c-43: No such device Nov 28 05:12:22 localhost journal[227875]: ethtool ioctl error on tapdf88dd7c-43: No such device Nov 28 05:12:22 localhost journal[227875]: ethtool ioctl error on tapdf88dd7c-43: No such device Nov 28 05:12:22 localhost nova_compute[279673]: 2025-11-28 10:12:22.382 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:12:22 localhost nova_compute[279673]: 2025-11-28 10:12:22.413 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:12:22 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e285 do_prune osdmap full prune enabled Nov 28 05:12:22 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e286 e286: 6 total, 6 up, 6 in Nov 28 05:12:22 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e286: 6 total, 6 up, 6 in Nov 28 05:12:22 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:12:22 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw 
path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:12:22 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:12:23 localhost podman[330324]: Nov 28 05:12:23 localhost podman[330324]: 2025-11-28 10:12:23.32576063 +0000 UTC m=+0.090351515 container create b902b2a7c8a241f32dcf0d24da2871381cabfe7d2ab173a419d9f02e9d5d8d2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-36d3914b-0866-44d3-8d61-e3876a797a40, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125) Nov 28 05:12:23 localhost podman[330324]: 2025-11-28 10:12:23.281567178 +0000 UTC m=+0.046158083 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Nov 28 05:12:23 localhost nova_compute[279673]: 2025-11-28 10:12:23.385 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:12:23 localhost systemd[1]: Started libpod-conmon-b902b2a7c8a241f32dcf0d24da2871381cabfe7d2ab173a419d9f02e9d5d8d2f.scope. 
Nov 28 05:12:23 localhost systemd[1]: Started libcrun container. Nov 28 05:12:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/398491bd3154902fd56e5c19ffa0b65563dff195d8ba193f4ec0c429240248ab/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Nov 28 05:12:23 localhost podman[330324]: 2025-11-28 10:12:23.432698305 +0000 UTC m=+0.197289190 container init b902b2a7c8a241f32dcf0d24da2871381cabfe7d2ab173a419d9f02e9d5d8d2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-36d3914b-0866-44d3-8d61-e3876a797a40, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125) Nov 28 05:12:23 localhost podman[330324]: 2025-11-28 10:12:23.447981895 +0000 UTC m=+0.212572790 container start b902b2a7c8a241f32dcf0d24da2871381cabfe7d2ab173a419d9f02e9d5d8d2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-36d3914b-0866-44d3-8d61-e3876a797a40, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 28 05:12:23 localhost dnsmasq[330342]: started, version 2.85 cachesize 150 Nov 28 05:12:23 localhost dnsmasq[330342]: DNS service limited to local subnets Nov 28 05:12:23 localhost dnsmasq[330342]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Nov 28 
05:12:23 localhost dnsmasq[330342]: warning: no upstream servers configured Nov 28 05:12:23 localhost dnsmasq-dhcp[330342]: DHCP, static leases only on 10.100.0.0, lease time 1d Nov 28 05:12:23 localhost dnsmasq[330342]: read /var/lib/neutron/dhcp/36d3914b-0866-44d3-8d61-e3876a797a40/addn_hosts - 0 addresses Nov 28 05:12:23 localhost dnsmasq-dhcp[330342]: read /var/lib/neutron/dhcp/36d3914b-0866-44d3-8d61-e3876a797a40/host Nov 28 05:12:23 localhost dnsmasq-dhcp[330342]: read /var/lib/neutron/dhcp/36d3914b-0866-44d3-8d61-e3876a797a40/opts Nov 28 05:12:23 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:12:23.588 261084 INFO neutron.agent.dhcp.agent [None req-d484a65a-4241-4462-85e1-84ab3923c5e7 - - - - - -] DHCP configuration for ports {'ee15ef05-e21c-4932-8e4b-a1142faa7a40'} is completed#033[00m Nov 28 05:12:23 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 28 05:12:23 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:12:23 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:12:23 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:12:24 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:12:24.239 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:12:23Z, description=, device_id=a2e46648-c204-4aa4-852a-22559e830378, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=92b58e02-a654-4139-b6b8-8d7bd52e9ded, ip_allocation=immediate, mac_address=fa:16:3e:04:23:47, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:12:19Z, description=, dns_domain=, id=36d3914b-0866-44d3-8d61-e3876a797a40, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIMysqlTest-289637141-network, port_security_enabled=True, project_id=2a5eaeee8da34a54a7b5240b18a0f9b2, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=45868, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3753, status=ACTIVE, subnets=['99c9b187-4788-4ff8-bc75-80da88278afe'], tags=[], tenant_id=2a5eaeee8da34a54a7b5240b18a0f9b2, updated_at=2025-11-28T10:12:20Z, vlan_transparent=None, network_id=36d3914b-0866-44d3-8d61-e3876a797a40, port_security_enabled=False, project_id=2a5eaeee8da34a54a7b5240b18a0f9b2, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3766, status=DOWN, 
tags=[], tenant_id=2a5eaeee8da34a54a7b5240b18a0f9b2, updated_at=2025-11-28T10:12:23Z on network 36d3914b-0866-44d3-8d61-e3876a797a40#033[00m Nov 28 05:12:24 localhost systemd[1]: tmp-crun.CByurc.mount: Deactivated successfully. Nov 28 05:12:24 localhost dnsmasq[330342]: read /var/lib/neutron/dhcp/36d3914b-0866-44d3-8d61-e3876a797a40/addn_hosts - 1 addresses Nov 28 05:12:24 localhost dnsmasq-dhcp[330342]: read /var/lib/neutron/dhcp/36d3914b-0866-44d3-8d61-e3876a797a40/host Nov 28 05:12:24 localhost dnsmasq-dhcp[330342]: read /var/lib/neutron/dhcp/36d3914b-0866-44d3-8d61-e3876a797a40/opts Nov 28 05:12:24 localhost podman[330360]: 2025-11-28 10:12:24.406096906 +0000 UTC m=+0.039420565 container kill b902b2a7c8a241f32dcf0d24da2871381cabfe7d2ab173a419d9f02e9d5d8d2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-36d3914b-0866-44d3-8d61-e3876a797a40, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Nov 28 05:12:24 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:12:24.762 261084 INFO neutron.agent.dhcp.agent [None req-be4a8c81-5fee-47e9-a6d9-ea591aeb002c - - - - - -] DHCP configuration for ports {'92b58e02-a654-4139-b6b8-8d7bd52e9ded'} is completed#033[00m Nov 28 05:12:25 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:12:25.883 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:12:23Z, description=, device_id=a2e46648-c204-4aa4-852a-22559e830378, device_owner=network:router_interface, dns_assignment=[], dns_domain=, 
dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=92b58e02-a654-4139-b6b8-8d7bd52e9ded, ip_allocation=immediate, mac_address=fa:16:3e:04:23:47, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:12:19Z, description=, dns_domain=, id=36d3914b-0866-44d3-8d61-e3876a797a40, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIMysqlTest-289637141-network, port_security_enabled=True, project_id=2a5eaeee8da34a54a7b5240b18a0f9b2, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=45868, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3753, status=ACTIVE, subnets=['99c9b187-4788-4ff8-bc75-80da88278afe'], tags=[], tenant_id=2a5eaeee8da34a54a7b5240b18a0f9b2, updated_at=2025-11-28T10:12:20Z, vlan_transparent=None, network_id=36d3914b-0866-44d3-8d61-e3876a797a40, port_security_enabled=False, project_id=2a5eaeee8da34a54a7b5240b18a0f9b2, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3766, status=DOWN, tags=[], tenant_id=2a5eaeee8da34a54a7b5240b18a0f9b2, updated_at=2025-11-28T10:12:23Z on network 36d3914b-0866-44d3-8d61-e3876a797a40#033[00m Nov 28 05:12:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0) Nov 28 05:12:26 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 28 05:12:26 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Nov 28 05:12:26 localhost dnsmasq[330342]: read /var/lib/neutron/dhcp/36d3914b-0866-44d3-8d61-e3876a797a40/addn_hosts - 1 
addresses Nov 28 05:12:26 localhost podman[330398]: 2025-11-28 10:12:26.130443507 +0000 UTC m=+0.063252920 container kill b902b2a7c8a241f32dcf0d24da2871381cabfe7d2ab173a419d9f02e9d5d8d2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-36d3914b-0866-44d3-8d61-e3876a797a40, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:12:26 localhost dnsmasq-dhcp[330342]: read /var/lib/neutron/dhcp/36d3914b-0866-44d3-8d61-e3876a797a40/host Nov 28 05:12:26 localhost dnsmasq-dhcp[330342]: read /var/lib/neutron/dhcp/36d3914b-0866-44d3-8d61-e3876a797a40/opts Nov 28 05:12:26 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:12:26.461 261084 INFO neutron.agent.dhcp.agent [None req-84a10848-0264-4341-ba7d-7b4ab3b4b005 - - - - - -] DHCP configuration for ports {'92b58e02-a654-4139-b6b8-8d7bd52e9ded'} is completed#033[00m Nov 28 05:12:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:12:27 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 28 05:12:27 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 28 05:12:27 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 28 05:12:27 localhost ceph-mon[292954]: from='mgr.34481 ' 
entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Nov 28 05:12:27 localhost nova_compute[279673]: 2025-11-28 10:12:27.188 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:12:29 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:12:29 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:12:29 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:12:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. 
Nov 28 05:12:29 localhost podman[330420]: 2025-11-28 10:12:29.847434518 +0000 UTC m=+0.082615868 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Nov 28 05:12:29 localhost podman[330420]: 2025-11-28 10:12:29.866532866 +0000 UTC m=+0.101714246 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, 
distribution-scope=public, config_id=edpm, maintainer=Red Hat, Inc., container_name=openstack_network_exporter) Nov 28 05:12:29 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. Nov 28 05:12:30 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 28 05:12:30 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:12:30 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:12:30 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:12:31 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e286 do_prune osdmap full prune enabled Nov 28 05:12:31 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e287 e287: 6 total, 6 up, 6 in Nov 28 
05:12:31 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e287: 6 total, 6 up, 6 in Nov 28 05:12:31 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e287 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:12:31 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e287 do_prune osdmap full prune enabled Nov 28 05:12:31 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e288 e288: 6 total, 6 up, 6 in Nov 28 05:12:31 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e288: 6 total, 6 up, 6 in Nov 28 05:12:32 localhost nova_compute[279673]: 2025-11-28 10:12:32.191 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:12:32 localhost nova_compute[279673]: 2025-11-28 10:12:32.192 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:12:32 localhost nova_compute[279673]: 2025-11-28 10:12:32.193 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 28 05:12:32 localhost nova_compute[279673]: 2025-11-28 10:12:32.193 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:12:32 localhost nova_compute[279673]: 2025-11-28 10:12:32.194 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:12:32 localhost nova_compute[279673]: 2025-11-28 10:12:32.196 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:12:32 localhost systemd[1]: 
Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 05:12:32 localhost podman[330460]: 2025-11-28 10:12:32.569049499 +0000 UTC m=+0.096519125 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 28 05:12:32 localhost podman[330460]: 2025-11-28 10:12:32.602665225 +0000 UTC m=+0.130134871 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': 
'/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 28 05:12:32 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0) Nov 28 05:12:32 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 28 05:12:32 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 05:12:32 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Nov 28 05:12:33 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Nov 28 05:12:33 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:12:33 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 28 05:12:33 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 28 05:12:33 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 28 05:12:33 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Nov 28 05:12:33 localhost ceph-mon[292954]: from='mgr.34481 
172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 05:12:33 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:12:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 05:12:33 localhost podman[330551]: 2025-11-28 10:12:33.545076973 +0000 UTC m=+0.070441561 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:12:33 localhost podman[330551]: 2025-11-28 10:12:33.555460083 +0000 UTC m=+0.080824701 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 28 05:12:33 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 05:12:35 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e288 do_prune osdmap full prune enabled Nov 28 05:12:35 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e289 e289: 6 total, 6 up, 6 in Nov 28 05:12:35 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e289: 6 total, 6 up, 6 in Nov 28 05:12:35 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:12:35 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:12:35 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data 
namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:12:35 localhost ovn_metadata_agent[158125]: 2025-11-28 10:12:35.952 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:12:35 localhost ovn_metadata_agent[158125]: 2025-11-28 10:12:35.953 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 28 05:12:35 localhost nova_compute[279673]: 2025-11-28 10:12:35.980 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:12:36 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Nov 28 05:12:36 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:12:36 localhost dnsmasq[330342]: read /var/lib/neutron/dhcp/36d3914b-0866-44d3-8d61-e3876a797a40/addn_hosts - 0 addresses Nov 28 05:12:36 localhost dnsmasq-dhcp[330342]: read /var/lib/neutron/dhcp/36d3914b-0866-44d3-8d61-e3876a797a40/host Nov 28 05:12:36 localhost podman[330584]: 2025-11-28 10:12:36.174183321 +0000 UTC m=+0.063020542 container kill b902b2a7c8a241f32dcf0d24da2871381cabfe7d2ab173a419d9f02e9d5d8d2f 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-36d3914b-0866-44d3-8d61-e3876a797a40, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Nov 28 05:12:36 localhost dnsmasq-dhcp[330342]: read /var/lib/neutron/dhcp/36d3914b-0866-44d3-8d61-e3876a797a40/opts Nov 28 05:12:36 localhost nova_compute[279673]: 2025-11-28 10:12:36.373 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:12:36 localhost ovn_controller[152322]: 2025-11-28T10:12:36Z|00486|binding|INFO|Releasing lport df88dd7c-4397-4078-b7e1-7fbf48d503b7 from this chassis (sb_readonly=0) Nov 28 05:12:36 localhost kernel: device tapdf88dd7c-43 left promiscuous mode Nov 28 05:12:36 localhost ovn_controller[152322]: 2025-11-28T10:12:36Z|00487|binding|INFO|Setting lport df88dd7c-4397-4078-b7e1-7fbf48d503b7 down in Southbound Nov 28 05:12:36 localhost ovn_metadata_agent[158125]: 2025-11-28 10:12:36.385 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-36d3914b-0866-44d3-8d61-e3876a797a40', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 
'neutron-36d3914b-0866-44d3-8d61-e3876a797a40', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a5eaeee8da34a54a7b5240b18a0f9b2', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb560543-1290-4f49-91f1-518ecd990906, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=df88dd7c-4397-4078-b7e1-7fbf48d503b7) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:12:36 localhost ovn_metadata_agent[158125]: 2025-11-28 10:12:36.387 158130 INFO neutron.agent.ovn.metadata.agent [-] Port df88dd7c-4397-4078-b7e1-7fbf48d503b7 in datapath 36d3914b-0866-44d3-8d61-e3876a797a40 unbound from our chassis#033[00m Nov 28 05:12:36 localhost ovn_metadata_agent[158125]: 2025-11-28 10:12:36.389 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 36d3914b-0866-44d3-8d61-e3876a797a40, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:12:36 localhost ovn_metadata_agent[158125]: 2025-11-28 10:12:36.390 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[850aebea-eb9f-4304-b8d3-5791c71f9687]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:12:36 localhost nova_compute[279673]: 2025-11-28 10:12:36.393 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:12:36 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", 
"format": "json"} : dispatch Nov 28 05:12:36 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:12:36 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:12:36 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:12:36 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:12:36 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e289 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:12:37 localhost nova_compute[279673]: 2025-11-28 10:12:37.217 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:12:37 localhost ovn_controller[152322]: 2025-11-28T10:12:37Z|00488|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis 
(sb_readonly=0) Nov 28 05:12:37 localhost nova_compute[279673]: 2025-11-28 10:12:37.834 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:12:38 localhost dnsmasq[330342]: exiting on receipt of SIGTERM Nov 28 05:12:38 localhost podman[330623]: 2025-11-28 10:12:38.625488562 +0000 UTC m=+0.059685600 container kill b902b2a7c8a241f32dcf0d24da2871381cabfe7d2ab173a419d9f02e9d5d8d2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-36d3914b-0866-44d3-8d61-e3876a797a40, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true) Nov 28 05:12:38 localhost systemd[1]: libpod-b902b2a7c8a241f32dcf0d24da2871381cabfe7d2ab173a419d9f02e9d5d8d2f.scope: Deactivated successfully. Nov 28 05:12:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 05:12:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. 
Nov 28 05:12:38 localhost podman[330644]: 2025-11-28 10:12:38.737961508 +0000 UTC m=+0.084334710 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible) Nov 28 05:12:38 localhost podman[330637]: 2025-11-28 10:12:38.757716226 +0000 UTC m=+0.114550999 container died 
b902b2a7c8a241f32dcf0d24da2871381cabfe7d2ab173a419d9f02e9d5d8d2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-36d3914b-0866-44d3-8d61-e3876a797a40, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Nov 28 05:12:38 localhost podman[330637]: 2025-11-28 10:12:38.800706031 +0000 UTC m=+0.157540794 container cleanup b902b2a7c8a241f32dcf0d24da2871381cabfe7d2ab173a419d9f02e9d5d8d2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-36d3914b-0866-44d3-8d61-e3876a797a40, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true) Nov 28 05:12:38 localhost podman[330644]: 2025-11-28 10:12:38.803186948 +0000 UTC m=+0.149560170 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': 
['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS) Nov 28 05:12:38 localhost systemd[1]: libpod-conmon-b902b2a7c8a241f32dcf0d24da2871381cabfe7d2ab173a419d9f02e9d5d8d2f.scope: Deactivated successfully. Nov 28 05:12:38 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 05:12:38 localhost podman[330639]: 2025-11-28 10:12:38.893012326 +0000 UTC m=+0.242230606 container remove b902b2a7c8a241f32dcf0d24da2871381cabfe7d2ab173a419d9f02e9d5d8d2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-36d3914b-0866-44d3-8d61-e3876a797a40, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:12:38 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:12:38.928 261084 INFO neutron.agent.dhcp.agent [None req-53dd6a79-7cbd-4646-acaf-a1e9c3ae5257 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:12:38 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:12:38.928 261084 INFO neutron.agent.dhcp.agent [None req-53dd6a79-7cbd-4646-acaf-a1e9c3ae5257 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:12:38 localhost podman[330646]: 2025-11-28 10:12:38.949364502 +0000 UTC m=+0.290801922 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3) Nov 28 05:12:39 localhost podman[330646]: 2025-11-28 10:12:39.05964093 +0000 UTC m=+0.401078290 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 28 05:12:39 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. Nov 28 05:12:39 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0) Nov 28 05:12:39 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 28 05:12:39 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 28 05:12:39 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Nov 28 05:12:39 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.0 total, 600.0 interval#012Cumulative writes: 5604 writes, 37K keys, 5604 commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.05 MB/s#012Cumulative WAL: 5604 writes, 5604 syncs, 1.00 writes per sync, written: 0.06 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2520 writes, 11K keys, 2520 commit groups, 1.0 writes per commit group, ingest: 12.58 MB, 0.02 MB/s#012Interval WAL: 2520 writes, 2520 syncs, 1.00 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 0/0 0.00 KB 
0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 132.5 0.34 0.13 19 0.018 0 0 0.0 0.0#012 L6 1/0 17.85 MB 0.0 0.3 0.0 0.3 0.3 0.0 0.0 6.7 172.5 158.3 1.91 0.80 18 0.106 224K 9420 0.0 0.0#012 Sum 1/0 17.85 MB 0.0 0.3 0.0 0.3 0.3 0.1 0.0 7.7 146.5 154.4 2.26 0.93 37 0.061 224K 9420 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.1 0.0 0.1 0.1 0.0 0.0 12.0 159.6 162.6 0.82 0.37 14 0.058 96K 3788 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low 0/0 0.00 KB 0.0 0.3 0.0 0.3 0.3 0.0 0.0 0.0 172.5 158.3 1.91 0.80 18 0.106 224K 9420 0.0 0.0#012High 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 134.2 0.34 0.13 18 0.019 0 0 0.0 0.0#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.4 0.00 0.00 1 0.004 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.0 total, 600.0 interval#012Flush(GB): cumulative 0.044, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.34 GB write, 0.29 MB/s write, 0.32 GB read, 0.28 MB/s read, 2.3 seconds#012Interval compaction: 0.13 GB write, 0.22 MB/s write, 0.13 GB read, 0.22 MB/s read, 0.8 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55b5bb679350#2 capacity: 304.00 MB 
usage: 55.14 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.00081 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3716,53.69 MB,17.6605%) FilterBlock(37,645.73 KB,0.207434%) IndexBlock(37,843.55 KB,0.270979%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] ** Nov 28 05:12:39 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 28 05:12:39 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 28 05:12:39 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 28 05:12:39 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 28 05:12:39 localhost systemd[1]: var-lib-containers-storage-overlay-398491bd3154902fd56e5c19ffa0b65563dff195d8ba193f4ec0c429240248ab-merged.mount: Deactivated successfully. Nov 28 05:12:39 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b902b2a7c8a241f32dcf0d24da2871381cabfe7d2ab173a419d9f02e9d5d8d2f-userdata-shm.mount: Deactivated successfully. Nov 28 05:12:39 localhost systemd[1]: run-netns-qdhcp\x2d36d3914b\x2d0866\x2d44d3\x2d8d61\x2de3876a797a40.mount: Deactivated successfully. 
Nov 28 05:12:40 localhost podman[238687]: time="2025-11-28T10:12:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:12:40 localhost podman[238687]: @ - - [28/Nov/2025:10:12:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1" Nov 28 05:12:40 localhost podman[238687]: @ - - [28/Nov/2025:10:12:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19277 "" "Go-http-client/1.1" Nov 28 05:12:40 localhost ovn_metadata_agent[158125]: 2025-11-28 10:12:40.954 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:12:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e289 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:12:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e289 do_prune osdmap full prune enabled Nov 28 05:12:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 e290: 6 total, 6 up, 6 in Nov 28 05:12:41 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e290: 6 total, 6 up, 6 in Nov 28 05:12:42 localhost nova_compute[279673]: 2025-11-28 10:12:42.220 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:12:42 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r 
path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:12:42 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:12:42 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:12:42 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 28 05:12:42 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:12:42 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r 
path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:12:42 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:12:45 localhost ceph-osd[32506]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2. Nov 28 05:12:45 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0) Nov 28 05:12:45 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 28 05:12:45 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 28 05:12:46 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 28 05:12:46 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Nov 28 05:12:46 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Nov 28 05:12:46 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' 
cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Nov 28 05:12:46 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:12:47 localhost nova_compute[279673]: 2025-11-28 10:12:47.223 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:12:47 localhost nova_compute[279673]: 2025-11-28 10:12:47.253 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:12:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 05:12:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 05:12:47 localhost nova_compute[279673]: 2025-11-28 10:12:47.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:12:47 localhost systemd[1]: tmp-crun.PrVOQp.mount: Deactivated successfully. 
Nov 28 05:12:47 localhost podman[330708]: 2025-11-28 10:12:47.870244087 +0000 UTC m=+0.098133944 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 28 05:12:47 localhost podman[330708]: 2025-11-28 10:12:47.915774191 +0000 UTC m=+0.143664018 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Nov 28 05:12:47 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 05:12:47 localhost podman[330707]: 2025-11-28 10:12:47.919195256 +0000 UTC m=+0.150401175 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 05:12:47 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:12:47.977 261084 INFO neutron.agent.linux.ip_lib [None req-b40bfd1e-1a73-42c6-a455-63ef18baedf2 - - - - - -] Device tap1738dc88-e5 cannot be used as it has no MAC address#033[00m Nov 28 05:12:48 localhost podman[330707]: 2025-11-28 10:12:48.003626378 +0000 UTC m=+0.234832297 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 05:12:48 localhost nova_compute[279673]: 2025-11-28 10:12:48.050 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:12:48 localhost kernel: device tap1738dc88-e5 entered promiscuous mode Nov 28 05:12:48 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 05:12:48 localhost nova_compute[279673]: 2025-11-28 10:12:48.058 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:12:48 localhost ovn_controller[152322]: 2025-11-28T10:12:48Z|00489|binding|INFO|Claiming lport 1738dc88-e5f5-4680-ab9e-8f550f7bd83e for this chassis.
Nov 28 05:12:48 localhost ovn_controller[152322]: 2025-11-28T10:12:48Z|00490|binding|INFO|1738dc88-e5f5-4680-ab9e-8f550f7bd83e: Claiming unknown
Nov 28 05:12:48 localhost NetworkManager[5967]: [1764324768.0643] manager: (tap1738dc88-e5): new Generic device (/org/freedesktop/NetworkManager/Devices/77)
Nov 28 05:12:48 localhost systemd-udevd[330759]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 05:12:48 localhost openstack_network_exporter[240658]: ERROR 10:12:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 05:12:48 localhost openstack_network_exporter[240658]: ERROR 10:12:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Nov 28 05:12:48 localhost openstack_network_exporter[240658]: ERROR 10:12:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Nov 28 05:12:48 localhost journal[227875]: ethtool ioctl error on tap1738dc88-e5: No such device
Nov 28 05:12:48 localhost openstack_network_exporter[240658]: ERROR 10:12:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Nov 28 05:12:48 localhost openstack_network_exporter[240658]:
Nov 28 05:12:48 localhost openstack_network_exporter[240658]: ERROR 10:12:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Nov 28 05:12:48 localhost openstack_network_exporter[240658]:
Nov 28 05:12:48 localhost ovn_controller[152322]: 2025-11-28T10:12:48Z|00491|binding|INFO|Setting lport 1738dc88-e5f5-4680-ab9e-8f550f7bd83e ovn-installed in OVS
Nov 28 05:12:48 localhost journal[227875]: ethtool ioctl error on tap1738dc88-e5: No such device
Nov 28 05:12:48 localhost journal[227875]: ethtool ioctl error on tap1738dc88-e5: No such device
Nov 28 05:12:48 localhost nova_compute[279673]: 2025-11-28 10:12:48.102 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:12:48 localhost journal[227875]: ethtool ioctl error on tap1738dc88-e5: No such device
Nov 28 05:12:48 localhost journal[227875]: ethtool ioctl error on tap1738dc88-e5: No such device
Nov 28 05:12:48 localhost journal[227875]: ethtool ioctl error on tap1738dc88-e5: No such device
Nov 28 05:12:48 localhost journal[227875]: ethtool ioctl error on tap1738dc88-e5: No such device
Nov 28 05:12:48 localhost journal[227875]: ethtool ioctl error on tap1738dc88-e5: No such device
Nov 28 05:12:48 localhost nova_compute[279673]: 2025-11-28 10:12:48.137 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:12:48 localhost ovn_controller[152322]: 2025-11-28T10:12:48Z|00492|binding|INFO|Setting lport 1738dc88-e5f5-4680-ab9e-8f550f7bd83e up in Southbound
Nov 28 05:12:48 localhost ovn_metadata_agent[158125]: 2025-11-28 10:12:48.141 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-41370116-60b0-4433-ab19-12e9b7026582', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41370116-60b0-4433-ab19-12e9b7026582', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8ce454c94ec74465ac8200d5fe0b153e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c1903e82-d4b6-46fe-9759-cfad1c16d3d4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1738dc88-e5f5-4680-ab9e-8f550f7bd83e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 05:12:48 localhost ovn_metadata_agent[158125]: 2025-11-28 10:12:48.143 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 1738dc88-e5f5-4680-ab9e-8f550f7bd83e in datapath 41370116-60b0-4433-ab19-12e9b7026582 bound to our chassis
Nov 28 05:12:48 localhost ovn_metadata_agent[158125]: 2025-11-28 10:12:48.144 158130 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 41370116-60b0-4433-ab19-12e9b7026582 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Nov 28 05:12:48 localhost ovn_metadata_agent[158125]: 2025-11-28 10:12:48.145 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[931c7037-1d94-437e-9ad6-1dd4ed871bfe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 05:12:48 localhost nova_compute[279673]: 2025-11-28 10:12:48.169 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:12:48 localhost nova_compute[279673]: 2025-11-28 10:12:48.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 05:12:48 localhost nova_compute[279673]: 2025-11-28 10:12:48.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 05:12:48 localhost nova_compute[279673]: 2025-11-28 10:12:48.772 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 05:12:48 localhost systemd[1]: tmp-crun.ayvavn.mount: Deactivated successfully.
Nov 28 05:12:49 localhost podman[330830]:
Nov 28 05:12:49 localhost podman[330830]: 2025-11-28 10:12:49.404583814 +0000 UTC m=+0.089034824 container create fa43d95179908ce2af4d3ce439f4fed57d1261b3307a8d5bbb30a05083b23f79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41370116-60b0-4433-ab19-12e9b7026582, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 05:12:49 localhost podman[330830]: 2025-11-28 10:12:49.360556337 +0000 UTC m=+0.045007387 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Nov 28 05:12:49 localhost systemd[1]: Started libpod-conmon-fa43d95179908ce2af4d3ce439f4fed57d1261b3307a8d5bbb30a05083b23f79.scope.
Nov 28 05:12:49 localhost systemd[1]: tmp-crun.d64MIh.mount: Deactivated successfully.
Nov 28 05:12:49 localhost systemd[1]: Started libcrun container.
Nov 28 05:12:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f48131d4cd50d23f0a08edebbccc523a267ca2a1360891739e8c240e1e9f0859/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 05:12:49 localhost podman[330830]: 2025-11-28 10:12:49.533224898 +0000 UTC m=+0.217675918 container init fa43d95179908ce2af4d3ce439f4fed57d1261b3307a8d5bbb30a05083b23f79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41370116-60b0-4433-ab19-12e9b7026582, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 05:12:49 localhost podman[330830]: 2025-11-28 10:12:49.542216295 +0000 UTC m=+0.226667305 container start fa43d95179908ce2af4d3ce439f4fed57d1261b3307a8d5bbb30a05083b23f79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41370116-60b0-4433-ab19-12e9b7026582, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 28 05:12:49 localhost dnsmasq[330849]: started, version 2.85 cachesize 150
Nov 28 05:12:49 localhost dnsmasq[330849]: DNS service limited to local subnets
Nov 28 05:12:49 localhost dnsmasq[330849]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Nov 28 05:12:49 localhost dnsmasq[330849]: warning: no upstream servers configured
Nov 28 05:12:49 localhost dnsmasq-dhcp[330849]: DHCP, static leases only on 10.100.0.0, lease time 1d
Nov 28 05:12:49 localhost dnsmasq[330849]: read /var/lib/neutron/dhcp/41370116-60b0-4433-ab19-12e9b7026582/addn_hosts - 0 addresses
Nov 28 05:12:49 localhost dnsmasq-dhcp[330849]: read /var/lib/neutron/dhcp/41370116-60b0-4433-ab19-12e9b7026582/host
Nov 28 05:12:49 localhost dnsmasq-dhcp[330849]: read /var/lib/neutron/dhcp/41370116-60b0-4433-ab19-12e9b7026582/opts
Nov 28 05:12:49 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 05:12:49 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 05:12:49 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 05:12:49 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:12:49.738 261084 INFO neutron.agent.dhcp.agent [None req-c8fb3da4-a500-46e8-9b84-1d4e2d342699 - - - - - -] DHCP configuration for ports {'30dc80b5-018b-40de-ae8c-72305a0bb063'} is completed
Nov 28 05:12:50 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 05:12:50 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 05:12:50 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 05:12:50 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 05:12:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:12:50.849 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 05:12:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:12:50.850 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 05:12:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:12:50.851 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 05:12:51 localhost nova_compute[279673]: 2025-11-28 10:12:51.515 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:12:51 localhost nova_compute[279673]: 2025-11-28 10:12:51.769 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:12:51 localhost nova_compute[279673]: 2025-11-28 10:12:51.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 05:12:51 localhost nova_compute[279673]: 2025-11-28 10:12:51.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 05:12:51 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 05:12:52 localhost nova_compute[279673]: 2025-11-28 10:12:52.228 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 05:12:52 localhost nova_compute[279673]: 2025-11-28 10:12:52.767 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 05:12:52 localhost nova_compute[279673]: 2025-11-28 10:12:52.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 05:12:52 localhost nova_compute[279673]: 2025-11-28 10:12:52.814 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 05:12:52 localhost nova_compute[279673]: 2025-11-28 10:12:52.815 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 05:12:52 localhost nova_compute[279673]: 2025-11-28 10:12:52.815 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 05:12:52 localhost nova_compute[279673]: 2025-11-28 10:12:52.815 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 05:12:52 localhost nova_compute[279673]: 2025-11-28 10:12:52.816 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 05:12:53 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Nov 28 05:12:53 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 05:12:53 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 28 05:12:53 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 05:12:53 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2706457592' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 05:12:53 localhost nova_compute[279673]: 2025-11-28 10:12:53.268 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 05:12:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:12:53.294 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:12:52Z, description=, device_id=c6a3c308-ab33-40a1-9933-91a047698d13, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=5b58bb5a-3cf7-4593-ab3b-2d1f27a01a18, ip_allocation=immediate, mac_address=fa:16:3e:7c:e0:a5, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:12:46Z, description=, dns_domain=, id=41370116-60b0-4433-ab19-12e9b7026582, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIAdminTest-1537040395-network, port_security_enabled=True, project_id=8ce454c94ec74465ac8200d5fe0b153e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=25437, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3799, status=ACTIVE, subnets=['129cbc91-ddc3-4c68-80c6-e69ac70a7c43'], tags=[], tenant_id=8ce454c94ec74465ac8200d5fe0b153e, updated_at=2025-11-28T10:12:47Z, vlan_transparent=None, network_id=41370116-60b0-4433-ab19-12e9b7026582, port_security_enabled=False, project_id=8ce454c94ec74465ac8200d5fe0b153e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3807, status=DOWN, tags=[], tenant_id=8ce454c94ec74465ac8200d5fe0b153e, updated_at=2025-11-28T10:12:52Z on network 41370116-60b0-4433-ab19-12e9b7026582
Nov 28 05:12:53 localhost nova_compute[279673]: 2025-11-28 10:12:53.378 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 05:12:53 localhost nova_compute[279673]: 2025-11-28 10:12:53.379 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Nov 28 05:12:53 localhost dnsmasq[330849]: read /var/lib/neutron/dhcp/41370116-60b0-4433-ab19-12e9b7026582/addn_hosts - 1 addresses
Nov 28 05:12:53 localhost dnsmasq-dhcp[330849]: read /var/lib/neutron/dhcp/41370116-60b0-4433-ab19-12e9b7026582/host
Nov 28 05:12:53 localhost podman[330889]: 2025-11-28 10:12:53.568442903 +0000 UTC m=+0.077952992 container kill fa43d95179908ce2af4d3ce439f4fed57d1261b3307a8d5bbb30a05083b23f79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41370116-60b0-4433-ab19-12e9b7026582, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 05:12:53 localhost dnsmasq-dhcp[330849]: read /var/lib/neutron/dhcp/41370116-60b0-4433-ab19-12e9b7026582/opts
Nov 28 05:12:53 localhost nova_compute[279673]: 2025-11-28 10:12:53.616 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 05:12:53 localhost nova_compute[279673]: 2025-11-28 10:12:53.618 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11010MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 05:12:53 localhost nova_compute[279673]: 2025-11-28 10:12:53.619 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 05:12:53 localhost nova_compute[279673]: 2025-11-28 10:12:53.620 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 05:12:53 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 05:12:53 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 05:12:53 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Nov 28 05:12:53 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Nov 28 05:12:53 localhost nova_compute[279673]: 2025-11-28 10:12:53.701 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 05:12:53 localhost nova_compute[279673]: 2025-11-28 10:12:53.702 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 05:12:53 localhost nova_compute[279673]: 2025-11-28 10:12:53.702 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 05:12:53 localhost nova_compute[279673]: 2025-11-28 10:12:53.740 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 05:12:53 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:12:53.907 261084 INFO neutron.agent.dhcp.agent [None req-c295737c-9342-4b7d-80a8-56a817df0419 - - - - - -] DHCP configuration for ports {'5b58bb5a-3cf7-4593-ab3b-2d1f27a01a18'} is completed
Nov 28 05:12:54 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 28 05:12:54 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2576893307' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Nov 28 05:12:54 localhost nova_compute[279673]: 2025-11-28 10:12:54.196 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 05:12:54 localhost nova_compute[279673]: 2025-11-28 10:12:54.203 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 05:12:54 localhost nova_compute[279673]: 2025-11-28 10:12:54.235 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 05:12:54 localhost nova_compute[279673]: 2025-11-28 10:12:54.238 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 05:12:54 localhost nova_compute[279673]: 2025-11-28 10:12:54.238 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.618s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 05:12:55 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:12:55.315 261084 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-11-28T10:12:52Z, description=, device_id=c6a3c308-ab33-40a1-9933-91a047698d13, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=5b58bb5a-3cf7-4593-ab3b-2d1f27a01a18, ip_allocation=immediate, mac_address=fa:16:3e:7c:e0:a5, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-11-28T10:12:46Z, description=, dns_domain=, id=41370116-60b0-4433-ab19-12e9b7026582, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIAdminTest-1537040395-network, port_security_enabled=True, project_id=8ce454c94ec74465ac8200d5fe0b153e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=25437, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3799, status=ACTIVE, subnets=['129cbc91-ddc3-4c68-80c6-e69ac70a7c43'], tags=[], tenant_id=8ce454c94ec74465ac8200d5fe0b153e, updated_at=2025-11-28T10:12:47Z, vlan_transparent=None, network_id=41370116-60b0-4433-ab19-12e9b7026582, port_security_enabled=False, project_id=8ce454c94ec74465ac8200d5fe0b153e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3807, status=DOWN, tags=[], tenant_id=8ce454c94ec74465ac8200d5fe0b153e, updated_at=2025-11-28T10:12:52Z on network 41370116-60b0-4433-ab19-12e9b7026582
Nov 28 05:12:55 localhost systemd[1]: tmp-crun.dN7Ub2.mount: Deactivated successfully.
Nov 28 05:12:55 localhost dnsmasq[330849]: read /var/lib/neutron/dhcp/41370116-60b0-4433-ab19-12e9b7026582/addn_hosts - 1 addresses
Nov 28 05:12:55 localhost dnsmasq-dhcp[330849]: read /var/lib/neutron/dhcp/41370116-60b0-4433-ab19-12e9b7026582/host
Nov 28 05:12:55 localhost dnsmasq-dhcp[330849]: read /var/lib/neutron/dhcp/41370116-60b0-4433-ab19-12e9b7026582/opts
Nov 28 05:12:55 localhost podman[330950]: 2025-11-28 10:12:55.631256604 +0000 UTC m=+0.069428300 container kill fa43d95179908ce2af4d3ce439f4fed57d1261b3307a8d5bbb30a05083b23f79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41370116-60b0-4433-ab19-12e9b7026582, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 05:12:55 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:12:55.862 261084 INFO neutron.agent.dhcp.agent [None req-41c17d07-4157-484d-bdc1-1de8e4697e2c - - - - - -] DHCP configuration for ports {'5b58bb5a-3cf7-4593-ab3b-2d1f27a01a18'} is completed
Nov 28 05:12:56 localhost nova_compute[279673]: 2025-11-28 10:12:56.239 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 05:12:56 localhost nova_compute[279673]: 2025-11-28 10:12:56.240 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 05:12:56 localhost nova_compute[279673]: 2025-11-28 10:12:56.240 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 05:12:56 localhost nova_compute[279673]: 2025-11-28 10:12:56.374 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 05:12:56 localhost nova_compute[279673]: 2025-11-28 10:12:56.376 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 05:12:56 localhost nova_compute[279673]: 2025-11-28 10:12:56.377 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 05:12:56 localhost nova_compute[279673]: 2025-11-28 10:12:56.377 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 05:12:56 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0)
Nov 28 05:12:56 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 05:12:56 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 05:12:56 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Nov 28 05:12:56 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 05:12:56 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch
Nov 28 05:12:56 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished
Nov 28 05:12:56 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 05:12:57 localhost nova_compute[279673]: 2025-11-28 10:12:57.226 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details":
{"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 05:12:57 localhost nova_compute[279673]: 2025-11-28 10:12:57.230 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:12:57 localhost nova_compute[279673]: 2025-11-28 10:12:57.231 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:12:57 localhost nova_compute[279673]: 2025-11-28 10:12:57.231 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 28 05:12:57 localhost nova_compute[279673]: 2025-11-28 10:12:57.232 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:12:57 localhost nova_compute[279673]: 2025-11-28 10:12:57.233 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:12:57 localhost nova_compute[279673]: 2025-11-28 10:12:57.236 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:12:57 localhost nova_compute[279673]: 2025-11-28 10:12:57.341 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock 
"refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 05:12:57 localhost nova_compute[279673]: 2025-11-28 10:12:57.342 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 28 05:12:57 localhost nova_compute[279673]: 2025-11-28 10:12:57.342 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:13:00 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0) Nov 28 05:13:00 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 28 05:13:00 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Nov 28 05:13:00 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 28 05:13:00 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Nov 28 05:13:00 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Nov 28 05:13:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 05:13:00 localhost podman[330970]: 2025-11-28 10:13:00.84815166 +0000 UTC m=+0.083582767 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, architecture=x86_64, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, managed_by=edpm_ansible, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container) Nov 28 05:13:00 localhost podman[330970]: 2025-11-28 10:13:00.860763838 +0000 UTC m=+0.096194975 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, config_id=edpm, name=ubi9-minimal, distribution-scope=public, maintainer=Red Hat, Inc., architecture=x86_64) Nov 28 05:13:00 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 05:13:01 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Nov 28 05:13:01 localhost ovn_controller[152322]: 2025-11-28T10:13:01Z|00493|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:13:01 localhost nova_compute[279673]: 2025-11-28 10:13:01.151 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:13:01 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:13:02 localhost nova_compute[279673]: 2025-11-28 10:13:02.256 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:13:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. 
Nov 28 05:13:02 localhost podman[330990]: 2025-11-28 10:13:02.846289107 +0000 UTC m=+0.079936573 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 28 05:13:02 localhost podman[330990]: 2025-11-28 10:13:02.858393451 +0000 UTC m=+0.092040897 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 05:13:02 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 05:13:03 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:13:03 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:13:03 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:13:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 05:13:03 localhost systemd[1]: tmp-crun.GbSYPS.mount: Deactivated successfully. 
Nov 28 05:13:03 localhost podman[331013]: 2025-11-28 10:13:03.840981306 +0000 UTC m=+0.076237520 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3) Nov 28 05:13:03 localhost podman[331013]: 2025-11-28 10:13:03.846645511 +0000 UTC 
m=+0.081901684 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:13:03 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 05:13:04 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 28 05:13:04 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:13:04 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:13:04 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:13:04 localhost nova_compute[279673]: 2025-11-28 10:13:04.406 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:13:06 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:13:07 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 
handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0) Nov 28 05:13:07 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 28 05:13:07 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Nov 28 05:13:07 localhost nova_compute[279673]: 2025-11-28 10:13:07.294 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:13:07 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 28 05:13:07 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 28 05:13:07 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 28 05:13:07 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Nov 28 05:13:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 05:13:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. 
Nov 28 05:13:09 localhost podman[331031]: 2025-11-28 10:13:09.868322335 +0000 UTC m=+0.100242300 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Nov 28 05:13:09 localhost podman[331031]: 2025-11-28 10:13:09.883547893 +0000 UTC m=+0.115467848 container exec_died 
1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:13:09 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 05:13:09 localhost podman[331032]: 2025-11-28 10:13:09.961694851 +0000 UTC m=+0.188314683 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, container_name=ovn_controller) Nov 28 05:13:10 localhost podman[331032]: 2025-11-28 10:13:10.026677213 +0000 UTC m=+0.253297055 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true) Nov 28 05:13:10 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 05:13:10 localhost podman[238687]: time="2025-11-28T10:13:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:13:10 localhost podman[238687]: @ - - [28/Nov/2025:10:13:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157512 "" "Go-http-client/1.1" Nov 28 05:13:10 localhost podman[238687]: @ - - [28/Nov/2025:10:13:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19763 "" "Go-http-client/1.1" Nov 28 05:13:11 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:13:11 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:13:11 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:13:11 localhost ceph-mon[292954]: from='mgr.34481 
172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 28 05:13:11 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:13:11 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:13:11 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow r pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:13:11 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:13:12 localhost nova_compute[279673]: 2025-11-28 10:13:12.296 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:13:16 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 
full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:13:17 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0) Nov 28 05:13:17 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 28 05:13:17 localhost nova_compute[279673]: 2025-11-28 10:13:17.301 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:13:17 localhost nova_compute[279673]: 2025-11-28 10:13:17.303 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:13:17 localhost nova_compute[279673]: 2025-11-28 10:13:17.303 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 28 05:13:17 localhost nova_compute[279673]: 2025-11-28 10:13:17.303 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:13:17 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Nov 28 05:13:17 localhost nova_compute[279673]: 2025-11-28 10:13:17.340 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:13:17 localhost nova_compute[279673]: 2025-11-28 10:13:17.341 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:13:17 localhost nova_compute[279673]: 2025-11-28 10:13:17.344 
279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:13:17 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Nov 28 05:13:17 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 28 05:13:17 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Nov 28 05:13:17 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Nov 28 05:13:18 localhost openstack_network_exporter[240658]: ERROR 10:13:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:13:18 localhost openstack_network_exporter[240658]: ERROR 10:13:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:13:18 localhost openstack_network_exporter[240658]: ERROR 10:13:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 05:13:18 localhost openstack_network_exporter[240658]: ERROR 10:13:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 05:13:18 localhost openstack_network_exporter[240658]: Nov 28 05:13:18 localhost openstack_network_exporter[240658]: ERROR 10:13:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 05:13:18 localhost openstack_network_exporter[240658]: Nov 28 05:13:18 localhost nova_compute[279673]: 2025-11-28 10:13:18.753 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:13:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 05:13:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 05:13:18 localhost podman[331076]: 2025-11-28 10:13:18.863671411 +0000 UTC m=+0.083114682 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', 
'/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:13:18 localhost podman[331076]: 2025-11-28 10:13:18.90224865 +0000 UTC m=+0.121691901 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Nov 28 05:13:18 localhost podman[331075]: 2025-11-28 10:13:18.919421939 +0000 UTC m=+0.142722178 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 05:13:18 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 05:13:18 localhost podman[331075]: 2025-11-28 10:13:18.958463632 +0000 UTC m=+0.181763861 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 05:13:18 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 05:13:20 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:13:20 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:13:20 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:13:20 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Nov 28 05:13:20 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:13:21 
localhost dnsmasq[330849]: read /var/lib/neutron/dhcp/41370116-60b0-4433-ab19-12e9b7026582/addn_hosts - 0 addresses Nov 28 05:13:21 localhost dnsmasq-dhcp[330849]: read /var/lib/neutron/dhcp/41370116-60b0-4433-ab19-12e9b7026582/host Nov 28 05:13:21 localhost dnsmasq-dhcp[330849]: read /var/lib/neutron/dhcp/41370116-60b0-4433-ab19-12e9b7026582/opts Nov 28 05:13:21 localhost podman[331133]: 2025-11-28 10:13:21.380694947 +0000 UTC m=+0.070767452 container kill fa43d95179908ce2af4d3ce439f4fed57d1261b3307a8d5bbb30a05083b23f79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41370116-60b0-4433-ab19-12e9b7026582, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Nov 28 05:13:21 localhost nova_compute[279673]: 2025-11-28 10:13:21.574 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:13:21 localhost ovn_controller[152322]: 2025-11-28T10:13:21Z|00494|binding|INFO|Releasing lport 1738dc88-e5f5-4680-ab9e-8f550f7bd83e from this chassis (sb_readonly=0) Nov 28 05:13:21 localhost kernel: device tap1738dc88-e5 left promiscuous mode Nov 28 05:13:21 localhost ovn_controller[152322]: 2025-11-28T10:13:21Z|00495|binding|INFO|Setting lport 1738dc88-e5f5-4680-ab9e-8f550f7bd83e down in Southbound Nov 28 05:13:21 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data 
namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:13:21 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:13:21 localhost ovn_metadata_agent[158125]: 2025-11-28 10:13:21.585 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005538513.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcp8d11bc4e-20dd-5622-818e-763c4e88824a-41370116-60b0-4433-ab19-12e9b7026582', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41370116-60b0-4433-ab19-12e9b7026582', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8ce454c94ec74465ac8200d5fe0b153e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005538513.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c1903e82-d4b6-46fe-9759-cfad1c16d3d4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1738dc88-e5f5-4680-ab9e-8f550f7bd83e) old=Port_Binding(up=[True], chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:13:21 localhost ovn_metadata_agent[158125]: 2025-11-28 10:13:21.593 158130 INFO neutron.agent.ovn.metadata.agent [-] Port 1738dc88-e5f5-4680-ab9e-8f550f7bd83e in datapath 41370116-60b0-4433-ab19-12e9b7026582 unbound from our chassis#033[00m Nov 28 05:13:21 localhost nova_compute[279673]: 2025-11-28 10:13:21.596 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:13:21 localhost ovn_metadata_agent[158125]: 2025-11-28 10:13:21.596 158130 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 41370116-60b0-4433-ab19-12e9b7026582, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Nov 28 05:13:21 localhost ovn_metadata_agent[158125]: 2025-11-28 10:13:21.599 158233 DEBUG oslo.privsep.daemon [-] privsep: reply[b4807ed4-84b9-456a-bcc6-3671c6802d38]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Nov 28 05:13:21 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:13:22 localhost nova_compute[279673]: 2025-11-28 10:13:22.343 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:13:23 localhost ovn_controller[152322]: 2025-11-28T10:13:23Z|00496|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:13:23 localhost nova_compute[279673]: 2025-11-28 10:13:23.367 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:13:23 localhost dnsmasq[330849]: exiting on receipt of 
SIGTERM Nov 28 05:13:23 localhost podman[331172]: 2025-11-28 10:13:23.742094046 +0000 UTC m=+0.062686102 container kill fa43d95179908ce2af4d3ce439f4fed57d1261b3307a8d5bbb30a05083b23f79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41370116-60b0-4433-ab19-12e9b7026582, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Nov 28 05:13:23 localhost systemd[1]: libpod-fa43d95179908ce2af4d3ce439f4fed57d1261b3307a8d5bbb30a05083b23f79.scope: Deactivated successfully. Nov 28 05:13:23 localhost podman[331185]: 2025-11-28 10:13:23.817943413 +0000 UTC m=+0.061119833 container died fa43d95179908ce2af4d3ce439f4fed57d1261b3307a8d5bbb30a05083b23f79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41370116-60b0-4433-ab19-12e9b7026582, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125) Nov 28 05:13:23 localhost systemd[1]: tmp-crun.qG84Du.mount: Deactivated successfully. 
Nov 28 05:13:23 localhost podman[331185]: 2025-11-28 10:13:23.861335721 +0000 UTC m=+0.104512091 container cleanup fa43d95179908ce2af4d3ce439f4fed57d1261b3307a8d5bbb30a05083b23f79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41370116-60b0-4433-ab19-12e9b7026582, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125) Nov 28 05:13:23 localhost systemd[1]: libpod-conmon-fa43d95179908ce2af4d3ce439f4fed57d1261b3307a8d5bbb30a05083b23f79.scope: Deactivated successfully. Nov 28 05:13:23 localhost podman[331193]: 2025-11-28 10:13:23.909095982 +0000 UTC m=+0.137699204 container remove fa43d95179908ce2af4d3ce439f4fed57d1261b3307a8d5bbb30a05083b23f79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41370116-60b0-4433-ab19-12e9b7026582, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 28 05:13:23 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:13:23.939 261084 INFO neutron.agent.dhcp.agent [None req-05e642a3-4c7a-488c-a34c-e73794abd662 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:13:24 localhost neutron_dhcp_agent[261080]: 2025-11-28 10:13:24.140 261084 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Nov 28 05:13:24 localhost systemd[1]: 
var-lib-containers-storage-overlay-f48131d4cd50d23f0a08edebbccc523a267ca2a1360891739e8c240e1e9f0859-merged.mount: Deactivated successfully. Nov 28 05:13:24 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fa43d95179908ce2af4d3ce439f4fed57d1261b3307a8d5bbb30a05083b23f79-userdata-shm.mount: Deactivated successfully. Nov 28 05:13:24 localhost systemd[1]: run-netns-qdhcp\x2d41370116\x2d60b0\x2d4433\x2dab19\x2d12e9b7026582.mount: Deactivated successfully. Nov 28 05:13:26 localhost ovn_controller[152322]: 2025-11-28T10:13:26Z|00497|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:13:26 localhost nova_compute[279673]: 2025-11-28 10:13:26.549 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:13:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:13:27 localhost nova_compute[279673]: 2025-11-28 10:13:27.345 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:13:28 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97,allow rw path=/volumes/_nogroup/1f8e6c04-5771-4f46-846b-71f913803117/99d1667a-36eb-45db-8990-56f5fc443d2e", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50,allow rw pool=manila_data namespace=fsvolumens_1f8e6c04-5771-4f46-846b-71f913803117"]} v 0) Nov 28 05:13:28 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth caps", 
"entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97,allow rw path=/volumes/_nogroup/1f8e6c04-5771-4f46-846b-71f913803117/99d1667a-36eb-45db-8990-56f5fc443d2e", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50,allow rw pool=manila_data namespace=fsvolumens_1f8e6c04-5771-4f46-846b-71f913803117"]} : dispatch Nov 28 05:13:28 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97,allow rw path=/volumes/_nogroup/1f8e6c04-5771-4f46-846b-71f913803117/99d1667a-36eb-45db-8990-56f5fc443d2e", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50,allow rw pool=manila_data namespace=fsvolumens_1f8e6c04-5771-4f46-846b-71f913803117"]}]': finished Nov 28 05:13:29 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Nov 28 05:13:29 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97,allow rw path=/volumes/_nogroup/1f8e6c04-5771-4f46-846b-71f913803117/99d1667a-36eb-45db-8990-56f5fc443d2e", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50,allow rw pool=manila_data namespace=fsvolumens_1f8e6c04-5771-4f46-846b-71f913803117"]} : dispatch Nov 28 05:13:29 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' 
cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97,allow rw path=/volumes/_nogroup/1f8e6c04-5771-4f46-846b-71f913803117/99d1667a-36eb-45db-8990-56f5fc443d2e", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50,allow rw pool=manila_data namespace=fsvolumens_1f8e6c04-5771-4f46-846b-71f913803117"]} : dispatch Nov 28 05:13:29 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97,allow rw path=/volumes/_nogroup/1f8e6c04-5771-4f46-846b-71f913803117/99d1667a-36eb-45db-8990-56f5fc443d2e", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50,allow rw pool=manila_data namespace=fsvolumens_1f8e6c04-5771-4f46-846b-71f913803117"]}]': finished Nov 28 05:13:29 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Nov 28 05:13:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. 
Nov 28 05:13:31 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:13:31 localhost podman[331215]: 2025-11-28 10:13:31.856468979 +0000 UTC m=+0.087715693 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., release=1755695350, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, name=ubi9-minimal) Nov 28 05:13:31 localhost podman[331215]: 2025-11-28 10:13:31.874408301 +0000 UTC m=+0.105655025 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, version=9.6, config_id=edpm, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible) Nov 28 05:13:31 localhost 
systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. Nov 28 05:13:32 localhost nova_compute[279673]: 2025-11-28 10:13:32.349 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:13:32 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50"]} v 0) Nov 28 05:13:32 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50"]} : dispatch Nov 28 05:13:32 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50"]}]': finished Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0. 
Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.672403) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67 Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324812672426, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 2361, "num_deletes": 259, "total_data_size": 2545114, "memory_usage": 2724472, "flush_reason": "Manual Compaction"} Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324812685606, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 2498832, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36663, "largest_seqno": 39023, "table_properties": {"data_size": 2488792, "index_size": 6097, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2885, "raw_key_size": 25107, "raw_average_key_size": 22, "raw_value_size": 2467168, "raw_average_value_size": 2175, "num_data_blocks": 263, "num_entries": 1134, "num_filter_entries": 1134, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324696, "oldest_key_time": 1764324696, "file_creation_time": 1764324812, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}} Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 13275 microseconds, and 4723 cpu microseconds. Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.685666) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 2498832 bytes OK Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.685695) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.689053) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.689074) EVENT_LOG_v1 {"time_micros": 1764324812689068, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.689093) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 2534388, prev total WAL file 
size 2534388, number of live WAL files 2. Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.689859) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132383031' seq:72057594037927935, type:22 .. '7061786F73003133303533' seq:0, type:0; will stop at (end) Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(2440KB)], [66(17MB)] Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324812689917, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 21214215, "oldest_snapshot_seqno": -1} Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 14366 keys, 19606922 bytes, temperature: kUnknown Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324812794364, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 19606922, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19522694, "index_size": 47199, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35973, "raw_key_size": 383926, "raw_average_key_size": 26, "raw_value_size": 
19276866, "raw_average_value_size": 1341, "num_data_blocks": 1771, "num_entries": 14366, "num_filter_entries": 14366, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764324812, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}} Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.794752) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 19606922 bytes Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.796466) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 202.9 rd, 187.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.4, 17.8 +0.0 blob) out(18.7 +0.0 blob), read-write-amplify(16.3) write-amplify(7.8) OK, records in: 14907, records dropped: 541 output_compression: NoCompression Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.796496) EVENT_LOG_v1 {"time_micros": 1764324812796482, "job": 40, "event": "compaction_finished", "compaction_time_micros": 104558, "compaction_time_cpu_micros": 55838, "output_level": 6, "num_output_files": 1, "total_output_size": 19606922, "num_input_records": 14907, "num_output_records": 14366, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324812797044, "job": 40, "event": "table_file_deletion", "file_number": 68} Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324812799616, 
"job": 40, "event": "table_file_deletion", "file_number": 66} Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.689748) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.799723) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.799731) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.799734) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.799737) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:13:32 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:13:32.799740) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:13:32 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50"]} : dispatch Nov 28 05:13:32 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Nov 28 05:13:32 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw 
path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50"]} : dispatch Nov 28 05:13:32 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50/21395b47-9479-472b-88c4-193165317a97", "osd", "allow rw pool=manila_data namespace=fsvolumens_3592a7eb-5b6e-4ca3-b6c3-e13b81ee2f50"]}]': finished Nov 28 05:13:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 05:13:33 localhost podman[331253]: 2025-11-28 10:13:33.823152147 +0000 UTC m=+0.088384105 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 28 05:13:33 localhost podman[331253]: 2025-11-28 10:13:33.837482089 +0000 UTC m=+0.102714077 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 05:13:33 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 05:13:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 05:13:34 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain.devices.0}] v 0) Nov 28 05:13:34 localhost systemd[1]: tmp-crun.ilbtCv.mount: Deactivated successfully. 
Nov 28 05:13:34 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:13:34 localhost podman[331308]: 2025-11-28 10:13:34.261988179 +0000 UTC m=+0.087595831 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:13:34 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538513.localdomain}] v 0) Nov 28 05:13:34 localhost podman[331308]: 2025-11-28 10:13:34.270363017 +0000 UTC m=+0.095970729 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true) Nov 28 05:13:34 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:13:34 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 05:13:34 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain.devices.0}] v 0) Nov 28 05:13:34 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:13:34 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538514.localdomain}] v 0) Nov 28 05:13:34 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:13:34 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain.devices.0}] v 0) Nov 28 05:13:34 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:13:34 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005538515.localdomain}] v 0) Nov 28 05:13:34 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:13:35 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:13:35 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:13:35 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:13:35 localhost ceph-mon[292954]: from='mgr.34481 ' 
entity='mgr.np0005538515.yfkzhl' Nov 28 05:13:35 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:13:35 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:13:35 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Nov 28 05:13:35 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:13:35 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.bob"} v 0) Nov 28 05:13:35 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch Nov 28 05:13:35 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.bob"}]': finished Nov 28 05:13:36 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Nov 28 05:13:36 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:13:36 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 05:13:36 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:13:36 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch Nov 28 05:13:36 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Nov 28 05:13:36 localhost 
ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch Nov 28 05:13:36 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.bob"}]': finished Nov 28 05:13:36 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:13:36 localhost nova_compute[279673]: 2025-11-28 10:13:36.548 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:13:36 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:13:37 localhost nova_compute[279673]: 2025-11-28 10:13:37.352 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:13:37 localhost ovn_metadata_agent[158125]: 2025-11-28 10:13:37.635 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:13:37 localhost ovn_metadata_agent[158125]: 2025-11-28 10:13:37.636 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 28 05:13:37 
localhost nova_compute[279673]: 2025-11-28 10:13:37.680 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:13:38 localhost ovn_metadata_agent[158125]: 2025-11-28 10:13:38.638 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:13:40 localhost podman[238687]: time="2025-11-28T10:13:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:13:40 localhost podman[238687]: @ - - [28/Nov/2025:10:13:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1" Nov 28 05:13:40 localhost podman[238687]: @ - - [28/Nov/2025:10:13:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19275 "" "Go-http-client/1.1" Nov 28 05:13:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 05:13:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. 
Nov 28 05:13:40 localhost podman[331417]: 2025-11-28 10:13:40.854262143 +0000 UTC m=+0.092570433 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:13:40 localhost podman[331418]: 2025-11-28 10:13:40.903391846 +0000 UTC m=+0.137961541 container health_status 
9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.build-date=20251125, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Nov 28 05:13:40 localhost podman[331417]: 2025-11-28 10:13:40.918469391 +0000 UTC m=+0.156777721 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, 
config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125) Nov 28 05:13:40 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 05:13:40 localhost podman[331418]: 2025-11-28 10:13:40.976481658 +0000 UTC m=+0.211051343 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:13:40 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 05:13:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:13:42 localhost nova_compute[279673]: 2025-11-28 10:13:42.355 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:13:42 localhost nova_compute[279673]: 2025-11-28 10:13:42.504 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:13:46 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:13:47 localhost ovn_controller[152322]: 2025-11-28T10:13:47Z|00498|binding|INFO|Releasing lport 3ff57c88-06c6-4894-984a-80ce116d1456 from this chassis (sb_readonly=0) Nov 28 05:13:47 localhost nova_compute[279673]: 2025-11-28 10:13:47.117 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:13:47 localhost nova_compute[279673]: 2025-11-28 10:13:47.358 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:13:47 localhost nova_compute[279673]: 2025-11-28 10:13:47.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:13:48 localhost openstack_network_exporter[240658]: ERROR 10:13:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 05:13:48 localhost openstack_network_exporter[240658]: ERROR 10:13:48 appctl.go:144: Failed 
to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:13:48 localhost openstack_network_exporter[240658]: ERROR 10:13:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:13:48 localhost openstack_network_exporter[240658]: ERROR 10:13:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 05:13:48 localhost openstack_network_exporter[240658]: Nov 28 05:13:48 localhost openstack_network_exporter[240658]: ERROR 10:13:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 05:13:48 localhost openstack_network_exporter[240658]: Nov 28 05:13:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 05:13:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 05:13:49 localhost podman[331465]: 2025-11-28 10:13:49.882996242 +0000 UTC m=+0.115356775 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125) Nov 28 05:13:49 localhost podman[331465]: 2025-11-28 10:13:49.923483489 +0000 UTC m=+0.155844042 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd) Nov 28 05:13:49 localhost systemd[1]: tmp-crun.TJpQ4m.mount: Deactivated successfully. Nov 28 05:13:49 localhost podman[331464]: 2025-11-28 10:13:49.939112271 +0000 UTC m=+0.173705743 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 
'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 05:13:49 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. Nov 28 05:13:49 localhost podman[331464]: 2025-11-28 10:13:49.973411538 +0000 UTC m=+0.208004990 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 05:13:49 localhost systemd[1]: 
49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. Nov 28 05:13:50 localhost nova_compute[279673]: 2025-11-28 10:13:50.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:13:50 localhost nova_compute[279673]: 2025-11-28 10:13:50.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:13:50 localhost nova_compute[279673]: 2025-11-28 10:13:50.772 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 28 05:13:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:13:50.850 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:13:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:13:50.851 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:13:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:13:50.851 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:13:51 localhost nova_compute[279673]: 2025-11-28 10:13:51.772 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:13:51 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:13:52 localhost nova_compute[279673]: 2025-11-28 10:13:52.361 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:13:52 localhost nova_compute[279673]: 2025-11-28 10:13:52.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:13:52 localhost nova_compute[279673]: 2025-11-28 10:13:52.893 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:13:52 localhost nova_compute[279673]: 2025-11-28 10:13:52.894 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:13:52 localhost nova_compute[279673]: 2025-11-28 10:13:52.894 279685 DEBUG oslo_concurrency.lockutils [None 
req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:13:52 localhost nova_compute[279673]: 2025-11-28 10:13:52.895 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 05:13:52 localhost nova_compute[279673]: 2025-11-28 10:13:52.895 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:13:53 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:13:53 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/3360059162' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:13:53 localhost nova_compute[279673]: 2025-11-28 10:13:53.375 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:13:53 localhost nova_compute[279673]: 2025-11-28 10:13:53.507 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 05:13:53 localhost nova_compute[279673]: 2025-11-28 10:13:53.508 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 05:13:53 localhost nova_compute[279673]: 2025-11-28 10:13:53.741 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 05:13:53 localhost nova_compute[279673]: 2025-11-28 10:13:53.743 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=11004MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": 
"7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 05:13:53 localhost nova_compute[279673]: 2025-11-28 10:13:53.744 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:13:53 localhost nova_compute[279673]: 2025-11-28 10:13:53.744 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:13:53 localhost nova_compute[279673]: 2025-11-28 10:13:53.809 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 05:13:53 localhost nova_compute[279673]: 2025-11-28 10:13:53.810 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 05:13:53 localhost nova_compute[279673]: 2025-11-28 10:13:53.810 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 05:13:53 localhost nova_compute[279673]: 2025-11-28 10:13:53.841 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:13:54 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:13:54 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/2441468294' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:13:54 localhost nova_compute[279673]: 2025-11-28 10:13:54.314 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:13:54 localhost nova_compute[279673]: 2025-11-28 10:13:54.319 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 05:13:54 localhost nova_compute[279673]: 2025-11-28 10:13:54.459 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 05:13:54 localhost nova_compute[279673]: 2025-11-28 10:13:54.462 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 05:13:54 localhost nova_compute[279673]: 2025-11-28 10:13:54.462 279685 DEBUG 
oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.718s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:13:55 localhost nova_compute[279673]: 2025-11-28 10:13:55.458 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:13:55 localhost nova_compute[279673]: 2025-11-28 10:13:55.459 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:13:55 localhost nova_compute[279673]: 2025-11-28 10:13:55.459 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:13:55 localhost nova_compute[279673]: 2025-11-28 10:13:55.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:13:55 localhost nova_compute[279673]: 2025-11-28 10:13:55.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 28 05:13:55 localhost nova_compute[279673]: 2025-11-28 10:13:55.772 279685 DEBUG 
nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 05:13:56 localhost nova_compute[279673]: 2025-11-28 10:13:56.189 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 05:13:56 localhost nova_compute[279673]: 2025-11-28 10:13:56.189 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 05:13:56 localhost nova_compute[279673]: 2025-11-28 10:13:56.190 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 28 05:13:56 localhost nova_compute[279673]: 2025-11-28 10:13:56.190 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 05:13:56 localhost nova_compute[279673]: 2025-11-28 10:13:56.589 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private",
"subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 05:13:56 localhost nova_compute[279673]: 2025-11-28 10:13:56.609 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 05:13:56 localhost nova_compute[279673]: 2025-11-28 10:13:56.609 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 28 05:13:56 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Nov 28 05:13:57 localhost nova_compute[279673]: 2025-11-28 10:13:57.364 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-]
[POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.677 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.677 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.678 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.682 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '2d868a28-b4de-498a-8330-bd01da89c264', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:14:00.678540', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'f57a4cd8-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.850460591, 'message_signature': '0e9a66ed60666ebaf463052fa4504659386944ba1e65b876c2943beeb9fd3049'}]}, 'timestamp': '2025-11-28 10:14:00.682969', '_unique_id': '946597200112445c85504c9b03a2796a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]:
2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:14:00 localhost
ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:14:00
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.684 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:14:00 localhost
ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.685 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.704 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 51.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dcb010ef-e19f-488e-bd2a-34ff4b0a5bf8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.671875, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:14:00.685940', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'f57dc110-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.876722411,
'message_signature': 'dc8a7112a8a0d9344ae9363e88d07d63f9445151167f97a97e468037086db9b9'}]}, 'timestamp': '2025-11-28 10:14:00.705544', '_unique_id': 'c5fe267d912345e0a97fb2f51af99957'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:14:00 localhost
ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py",
line 653, in _send
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:14:00 localhost
ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.706 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.707 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.707 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 19480000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'ef96c593-ef44-4bfb-b761-7da0285c98c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 19480000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:14:00.707843', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'f57e34ce-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.876722411, 'message_signature': 'e8abcbf78fee2041a703fc228c5c8601872cf45d9fe16a07135f972aa47a2ffe'}]}, 'timestamp': '2025-11-28 10:14:00.708552', '_unique_id': 'e4b4cf76f4134883bf345453ecb2a92b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in
_reraise_as_library_errors Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 
ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 
05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:14:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.709 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.711 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify 
/usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.711 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.741 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1439344424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.741 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 63315481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1ee8c098-8428-4ef1-98cc-ad85a1927530', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1439344424, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:14:00.711907', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f5834a4a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.88382609, 'message_signature': 'df459dfa72d4ee6510592829158712171d7bc183b306600b5d308864c68cf2d0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63315481, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:14:00.711907', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f5835a94-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.88382609, 'message_signature': 'f8eed7b15760973bc9fdaceeba11be1733395805a67793569d75bf220029f93c'}]}, 'timestamp': '2025-11-28 10:14:00.742214', '_unique_id': 'b529ff80d4774d528e162e819294c3b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:14:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:14:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.743 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.744 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.744 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.744 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:14:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3c0c199-94fd-467f-8cd5-15697eee34c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:14:00.744459', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f583c538-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.88382609, 'message_signature': '4960ce9d0bde6ea33ca11c76502c36567dd206dc949cd08a97256621dd70a4e4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': 
'2025-11-28T10:14:00.744459', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f583d708-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.88382609, 'message_signature': '2dd80774e285e25a823cabf8b39fd95ec14d60df7e09bfd272caf3628e26b782'}]}, 'timestamp': '2025-11-28 10:14:00.745342', '_unique_id': 'a16028973ab64946ae5ab6361f3eff8e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.746 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.747 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.747 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.747 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 
_stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a5178e3-9e3e-41c9-b5fa-77fc2c918466', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:14:00.747452', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f58439aa-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.88382609, 'message_signature': '3e3b18363832cb2dcc32acef006134de0c57968da36a978a7373e551228e967f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': 
'9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:14:00.747452', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f5844a58-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.88382609, 'message_signature': 'a1a95370a42b972887813a30e03d4e995ec8805ec0ccc9231b4e2da4be62895a'}]}, 'timestamp': '2025-11-28 10:14:00.748292', '_unique_id': 'bae1250ddee3415a9139fed83b1f8395'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:14:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:14:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:14:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.749 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.750 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.750 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.750 12 DEBUG 
ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cde4e7c7-c2aa-4500-a861-30e005d177ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:14:00.750359', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f584ac28-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.88382609, 'message_signature': 'b6de245ec2c1e9a3a1f6a27e085966367ef5fec1afeab7ebe4db50d0ad3bdf5b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 
'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:14:00.750359', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f584bbaa-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.88382609, 'message_signature': '8d2609cb5b0b57c0bff55bc4c168a3a959819d2e1dabd3fb54fdc495fd552d9b'}]}, 'timestamp': '2025-11-28 10:14:00.751223', '_unique_id': 'c654cee9c45d4755bb576398b3b7a16b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:14:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging 
self.transport._send_notification(target, ctxt, message, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:14:00.752 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR 
oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.752 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.753 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.753 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 
05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '64fd8631-5e19-4aeb-9352-c309f26b16d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:14:00.753270', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'f5851d0c-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.850460591, 'message_signature': '40735ca1ac35e761c263355040fa23096262f8eb4bbbdf8401a4670f1a6cedb0'}]}, 'timestamp': '2025-11-28 10:14:00.753713', '_unique_id': '9bda825ea1d443c3860453da8da30387'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:14:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging conn = 
self.transport.establish_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: 
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:14:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 
05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.754 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.755 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.755 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73ddfabd-c2a6-4105-85ab-e42afdf5a2e3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:14:00.755791', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'f5858152-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.850460591, 'message_signature': '12a5a2c6bc181f52b91e118db65d480d3b0586abd5fd39092492f9232c66658e'}]}, 'timestamp': '2025-11-28 10:14:00.756301', '_unique_id': '4bba39267bb447a798e520e5116ca556'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:14:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:14:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:14:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.757 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.758 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.758 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e7b788b5-4031-4b4e-a8f8-31cdf42d7293', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:14:00.758503', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'f585ea3e-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.850460591, 'message_signature': 'a5b2fbdfeebecb3e5bebf4e90fc4d076749ba6e3a9e0eef8e17762ce381013d7'}]}, 'timestamp': '2025-11-28 10:14:00.758967', '_unique_id': '5d00f9acb34b4589b50acb4edfbb1d50'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:14:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.759 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.760 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.761 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eaa95f89-a1bb-4f85-8a36-3cf3872371f4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:14:00.760993', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'f5864bdc-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.850460591, 'message_signature': '886e8b00e874328799d7b8b509f751c77eb32e039466b24fb1ae23bb3eb6acae'}]}, 'timestamp': '2025-11-28 10:14:00.761463', '_unique_id': '785626421b6e42d494ceda490f07bfde'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.762 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.763 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.763 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 1124743927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.763 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 22218411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'acfb2f54-dc0f-43e2-8c02-40343f70b959', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1124743927, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:14:00.763499', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f586ac4e-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.88382609, 'message_signature': '33be5fb1b45144f6528e2447108167a385830d7f8ea9c4ecf0af090d956300d9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22218411, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:14:00.763499', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f586bd2e-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.88382609, 'message_signature': '0d2c040b1ef393fc8b35d8c8da5ab99839ce239ef9022c50cb1dc5fe017e274e'}]}, 'timestamp': '2025-11-28 10:14:00.764339', '_unique_id': '93acf28e807f49d796f5fa71dd05cb6f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.765 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.766 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.766 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57499fb7-7a3c-43b8-86c9-3158e5ced404', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:14:00.766384', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'f5871ef4-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.850460591, 'message_signature': '8b29bb967bbb3a95f6a42471df4199b54c3fbc0775639748ade3611d2e9ff807'}]}, 'timestamp': '2025-11-28 10:14:00.766872', '_unique_id': '57bbc126314f44919b156bd4133fbecd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:14:00 localhost
ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.767 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.768 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.779 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.780 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23dae271-5a5a-4254-b98e-a0e527836da2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:14:00.768871', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f5892870-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.940764365, 'message_signature': 'd9701a2cf6e20e5218867b33b6424adfad31e39c8f01c0dd29dfebe528552a14'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:14:00.768871', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f5893b1c-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.940764365, 'message_signature': 'a4d6aca46b55527bdfed4b898eae3814cfb54de4c7cd3c4014879d6526ffa889'}]}, 'timestamp': '2025-11-28 10:14:00.780676', '_unique_id': '63242bc9266e400f8a45e9cc1a65f369'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.781 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.782 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.782 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd0f41099-8891-4996-a3f3-33dc114138f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:14:00.782876', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'f589a372-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.850460591, 'message_signature': '8f3763223d28785812c9c2c9a9dc8f0b0b96042ea351d2c07fe5d0601f7f0331'}]}, 'timestamp': '2025-11-28 10:14:00.783371', '_unique_id': '2423626ab29a4db4b2747c50e53331e9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.784 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.785 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.785 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.785 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '4d568b5c-c0b4-45a5-876c-0e05d2255955', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:14:00.785385', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f58a038a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.940764365, 'message_signature': 'c95c6e8131729527a1a9f8a5a9fc24eb1ee6bea3f4c86c77bb9ddc556c13e913'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:14:00.785385', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 
'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f58a1320-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.940764365, 'message_signature': '474212aaafd6b616a1b82196d6e8615e1ff6889c831c949a6b5350dd878a672c'}]}, 'timestamp': '2025-11-28 10:14:00.786228', '_unique_id': '4f3e46ef535643e3915f9e2f658fc0eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:14:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:14:00.787 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:14:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:14:00.787 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.787 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.788 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.788 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'aff9107c-d3cf-43e0-b851-146423c5049f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:14:00.788306', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'f58a759a-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.850460591, 'message_signature': '19ce452baebe69fe7718333fe163db4002b1284eac99d386cac8e2be294c4c54'}]}, 'timestamp': '2025-11-28 10:14:00.788755', '_unique_id': 'e09cbc05208f40398702bc3f06a16c51'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:14:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:14:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.789 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.790 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.790 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.790 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.791 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'be1494b7-444d-497c-ac0e-70b6342ff71f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:14:00.790920', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f58add78-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.88382609, 'message_signature': '631f6fa29642b3b956e9a82f8b9613d7e01befc0e4d09509f0950f18a1bcffef'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:14:00.790920', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f58aed68-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.88382609, 'message_signature': '7b6de110ab609e0dd9d3b2870cf3a44ed0f25f807ca9ed188670defc17e55a3a'}]}, 'timestamp': '2025-11-28 10:14:00.791786', '_unique_id': 'c382b1f5c7884190a2cfa3884cafc2c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:14:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:14:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.792 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.793 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.794 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4b8e0390-9842-4ed9-9a0c-743f007d8cd4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:14:00.794101', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'f58b5848-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.850460591, 'message_signature': 'f709d8b0a41685d2f2249685b9b051a0ecd71a5564a4965a37179001955e33ca'}]}, 'timestamp': '2025-11-28 10:14:00.794575', '_unique_id': '0f96516ee9b643689d1e5fae3c8728eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:14:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:14:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.795 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.796 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.796 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2e3558c-9f41-4755-9f8a-901833a3082a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:14:00.796590', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 'f58bb914-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.850460591, 'message_signature': 'bb693e1c9aa8a036838fcf2885c179c3dcbb9bf64d1884f7fe57e1b9d6d43692'}]}, 'timestamp': '2025-11-28 10:14:00.797061', '_unique_id': '198a2a6f893f41afaa2be52c61c716a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:14:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 
05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.797 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.799 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.799 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.799 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.799 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '262a87b5-9948-4ca8-8ff6-b742ec38acc0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:14:00.799285', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f58c2232-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.940764365, 'message_signature': '90d09fd1a461ff47df31fa875283422a3a7740a4b7ab4c002a58866890ab78cc'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:14:00.799285', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 
'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f58c31dc-cc42-11f0-a370-fa163eb02593', 'monotonic_time': 12674.940764365, 'message_signature': '6ff5ace67115f77e36d3a1775e09907d6d54827fda0150a07193f3ccf16c1f92'}]}, 'timestamp': '2025-11-28 10:14:00.800122', '_unique_id': '3fb142321a1e4e63ba4818e2b2be853f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:14:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:14:00.800 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:14:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:14:00.800 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:14:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:14:00.800 12 ERROR oslo_messaging.notify.messaging Nov 28 05:14:01 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:14:02 localhost nova_compute[279673]: 2025-11-28 10:14:02.366 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:14:02 localhost nova_compute[279673]: 2025-11-28 10:14:02.604 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:14:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 05:14:02 localhost systemd[1]: tmp-crun.K9V3gP.mount: Deactivated successfully. Nov 28 05:14:02 localhost podman[331551]: 2025-11-28 10:14:02.852560746 +0000 UTC m=+0.087278100 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, release=1755695350) Nov 28 05:14:02 localhost podman[331551]: 2025-11-28 10:14:02.868820788 +0000 UTC m=+0.103538182 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vcs-type=git, vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, distribution-scope=public, config_id=edpm, architecture=x86_64, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, container_name=openstack_network_exporter) Nov 28 05:14:02 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 05:14:03 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/8d719993-3b66-454f-a026-687de7e6b3e4/a35af86e-1bef-43ca-805f-c714f40e8411", "osd", "allow rw pool=manila_data namespace=fsvolumens_8d719993-3b66-454f-a026-687de7e6b3e4", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:14:03 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/8d719993-3b66-454f-a026-687de7e6b3e4/a35af86e-1bef-43ca-805f-c714f40e8411", "osd", "allow rw pool=manila_data namespace=fsvolumens_8d719993-3b66-454f-a026-687de7e6b3e4", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:14:03 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/8d719993-3b66-454f-a026-687de7e6b3e4/a35af86e-1bef-43ca-805f-c714f40e8411", "osd", "allow rw pool=manila_data namespace=fsvolumens_8d719993-3b66-454f-a026-687de7e6b3e4", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:14:03 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch Nov 28 05:14:03 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/8d719993-3b66-454f-a026-687de7e6b3e4/a35af86e-1bef-43ca-805f-c714f40e8411", "osd", "allow rw pool=manila_data namespace=fsvolumens_8d719993-3b66-454f-a026-687de7e6b3e4", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:14:03 localhost ceph-mon[292954]: 
from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/8d719993-3b66-454f-a026-687de7e6b3e4/a35af86e-1bef-43ca-805f-c714f40e8411", "osd", "allow rw pool=manila_data namespace=fsvolumens_8d719993-3b66-454f-a026-687de7e6b3e4", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:14:03 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/8d719993-3b66-454f-a026-687de7e6b3e4/a35af86e-1bef-43ca-805f-c714f40e8411", "osd", "allow rw pool=manila_data namespace=fsvolumens_8d719993-3b66-454f-a026-687de7e6b3e4", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:14:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 05:14:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. 
Nov 28 05:14:04 localhost podman[331573]: 2025-11-28 10:14:04.851626694 +0000 UTC m=+0.084039481 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:14:04 localhost podman[331573]: 2025-11-28 10:14:04.856892106 +0000 UTC 
m=+0.089304873 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:14:04 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 05:14:04 localhost systemd[1]: tmp-crun.vmWrMA.mount: Deactivated successfully. Nov 28 05:14:04 localhost podman[331572]: 2025-11-28 10:14:04.952152851 +0000 UTC m=+0.188994055 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 28 05:14:04 localhost podman[331572]: 2025-11-28 10:14:04.964389858 +0000 UTC m=+0.201231032 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 
'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 05:14:04 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 05:14:06 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:14:07 localhost nova_compute[279673]: 2025-11-28 10:14:07.368 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:14:10 localhost podman[238687]: time="2025-11-28T10:14:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:14:10 localhost podman[238687]: @ - - [28/Nov/2025:10:14:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1" Nov 28 05:14:10 localhost podman[238687]: @ - - [28/Nov/2025:10:14:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19283 "" "Go-http-client/1.1" Nov 28 05:14:11 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch Nov 28 05:14:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 05:14:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. 
Nov 28 05:14:11 localhost podman[331612]: 2025-11-28 10:14:11.874879967 +0000 UTC m=+0.108391871 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0) Nov 28 05:14:11 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 
343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:14:11 localhost podman[331612]: 2025-11-28 10:14:11.915537431 +0000 UTC m=+0.149049305 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0) Nov 28 05:14:11 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: 
Deactivated successfully. Nov 28 05:14:11 localhost podman[331613]: 2025-11-28 10:14:11.932870094 +0000 UTC m=+0.162352663 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251125) Nov 28 05:14:11 localhost podman[331613]: 2025-11-28 10:14:11.995887236 +0000 UTC m=+0.225369805 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125) Nov 28 05:14:12 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 05:14:12 localhost nova_compute[279673]: 2025-11-28 10:14:12.372 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:14:13 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-241168775", "caps": ["mds", "allow rw path=/volumes/_nogroup/d56bb3f2-efa0-4328-9320-c5298bccaeb7/3fd14072-fd4e-434b-b433-15cdcb82070a", "osd", "allow rw pool=manila_data namespace=fsvolumens_d56bb3f2-efa0-4328-9320-c5298bccaeb7", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:14:13 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-241168775", "caps": ["mds", "allow rw path=/volumes/_nogroup/d56bb3f2-efa0-4328-9320-c5298bccaeb7/3fd14072-fd4e-434b-b433-15cdcb82070a", "osd", "allow rw pool=manila_data namespace=fsvolumens_d56bb3f2-efa0-4328-9320-c5298bccaeb7", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:14:13 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-241168775", "caps": ["mds", "allow rw path=/volumes/_nogroup/d56bb3f2-efa0-4328-9320-c5298bccaeb7/3fd14072-fd4e-434b-b433-15cdcb82070a", "osd", "allow rw pool=manila_data namespace=fsvolumens_d56bb3f2-efa0-4328-9320-c5298bccaeb7", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:14:13 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-241168775", "format": "json"} : dispatch Nov 28 05:14:13 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": 
"client.tempest-cephx-id-241168775", "caps": ["mds", "allow rw path=/volumes/_nogroup/d56bb3f2-efa0-4328-9320-c5298bccaeb7/3fd14072-fd4e-434b-b433-15cdcb82070a", "osd", "allow rw pool=manila_data namespace=fsvolumens_d56bb3f2-efa0-4328-9320-c5298bccaeb7", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:14:13 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-241168775", "caps": ["mds", "allow rw path=/volumes/_nogroup/d56bb3f2-efa0-4328-9320-c5298bccaeb7/3fd14072-fd4e-434b-b433-15cdcb82070a", "osd", "allow rw pool=manila_data namespace=fsvolumens_d56bb3f2-efa0-4328-9320-c5298bccaeb7", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:14:13 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-241168775", "caps": ["mds", "allow rw path=/volumes/_nogroup/d56bb3f2-efa0-4328-9320-c5298bccaeb7/3fd14072-fd4e-434b-b433-15cdcb82070a", "osd", "allow rw pool=manila_data namespace=fsvolumens_d56bb3f2-efa0-4328-9320-c5298bccaeb7", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:14:16 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:14:17 localhost nova_compute[279673]: 2025-11-28 10:14:17.374 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:14:18 localhost openstack_network_exporter[240658]: ERROR 10:14:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:14:18 localhost openstack_network_exporter[240658]: ERROR 10:14:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:14:18 localhost 
openstack_network_exporter[240658]: ERROR 10:14:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 05:14:18 localhost openstack_network_exporter[240658]: ERROR 10:14:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 05:14:18 localhost openstack_network_exporter[240658]: Nov 28 05:14:18 localhost openstack_network_exporter[240658]: ERROR 10:14:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 05:14:18 localhost openstack_network_exporter[240658]: Nov 28 05:14:18 localhost ovn_controller[152322]: 2025-11-28T10:14:18Z|00499|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory Nov 28 05:14:20 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-241168775"} v 0) Nov 28 05:14:20 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-241168775"} : dispatch Nov 28 05:14:20 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-241168775"}]': finished Nov 28 05:14:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 05:14:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. 
Nov 28 05:14:20 localhost podman[331656]: 2025-11-28 10:14:20.851091598 +0000 UTC m=+0.084347879 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 05:14:20 localhost podman[331656]: 2025-11-28 10:14:20.884166418 +0000 UTC m=+0.117422679 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 05:14:20 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 05:14:20 localhost podman[331657]: 2025-11-28 10:14:20.904207365 +0000 UTC m=+0.135791805 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd) Nov 28 05:14:20 localhost podman[331657]: 2025-11-28 10:14:20.9397401 +0000 UTC m=+0.171324500 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 28 05:14:20 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 05:14:21 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-241168775"} : dispatch Nov 28 05:14:21 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-241168775", "format": "json"} : dispatch Nov 28 05:14:21 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-241168775"} : dispatch Nov 28 05:14:21 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-241168775"}]': finished Nov 28 05:14:21 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:14:22 localhost nova_compute[279673]: 2025-11-28 10:14:22.378 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:14:23 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.Joe"} v 0) Nov 28 05:14:23 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch Nov 28 05:14:23 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.Joe"}]': finished Nov 28 05:14:23 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch Nov 28 05:14:23 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' 
cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch Nov 28 05:14:23 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch Nov 28 05:14:23 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.Joe"}]': finished Nov 28 05:14:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:14:27 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin", "format": "json"} : dispatch Nov 28 05:14:27 localhost nova_compute[279673]: 2025-11-28 10:14:27.380 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:14:30 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/4c25470d-c14c-4093-b430-b79c735aaf06/471f0b39-a16d-4e49-a0e3-afb597cde17a", "osd", "allow rw pool=manila_data namespace=fsvolumens_4c25470d-c14c-4093-b430-b79c735aaf06", "mon", "allow r"], "format": "json"} v 0) Nov 28 05:14:30 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/4c25470d-c14c-4093-b430-b79c735aaf06/471f0b39-a16d-4e49-a0e3-afb597cde17a", "osd", "allow rw pool=manila_data namespace=fsvolumens_4c25470d-c14c-4093-b430-b79c735aaf06", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:14:30 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' 
entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/4c25470d-c14c-4093-b430-b79c735aaf06/471f0b39-a16d-4e49-a0e3-afb597cde17a", "osd", "allow rw pool=manila_data namespace=fsvolumens_4c25470d-c14c-4093-b430-b79c735aaf06", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:14:31 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch Nov 28 05:14:31 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/4c25470d-c14c-4093-b430-b79c735aaf06/471f0b39-a16d-4e49-a0e3-afb597cde17a", "osd", "allow rw pool=manila_data namespace=fsvolumens_4c25470d-c14c-4093-b430-b79c735aaf06", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:14:31 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/4c25470d-c14c-4093-b430-b79c735aaf06/471f0b39-a16d-4e49-a0e3-afb597cde17a", "osd", "allow rw pool=manila_data namespace=fsvolumens_4c25470d-c14c-4093-b430-b79c735aaf06", "mon", "allow r"], "format": "json"} : dispatch Nov 28 05:14:31 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/4c25470d-c14c-4093-b430-b79c735aaf06/471f0b39-a16d-4e49-a0e3-afb597cde17a", "osd", "allow rw pool=manila_data namespace=fsvolumens_4c25470d-c14c-4093-b430-b79c735aaf06", "mon", "allow r"], "format": "json"}]': finished Nov 28 05:14:31 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 
inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:14:32 localhost nova_compute[279673]: 2025-11-28 10:14:32.384 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:14:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 05:14:34 localhost podman[331698]: 2025-11-28 10:14:34.188771923 +0000 UTC m=+0.072389392 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped 
down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_id=edpm, name=ubi9-minimal, io.openshift.expose-services=, version=9.6) Nov 28 05:14:34 localhost podman[331698]: 2025-11-28 10:14:34.202374152 +0000 UTC m=+0.085991611 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9, distribution-scope=public, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Nov 28 05:14:34 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 05:14:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 05:14:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 05:14:35 localhost podman[331720]: 2025-11-28 10:14:35.499952894 +0000 UTC m=+0.089781567 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible) Nov 28 05:14:35 localhost podman[331720]: 2025-11-28 10:14:35.532003362 +0000 UTC m=+0.121832045 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent) Nov 28 05:14:35 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 05:14:35 localhost podman[331719]: 2025-11-28 10:14:35.556356592 +0000 UTC m=+0.148979581 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 05:14:35 localhost podman[331719]: 2025-11-28 10:14:35.564226675 +0000 UTC m=+0.156849664 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 
'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 05:14:35 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 05:14:36 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Nov 28 05:14:36 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:14:36 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:14:37 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 05:14:37 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:14:37 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch Nov 28 05:14:37 localhost nova_compute[279673]: 2025-11-28 10:14:37.386 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:14:40 localhost podman[238687]: time="2025-11-28T10:14:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:14:40 localhost podman[238687]: @ - - 
[28/Nov/2025:10:14:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1" Nov 28 05:14:40 localhost podman[238687]: @ - - [28/Nov/2025:10:14:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19276 "" "Go-http-client/1.1" Nov 28 05:14:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Nov 28 05:14:41 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:14:41 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:14:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:14:42 localhost nova_compute[279673]: 2025-11-28 10:14:42.390 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:14:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 05:14:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. 
Nov 28 05:14:42 localhost podman[331846]: 2025-11-28 10:14:42.850853632 +0000 UTC m=+0.079573752 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller) Nov 28 05:14:42 localhost podman[331846]: 2025-11-28 10:14:42.893895979 +0000 UTC m=+0.122616109 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS) Nov 28 05:14:42 localhost podman[331845]: 2025-11-28 10:14:42.911220973 +0000 UTC m=+0.141997587 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=edpm, io.buildah.version=1.41.3) Nov 28 05:14:42 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. Nov 28 05:14:42 localhost podman[331845]: 2025-11-28 10:14:42.95365962 +0000 UTC m=+0.184436294 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', 
'/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:14:42 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. Nov 28 05:14:43 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.david"} v 0) Nov 28 05:14:43 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch Nov 28 05:14:43 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.david"}]': finished Nov 28 05:14:43 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch Nov 28 05:14:43 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch Nov 28 05:14:43 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth rm", 
"entity": "client.david"} : dispatch Nov 28 05:14:43 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' cmd='[{"prefix": "auth rm", "entity": "client.david"}]': finished Nov 28 05:14:46 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:14:47 localhost nova_compute[279673]: 2025-11-28 10:14:47.393 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:14:48 localhost openstack_network_exporter[240658]: ERROR 10:14:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:14:48 localhost openstack_network_exporter[240658]: ERROR 10:14:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:14:48 localhost openstack_network_exporter[240658]: ERROR 10:14:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 05:14:48 localhost openstack_network_exporter[240658]: ERROR 10:14:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 05:14:48 localhost openstack_network_exporter[240658]: Nov 28 05:14:48 localhost openstack_network_exporter[240658]: ERROR 10:14:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 05:14:48 localhost openstack_network_exporter[240658]: Nov 28 05:14:49 localhost nova_compute[279673]: 2025-11-28 10:14:49.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:14:50 localhost nova_compute[279673]: 2025-11-28 10:14:50.771 279685 DEBUG oslo_service.periodic_task 
[None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:14:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:14:50.852 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:14:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:14:50.852 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:14:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:14:50.853 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:14:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 05:14:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. 
Nov 28 05:14:51 localhost nova_compute[279673]: 2025-11-28 10:14:51.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:14:51 localhost nova_compute[279673]: 2025-11-28 10:14:51.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:14:51 localhost nova_compute[279673]: 2025-11-28 10:14:51.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 28 05:14:51 localhost podman[331887]: 2025-11-28 10:14:51.849623377 +0000 UTC m=+0.081455741 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd) Nov 28 05:14:51 localhost podman[331887]: 2025-11-28 10:14:51.864475114 +0000 UTC m=+0.096307458 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:14:51 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. Nov 28 05:14:51 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:14:51 localhost podman[331886]: 2025-11-28 10:14:51.945325236 +0000 UTC m=+0.179456261 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', 
'--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 05:14:51 localhost podman[331886]: 2025-11-28 10:14:51.978619092 +0000 UTC m=+0.212750117 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, 
config_id=edpm, container_name=node_exporter) Nov 28 05:14:51 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. Nov 28 05:14:52 localhost nova_compute[279673]: 2025-11-28 10:14:52.397 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:14:53 localhost nova_compute[279673]: 2025-11-28 10:14:53.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:14:53 localhost nova_compute[279673]: 2025-11-28 10:14:53.792 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:14:53 localhost nova_compute[279673]: 2025-11-28 10:14:53.793 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:14:53 localhost nova_compute[279673]: 2025-11-28 10:14:53.793 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:14:53 localhost nova_compute[279673]: 2025-11-28 10:14:53.793 279685 DEBUG nova.compute.resource_tracker [None 
req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 05:14:53 localhost nova_compute[279673]: 2025-11-28 10:14:53.794 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:14:54 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:14:54 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2621744934' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:14:54 localhost nova_compute[279673]: 2025-11-28 10:14:54.242 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:14:54 localhost nova_compute[279673]: 2025-11-28 10:14:54.300 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 05:14:54 localhost nova_compute[279673]: 2025-11-28 10:14:54.301 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 
05:14:54 localhost nova_compute[279673]: 2025-11-28 10:14:54.510 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 05:14:54 localhost nova_compute[279673]: 2025-11-28 10:14:54.511 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=10996MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 05:14:54 localhost nova_compute[279673]: 2025-11-28 10:14:54.511 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:14:54 localhost nova_compute[279673]: 2025-11-28 10:14:54.511 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:14:54 localhost nova_compute[279673]: 2025-11-28 10:14:54.584 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 05:14:54 localhost nova_compute[279673]: 2025-11-28 10:14:54.584 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 05:14:54 localhost nova_compute[279673]: 2025-11-28 10:14:54.585 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 05:14:54 localhost nova_compute[279673]: 2025-11-28 10:14:54.635 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:14:55 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:14:55 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/3985923543' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:14:55 localhost nova_compute[279673]: 2025-11-28 10:14:55.110 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:14:55 localhost nova_compute[279673]: 2025-11-28 10:14:55.117 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 05:14:55 localhost nova_compute[279673]: 2025-11-28 10:14:55.141 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 05:14:55 localhost nova_compute[279673]: 2025-11-28 10:14:55.144 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 05:14:55 localhost nova_compute[279673]: 2025-11-28 10:14:55.144 279685 DEBUG 
oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.633s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:14:56 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:14:57 localhost nova_compute[279673]: 2025-11-28 10:14:57.141 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:14:57 localhost nova_compute[279673]: 2025-11-28 10:14:57.142 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:14:57 localhost nova_compute[279673]: 2025-11-28 10:14:57.142 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:14:57 localhost nova_compute[279673]: 2025-11-28 10:14:57.399 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:14:57 localhost nova_compute[279673]: 2025-11-28 10:14:57.400 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:14:57 localhost nova_compute[279673]: 2025-11-28 10:14:57.400 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog 
[-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 28 05:14:57 localhost nova_compute[279673]: 2025-11-28 10:14:57.401 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:14:57 localhost nova_compute[279673]: 2025-11-28 10:14:57.402 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:14:57 localhost nova_compute[279673]: 2025-11-28 10:14:57.405 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:14:57 localhost nova_compute[279673]: 2025-11-28 10:14:57.772 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:14:57 localhost nova_compute[279673]: 2025-11-28 10:14:57.773 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 28 05:14:57 localhost nova_compute[279673]: 2025-11-28 10:14:57.773 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 28 05:14:58 localhost nova_compute[279673]: 2025-11-28 10:14:58.255 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 05:14:58 localhost nova_compute[279673]: 2025-11-28 10:14:58.256 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 05:14:58 localhost nova_compute[279673]: 2025-11-28 10:14:58.256 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 28 05:14:58 localhost nova_compute[279673]: 2025-11-28 10:14:58.257 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 05:14:58 localhost nova_compute[279673]: 2025-11-28 10:14:58.676 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": 
"9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 05:14:58 localhost nova_compute[279673]: 2025-11-28 10:14:58.691 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 05:14:58 localhost nova_compute[279673]: 2025-11-28 10:14:58.691 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 28 05:15:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:15:00.754 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:15:00 localhost 
nova_compute[279673]: 2025-11-28 10:15:00.754 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:15:00 localhost ovn_metadata_agent[158125]: 2025-11-28 10:15:00.755 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 28 05:15:01 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:15:02 localhost nova_compute[279673]: 2025-11-28 10:15:02.402 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:15:02 localhost nova_compute[279673]: 2025-11-28 10:15:02.406 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:15:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 05:15:04 localhost systemd[1]: tmp-crun.PUi2x2.mount: Deactivated successfully. 
Nov 28 05:15:04 localhost podman[331972]: 2025-11-28 10:15:04.849601801 +0000 UTC m=+0.084394793 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, config_id=edpm, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers) Nov 28 05:15:04 localhost podman[331972]: 2025-11-28 10:15:04.865517704 +0000 UTC m=+0.100310696 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, config_id=edpm, distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, release=1755695350, managed_by=edpm_ansible, io.buildah.version=1.33.7) Nov 28 05:15:04 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 05:15:05 localhost ovn_metadata_agent[158125]: 2025-11-28 10:15:05.758 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:15:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 05:15:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 05:15:05 localhost systemd[1]: tmp-crun.olZtNb.mount: Deactivated successfully. Nov 28 05:15:05 localhost podman[331992]: 2025-11-28 10:15:05.839983307 +0000 UTC m=+0.068316306 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Nov 28 05:15:05 localhost podman[331991]: 2025-11-28 10:15:05.900415157 +0000 UTC m=+0.128611822 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Nov 28 05:15:05 localhost podman[331991]: 2025-11-28 10:15:05.911552782 +0000 UTC m=+0.139749487 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 28 05:15:05 localhost podman[331992]: 2025-11-28 10:15:05.926381182 +0000 UTC m=+0.154714231 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:15:05 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 05:15:05 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 05:15:06 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:15:07 localhost nova_compute[279673]: 2025-11-28 10:15:07.404 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:15:07 localhost nova_compute[279673]: 2025-11-28 10:15:07.408 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:15:10 localhost podman[238687]: time="2025-11-28T10:15:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:15:10 localhost podman[238687]: @ - - [28/Nov/2025:10:15:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1" Nov 28 05:15:10 localhost podman[238687]: @ - - [28/Nov/2025:10:15:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19285 "" "Go-http-client/1.1" Nov 28 05:15:11 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:15:12 localhost nova_compute[279673]: 2025-11-28 10:15:12.407 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:15:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 05:15:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. 
Nov 28 05:15:13 localhost podman[332031]: 2025-11-28 10:15:13.852791003 +0000 UTC m=+0.087901892 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) Nov 28 05:15:13 localhost podman[332032]: 2025-11-28 10:15:13.900942663 +0000 UTC m=+0.131226013 container health_status 
9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Nov 28 05:15:13 localhost podman[332031]: 2025-11-28 10:15:13.919391814 +0000 UTC m=+0.154502703 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.vendor=CentOS, 
tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2) Nov 28 05:15:13 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 05:15:13 localhost podman[332032]: 2025-11-28 10:15:13.994811959 +0000 UTC m=+0.225095249 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller) Nov 28 05:15:14 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 05:15:16 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:15:17 localhost nova_compute[279673]: 2025-11-28 10:15:17.410 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:15:18 localhost openstack_network_exporter[240658]: ERROR 10:15:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 05:15:18 localhost openstack_network_exporter[240658]: ERROR 10:15:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:15:18 localhost openstack_network_exporter[240658]: ERROR 10:15:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:15:18 localhost openstack_network_exporter[240658]: ERROR 10:15:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 05:15:18 localhost openstack_network_exporter[240658]: Nov 28 05:15:18 localhost openstack_network_exporter[240658]: ERROR 10:15:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 05:15:18 localhost openstack_network_exporter[240658]: Nov 28 05:15:21 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:15:22 localhost nova_compute[279673]: 2025-11-28 10:15:22.412 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:15:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. 
Nov 28 05:15:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 05:15:22 localhost podman[332075]: 2025-11-28 10:15:22.854661807 +0000 UTC m=+0.082394830 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125) Nov 28 
05:15:22 localhost podman[332075]: 2025-11-28 10:15:22.86829704 +0000 UTC m=+0.096030063 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:15:22 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 05:15:22 localhost systemd[1]: tmp-crun.MNzzRV.mount: Deactivated successfully. Nov 28 05:15:22 localhost podman[332074]: 2025-11-28 10:15:22.969213743 +0000 UTC m=+0.199908258 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 05:15:22 localhost podman[332074]: 2025-11-28 10:15:22.979580114 +0000 UTC m=+0.210274609 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 05:15:22 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 05:15:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:15:27 localhost nova_compute[279673]: 2025-11-28 10:15:27.415 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:15:31 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:15:32 localhost nova_compute[279673]: 2025-11-28 10:15:32.418 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:15:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 05:15:35 localhost systemd[1]: tmp-crun.LMuaek.mount: Deactivated successfully. 
Nov 28 05:15:35 localhost podman[332117]: 2025-11-28 10:15:35.530674031 +0000 UTC m=+0.127469246 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Nov 28 05:15:35 localhost podman[332117]: 2025-11-28 10:15:35.542509588 +0000 UTC m=+0.139304783 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, version=9.6, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Nov 28 05:15:35 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 05:15:35 localhost nova_compute[279673]: 2025-11-28 10:15:35.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:15:35 localhost nova_compute[279673]: 2025-11-28 10:15:35.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Nov 28 05:15:35 localhost nova_compute[279673]: 2025-11-28 10:15:35.791 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Nov 28 05:15:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 05:15:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. 
Nov 28 05:15:36 localhost podman[332138]: 2025-11-28 10:15:36.850175734 +0000 UTC m=+0.086642563 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 28 05:15:36 localhost podman[332138]: 2025-11-28 10:15:36.858239234 +0000 UTC m=+0.094706113 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 05:15:36 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 05:15:36 localhost systemd[1]: tmp-crun.xIyuWZ.mount: Deactivated successfully. Nov 28 05:15:36 localhost podman[332139]: 2025-11-28 10:15:36.925740103 +0000 UTC m=+0.157352571 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:15:36 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:15:36 localhost podman[332139]: 2025-11-28 10:15:36.958476767 +0000 UTC m=+0.190089195 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:15:36 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 05:15:37 localhost nova_compute[279673]: 2025-11-28 10:15:37.422 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:15:37 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Nov 28 05:15:37 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:15:38 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 05:15:38 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:15:40 localhost podman[238687]: time="2025-11-28T10:15:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:15:40 localhost podman[238687]: @ - - [28/Nov/2025:10:15:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1" Nov 28 05:15:40 localhost podman[238687]: @ - - [28/Nov/2025:10:15:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19276 "" 
"Go-http-client/1.1" Nov 28 05:15:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Nov 28 05:15:41 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:15:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:15:42 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:15:42 localhost nova_compute[279673]: 2025-11-28 10:15:42.423 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:15:42 localhost nova_compute[279673]: 2025-11-28 10:15:42.427 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:15:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 05:15:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 05:15:44 localhost systemd[1]: tmp-crun.W9VFKS.mount: Deactivated successfully. 
Nov 28 05:15:44 localhost podman[332264]: 2025-11-28 10:15:44.862219304 +0000 UTC m=+0.097571191 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm) Nov 28 05:15:44 localhost podman[332265]: 2025-11-28 10:15:44.906045981 +0000 UTC m=+0.138279021 container health_status 
9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true) Nov 28 05:15:44 localhost podman[332264]: 2025-11-28 10:15:44.929489356 +0000 UTC m=+0.164841233 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 
'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0) Nov 28 05:15:44 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 05:15:45 localhost podman[332265]: 2025-11-28 10:15:45.016595273 +0000 UTC m=+0.248828353 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true) Nov 28 05:15:45 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 05:15:46 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:15:47 localhost nova_compute[279673]: 2025-11-28 10:15:47.426 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:15:48 localhost openstack_network_exporter[240658]: ERROR 10:15:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:15:48 localhost openstack_network_exporter[240658]: ERROR 10:15:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:15:48 localhost openstack_network_exporter[240658]: ERROR 10:15:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 05:15:48 localhost openstack_network_exporter[240658]: ERROR 10:15:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 05:15:48 localhost openstack_network_exporter[240658]: Nov 28 05:15:48 localhost openstack_network_exporter[240658]: ERROR 10:15:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 05:15:48 localhost openstack_network_exporter[240658]: Nov 28 05:15:50 localhost nova_compute[279673]: 2025-11-28 10:15:50.791 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:15:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:15:50.853 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:15:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:15:50.853 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:15:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:15:50.854 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:15:51 localhost nova_compute[279673]: 2025-11-28 10:15:51.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:15:51 localhost nova_compute[279673]: 2025-11-28 10:15:51.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:15:51 localhost nova_compute[279673]: 2025-11-28 10:15:51.772 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:15:51 localhost nova_compute[279673]: 2025-11-28 10:15:51.772 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 28 05:15:51 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:15:52 localhost nova_compute[279673]: 2025-11-28 10:15:52.429 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:15:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 05:15:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. Nov 28 05:15:53 localhost podman[332307]: 2025-11-28 10:15:53.850887727 +0000 UTC m=+0.081272307 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': 
{'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 05:15:53 localhost podman[332307]: 2025-11-28 10:15:53.858601736 +0000 UTC m=+0.088986346 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter) Nov 28 05:15:53 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 05:15:53 localhost podman[332308]: 2025-11-28 10:15:53.904595759 +0000 UTC m=+0.132915344 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS) Nov 28 05:15:53 localhost podman[332308]: 2025-11-28 10:15:53.914657961 +0000 UTC m=+0.142977556 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Nov 28 05:15:53 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 05:15:55 localhost nova_compute[279673]: 2025-11-28 10:15:55.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:15:55 localhost nova_compute[279673]: 2025-11-28 10:15:55.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:15:55 localhost nova_compute[279673]: 2025-11-28 10:15:55.793 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:15:55 localhost nova_compute[279673]: 2025-11-28 10:15:55.794 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:15:55 localhost nova_compute[279673]: 2025-11-28 10:15:55.794 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:15:55 localhost nova_compute[279673]: 2025-11-28 10:15:55.795 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources 
for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 05:15:55 localhost nova_compute[279673]: 2025-11-28 10:15:55.795 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:15:56 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:15:56 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2526196820' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:15:56 localhost nova_compute[279673]: 2025-11-28 10:15:56.254 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:15:56 localhost nova_compute[279673]: 2025-11-28 10:15:56.314 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 05:15:56 localhost nova_compute[279673]: 2025-11-28 10:15:56.315 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 05:15:56 localhost nova_compute[279673]: 2025-11-28 10:15:56.521 279685 WARNING 
nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 05:15:56 localhost nova_compute[279673]: 2025-11-28 10:15:56.522 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=10992MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", 
"numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 05:15:56 localhost nova_compute[279673]: 2025-11-28 10:15:56.523 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:15:56 localhost nova_compute[279673]: 2025-11-28 10:15:56.523 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:15:56 localhost nova_compute[279673]: 2025-11-28 10:15:56.797 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 05:15:56 localhost nova_compute[279673]: 2025-11-28 10:15:56.798 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 05:15:56 localhost nova_compute[279673]: 2025-11-28 10:15:56.798 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 05:15:56 localhost nova_compute[279673]: 2025-11-28 10:15:56.892 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing inventories for resource provider 35fead26-0bad-4950-b646-987079d58a17 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Nov 28 05:15:56 localhost nova_compute[279673]: 2025-11-28 10:15:56.953 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Updating ProviderTree inventory for provider 35fead26-0bad-4950-b646-987079d58a17 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Nov 28 
05:15:56 localhost nova_compute[279673]: 2025-11-28 10:15:56.954 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Updating inventory in ProviderTree for provider 35fead26-0bad-4950-b646-987079d58a17 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Nov 28 05:15:56 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:15:56 localhost nova_compute[279673]: 2025-11-28 10:15:56.971 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing aggregate associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Nov 28 05:15:56 localhost nova_compute[279673]: 2025-11-28 10:15:56.992 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Refreshing trait associations for resource provider 35fead26-0bad-4950-b646-987079d58a17, traits: 
COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,HW_CPU_X86_AESNI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_FMA3,HW_CPU_X86_BMI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_ABM,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AMD_SVM,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_CLMUL,HW_CPU_X86_BMI2,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Nov 28 05:15:57 localhost nova_compute[279673]: 2025-11-28 10:15:57.026 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:15:57 localhost nova_compute[279673]: 2025-11-28 10:15:57.432 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] 
on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 Nov 28 05:15:57 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:15:57 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1329289336' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:15:57 localhost nova_compute[279673]: 2025-11-28 10:15:57.485 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 Nov 28 05:15:57 localhost nova_compute[279673]: 2025-11-28 10:15:57.492 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180 Nov 28 05:15:57 localhost nova_compute[279673]: 2025-11-28 10:15:57.510 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940 Nov 28 05:15:57 localhost nova_compute[279673]: 2025-11-28 10:15:57.512 279685 DEBUG nova.compute.resource_tracker [None
req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995 Nov 28 05:15:57 localhost nova_compute[279673]: 2025-11-28 10:15:57.513 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.989s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423 Nov 28 05:15:58 localhost nova_compute[279673]: 2025-11-28 10:15:58.510 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 Nov 28 05:15:58 localhost nova_compute[279673]: 2025-11-28 10:15:58.511 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 Nov 28 05:15:58 localhost nova_compute[279673]: 2025-11-28 10:15:58.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 Nov 28 05:15:58 localhost nova_compute[279673]: 2025-11-28 10:15:58.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858 Nov 28 05:15:58 localhost nova_compute[279673]: 2025-11-28 10:15:58.771 279685 DEBUG
nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862 Nov 28 05:15:59 localhost nova_compute[279673]: 2025-11-28 10:15:59.301 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312 Nov 28 05:15:59 localhost nova_compute[279673]: 2025-11-28 10:15:59.302 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315 Nov 28 05:15:59 localhost nova_compute[279673]: 2025-11-28 10:15:59.302 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004 Nov 28 05:15:59 localhost nova_compute[279673]: 2025-11-28 10:15:59.302 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105 Nov 28 05:15:59 localhost nova_compute[279673]: 2025-11-28 10:15:59.685 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": "09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private",
"subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116 Nov 28 05:15:59 localhost nova_compute[279673]: 2025-11-28 10:15:59.698 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333 Nov 28 05:15:59 localhost nova_compute[279673]: 2025-11-28 10:15:59.698 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929 Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.677 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'name': 'test', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512,
'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005538513.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9dda653c53224db086060962b0702694', 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'hostId': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.678 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.708 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 1439344424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.708 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.latency volume: 63315481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'cc7d0c19-01b9-4524-8b87-517a2a802529', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1439344424, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:16:00.678858', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3d04cab0-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.850750995, 'message_signature': 'eca5a7a586c4edd19f76eb9c8913fabf4522b7b3edc775edfb6b87309eae0824'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 63315481, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:16:00.678858', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3d04dd52-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.850750995, 'message_signature': '331a25e551b887fdb90f2147290296d02eb3d7bfecbbc3efd1132329218da339'}]}, 'timestamp': '2025-11-28 10:16:00.709196', '_unique_id': 'd339e3c9c9e543b8ae6fb9c564c3d260'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 
05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:16:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:16:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.710 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.711 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.712 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.712 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:16:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fded22d5-182f-4dc6-b1d9-1f7b988d2828', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:16:00.711996', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3d055e3a-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.850750995, 'message_signature': '637546bc4910fd054ea3c18438bd2dca00f9cfaef776125a7d767658d371d652'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': 
'2025-11-28T10:16:00.711996', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3d056e98-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.850750995, 'message_signature': 'b558d7ce00df5bc49d73538847cb0bfd20bbfedc5a55d1454209a07355314691'}]}, 'timestamp': '2025-11-28 10:16:00.712869', '_unique_id': 'c4a7a21e2a7a43188dd024dd2460c450'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.713 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.714 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.725 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.725 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.usage volume: 1073741824 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd81ef56a-01c8-4d7c-875c-10d2a32b95ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:16:00.715006', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3d075f28-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.886933985, 'message_signature': '0d029be738a6979fa441af14a373feddb4b74a524008c7606a3105f5b8a17e39'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 
'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:16:00.715006', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3d076f7c-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.886933985, 'message_signature': '5cb98b17d841b7686c1cf4b977cd5926734db8309ee05103010245fdd79bbfcb'}]}, 'timestamp': '2025-11-28 10:16:00.725998', '_unique_id': 'b86cb7c01ad6412cbbce2c41775c6585'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:16:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:16:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:16:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.726 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.728 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.728 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.728 12 DEBUG 
ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab2d6e60-f0db-4017-9571-5689127754d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:16:00.728199', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3d07d5c0-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.886933985, 'message_signature': 'a5b76cb247d66104c0d5f8d68219792f2ae8aaeb34e3ba72f11dab3d994db871'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 
1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:16:00.728199', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3d07e556-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.886933985, 'message_signature': '3fe3126f4ceda88b6a7072abe378c88131512799ddb00e3cf53e815c98491336'}]}, 'timestamp': '2025-11-28 10:16:00.729014', '_unique_id': '322dfced9e63495e816977d19dd3b1f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:16:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging 
self.transport._send_notification(target, ctxt, message, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:16:00.729 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR 
oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.729 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.730 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.731 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Nov 28 05:16:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.731 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.731 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '28631402-744a-4228-be43-4f5f9daa1534', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:16:00.731285', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 
'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3d084e56-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.886933985, 'message_signature': 'b381f6b939e23b24a826a4806a1b456316535c4783de18b39891531801043373'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:16:00.731285', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3d085db0-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.886933985, 'message_signature': '4118cab07d29c2a4dd063cb6113d115122c89208379f5e8734ee578747a37b10'}]}, 'timestamp': '2025-11-28 10:16:00.732126', '_unique_id': 'e55258cff5f549529ba9c790638c62f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:16:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.732 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.734 12 INFO ceilometer.polling.manager [-] Polling pollster 
network.incoming.packets in the context of pollsters Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.737 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '098746a9-6c96-4743-b70f-50f62d64f6b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:16:00.734183', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': 
'3d09463a-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.906077868, 'message_signature': 'c0f89fa1cac461ac72652cde0bd7d4476feca91ef6d41dcb0e4c0f0c7be64379'}]}, 'timestamp': '2025-11-28 10:16:00.738160', '_unique_id': '82c0a3cfa7ed425a9561e3edd1235f83'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:16:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] 
Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in 
_reraise_as_library_errors Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.739 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.740 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.740 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'faeeb054-4a48-469d-920b-86b6ec48f3f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:16:00.740418', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '3d09b688-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.906077868, 'message_signature': '0ab9be3ee90259c65525c095e22f315d8803adaf0fd0274ca7e9adad883a4f74'}]}, 'timestamp': '2025-11-28 10:16:00.741051', '_unique_id': '640ce0e520b64d1c98008f3222b8ac05'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:16:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.741 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.743 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.743 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.743 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1cc957d3-394f-4cb3-b47e-c5e71fea0563', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:16:00.743467', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '3d0a2da2-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.906077868, 'message_signature': '76e6eb89f430c0081a884d814bde2d15b0460dc8b3fdb4af362afa3eeb05bb0c'}]}, 'timestamp': '2025-11-28 10:16:00.744011', '_unique_id': 'e3fdccba48514d1e81e9e39e1cab6721'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.744 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.746 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.764 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/memory.usage volume: 51.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1df59347-ed21-4bbf-93fa-707c89fdc4aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.671875, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:16:00.746261', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '3d0d5504-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.935910592, 'message_signature': '16a8c63f13aab10aa6a373bbcdc2990c3c6ed8c61b002385cf11f12055d83cb0'}]}, 'timestamp': '2025-11-28 10:16:00.764698', '_unique_id': '3c6d426a501149a09ade2dd46355bd7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.765 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.766 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.767 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.767 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '92d48e41-1f81-4084-9a72-f0a0a64c2430', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:16:00.767143', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3d0dcb24-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.850750995, 'message_signature': '6132aa63d6cf8921f8b342e45a7236dc5d7d71ca52e957eac8ddc7277e492f4f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:16:00.767143', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3d0ddea2-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.850750995, 'message_signature': '7f32c0ffe4930dd91c9c44d65ea275dadcefbc4e6d077258adb463a8b76b2a10'}]}, 'timestamp': '2025-11-28 10:16:00.768238', '_unique_id': 'ad5b4f07ac90417fae8078a8f08f8565'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28
10:16:00.769 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.769 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.770 12 INFO 
ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.770 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a8dad1c-be27-4d6e-838d-e2f07b5f5ba1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:16:00.770595', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 
'tap09612b07-51'}, 'message_id': '3d0e51a2-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.906077868, 'message_signature': '8bf6f78d1e7490a2ffc622f813b8ed9248beb28b989ed80d7657d2b01273e659'}]}, 'timestamp': '2025-11-28 10:16:00.771232', '_unique_id': '7454c1b1ed9743b198e0bd666fe02579'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 
10:16:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.772 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.773 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.773 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.773 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '618379f5-2955-4139-9f6f-82e310344c48', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:16:00.773625', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '3d0ec7ae-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.906077868, 'message_signature': 'ae30a0b2e1df20793d913045a24944be2375f0f649d88cd912f50c1b87c732a5'}]}, 'timestamp': '2025-11-28 10:16:00.774242', '_unique_id': '288d44acce4142c5bbf64b936d86dd22'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:16:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:16:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.775 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.776 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.776 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.777 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a52fc2ef-6b96-43fe-bf79-f96457505af7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:16:00.776625', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3d0f3c8e-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.850750995, 'message_signature': 'de63d32005b88ccc38c10f8a8b10387590f135aafd29e40bc8c0e2c8215812b4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:16:00.776625', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3d0f519c-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.850750995, 'message_signature': '9020f88c14f12332ce58186904a2b66dfed85c7c909573863eee0bff38394b81'}]}, 'timestamp': '2025-11-28 10:16:00.777681', '_unique_id': 'a16a7b29940d4f699a6312cc2bd250ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:16:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:16:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.778 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.779 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.780 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2fbe4e89-56a2-458d-9d75-6443ca9350f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:16:00.779959', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '3d0fc0dc-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.906077868, 'message_signature': '7ab5108a01831e368abfc4a97f71ce0dbb902538550cbf87813e219c84a122d4'}]}, 'timestamp': '2025-11-28 10:16:00.780604', '_unique_id': 'd2e65598e93d488bab9870c5a5277478'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 877, in _connection_factory Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 
ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:16:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.781 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.782 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.782 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 1124743927 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.783 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.latency volume: 22218411 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c68a2e24-d218-47a4-a79c-0e07345ec148', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1124743927, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:16:00.782836', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3d10309e-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.850750995, 'message_signature': '55d2e584586ef470696742971139677f66f42d3748501f052e5296cde288bea3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22218411, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:16:00.782836', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3d10444e-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.850750995, 'message_signature': 'c475628eadabe3990fb634701ce2493be77a231c8ede16fd56c5806cb8762e7e'}]}, 'timestamp': '2025-11-28 10:16:00.783890', '_unique_id': '8bbed43fb368475e8f217b865f8ff7c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.784 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.785 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.786 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'f45d0660-7364-4690-ba49-72a141289b7a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:16:00.786145', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '3d10b1ea-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.906077868, 'message_signature': '447719a389abf3a3e3d3559343c61cad8b3bfe868138c6ec1ba9ca9010ed85b4'}]}, 'timestamp': '2025-11-28 10:16:00.786767', '_unique_id': '1ff1e02d31e346f49f9aa6ad49a7be37'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.787 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:16:00 localhost
ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.788 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.789 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/cpu volume: 20070000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f622abe7-fc12-4452-8ad4-da81f3273ecc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20070000000, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'timestamp': '2025-11-28T10:16:00.789083', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '3d1123be-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.935910592, 'message_signature': 'd06a1e8d8c364151a0a69f6d08e5a300cd758a7f2a6ffe842425da42f66e5825'}]}, 'timestamp': '2025-11-28 10:16:00.789675', '_unique_id': 'b262a6bfdee84cb591bb279124909473'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.790 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.791 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.791 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.792 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'f38c37ae-2bd1-411c-b1e7-8fd272225645', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:16:00.792154', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '3d119b8c-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.906077868, 'message_signature': '97f60b366e2d18daf121861101081c577c57e88447faa146c34f732a5647e6f3'}]}, 'timestamp': '2025-11-28 10:16:00.792699', '_unique_id': '7e0f4c28ef02420f8888f43cc008cda4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging yield
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging conn.connect()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging
Nov 28 05:16:00 localhost
ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:16:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.793 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.794 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.795 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.795 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8d469f2-72a4-4930-87ef-5bd382c0da55', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vda', 'timestamp': '2025-11-28T10:16:00.795153', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3d120bbc-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.850750995, 'message_signature': '0d4db8536788fdeebdcd6b01972b6e3d31e7405dd71913cdda70fdf9b3afaa3e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-vdb', 'timestamp': '2025-11-28T10:16:00.795153', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3d12188c-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.850750995, 'message_signature': '7bc8f61052e82b30fe86cdd2fdc4c3352eede3b45b69660e38549f70a22d6f9c'}]}, 'timestamp': '2025-11-28 10:16:00.795799', '_unique_id': '9b449fdc93744848b05080b0c76ffd6e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:16:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 
2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:16:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.796 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.797 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.797 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '818d6f39-f1a7-46e7-b377-c6a5f5da23fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:16:00.797176', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '3d125afe-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.906077868, 'message_signature': 'b1611f46a964182e20c01f18bcb3fad57630f3ecbdcdcda479671c37f04e211e'}]}, 'timestamp': '2025-11-28 10:16:00.797561', '_unique_id': '7cd36745769e4aa2bbf337f3558ae700'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:16:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:16:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:16:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.798 12 DEBUG ceilometer.compute.pollsters [-] c2f0c7d6-df5f-4541-8b2c-bc1eaf805812/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '621c8aca-6187-490d-b9d4-5936489d8a7d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '4d9169247d4447d0a8dd4c33f8b23dee', 'user_name': None, 'project_id': '9dda653c53224db086060962b0702694', 'project_name': None, 'resource_id': 'instance-00000002-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812-tap09612b07-51', 'timestamp': '2025-11-28T10:16:00.798917', 'resource_metadata': {'display_name': 'test', 'name': 'tap09612b07-51', 'instance_id': 'c2f0c7d6-df5f-4541-8b2c-bc1eaf805812', 'instance_type': 'm1.small', 'host': 'd3407dc0689483f363041b376d736f0f2680cfd0174e4796af277155', 'instance_host': 'np0005538513.localdomain', 'flavor': {'id': 'f3c44237-060e-4213-a926-aa7fdb4bf902', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '391767f1-35f2-4b68-ae15-e0b29db66dcb'}, 'image_ref': '391767f1-35f2-4b68-ae15-e0b29db66dcb', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:f4:fc:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09612b07-51'}, 'message_id': '3d12a04a-cc43-11f0-a370-fa163eb02593', 'monotonic_time': 12794.906077868, 'message_signature': 'b08e26e254e0846c8d59e5d19cb585a249a1eb3a30b4124e4bd72c773befdaaa'}]}, 'timestamp': '2025-11-28 10:16:00.799320', '_unique_id': '71121c48d8c647f3b51e43b9dedb2cc7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:16:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging yield Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging conn.connect() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:00 localhost 
ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Nov 28 05:16:00 
localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Nov 28 05:16:00 localhost ceilometer_agent_compute[236072]: 2025-11-28 10:16:00.799 12 ERROR oslo_messaging.notify.messaging Nov 28 05:16:01 localhost ceph-mon[292954]: 
mon.np0005538513@0(leader).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:16:02 localhost nova_compute[279673]: 2025-11-28 10:16:02.435 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:16:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 05:16:05 localhost podman[332393]: 2025-11-28 10:16:05.854376625 +0000 UTC m=+0.092994309 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package 
manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Nov 28 05:16:05 localhost podman[332393]: 2025-11-28 10:16:05.866544731 +0000 UTC m=+0.105162375 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=edpm, release=1755695350, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, config_data={'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc.) Nov 28 05:16:05 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. 
Nov 28 05:16:06 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e290 do_prune osdmap full prune enabled Nov 28 05:16:06 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e291 e291: 6 total, 6 up, 6 in Nov 28 05:16:06 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e291: 6 total, 6 up, 6 in Nov 28 05:16:06 localhost nova_compute[279673]: 2025-11-28 10:16:06.694 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:16:06 localhost nova_compute[279673]: 2025-11-28 10:16:06.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:16:06 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e291 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:16:07 localhost nova_compute[279673]: 2025-11-28 10:16:07.440 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:16:07 localhost nova_compute[279673]: 2025-11-28 10:16:07.442 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:16:07 localhost nova_compute[279673]: 2025-11-28 10:16:07.442 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 28 05:16:07 localhost nova_compute[279673]: 2025-11-28 10:16:07.442 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering 
IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:16:07 localhost ovn_metadata_agent[158125]: 2025-11-28 10:16:07.445 158130 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '92:49:97', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ca:ab:0a:de:51:20'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Nov 28 05:16:07 localhost ovn_metadata_agent[158125]: 2025-11-28 10:16:07.446 158130 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Nov 28 05:16:07 localhost nova_compute[279673]: 2025-11-28 10:16:07.467 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:16:07 localhost nova_compute[279673]: 2025-11-28 10:16:07.467 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:16:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 05:16:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 05:16:07 localhost systemd[1]: tmp-crun.98xNFp.mount: Deactivated successfully. 
Nov 28 05:16:07 localhost podman[332413]: 2025-11-28 10:16:07.866428895 +0000 UTC m=+0.100917654 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 28 05:16:07 localhost podman[332413]: 2025-11-28 10:16:07.879647555 +0000 UTC m=+0.114136304 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter) Nov 28 05:16:07 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 05:16:07 localhost podman[332414]: 2025-11-28 10:16:07.95738578 +0000 UTC m=+0.186470632 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image) Nov 28 05:16:07 localhost podman[332414]: 2025-11-28 10:16:07.99548463 +0000 UTC m=+0.224569462 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent) Nov 28 05:16:08 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 05:16:10 localhost podman[238687]: time="2025-11-28T10:16:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:16:10 localhost podman[238687]: @ - - [28/Nov/2025:10:16:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1" Nov 28 05:16:10 localhost podman[238687]: @ - - [28/Nov/2025:10:16:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19282 "" "Go-http-client/1.1" Nov 28 05:16:11 localhost ovn_metadata_agent[158125]: 2025-11-28 10:16:11.448 158130 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=c85299c6-8e38-42c8-8509-2eaaf15c050c, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Nov 28 05:16:11 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e291 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:16:12 localhost nova_compute[279673]: 2025-11-28 10:16:12.467 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:16:12 localhost nova_compute[279673]: 2025-11-28 10:16:12.469 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:16:14 localhost nova_compute[279673]: 2025-11-28 10:16:14.633 279685 DEBUG 
oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:16:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 05:16:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 05:16:15 localhost systemd[1]: tmp-crun.KOH8q6.mount: Deactivated successfully. Nov 28 05:16:15 localhost podman[332455]: 2025-11-28 10:16:15.869499778 +0000 UTC m=+0.099159361 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller) Nov 28 05:16:15 localhost podman[332454]: 2025-11-28 10:16:15.962504077 +0000 UTC m=+0.194393158 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, 
org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute) Nov 28 05:16:15 localhost podman[332455]: 2025-11-28 10:16:15.98814974 +0000 UTC m=+0.217809323 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, managed_by=edpm_ansible) Nov 28 05:16:16 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 05:16:16 localhost podman[332454]: 2025-11-28 10:16:16.005089574 +0000 UTC m=+0.236978655 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2) Nov 28 05:16:16 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0. Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.156162) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70 Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324976156204, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 2245, "num_deletes": 251, "total_data_size": 2059071, "memory_usage": 2102240, "flush_reason": "Manual Compaction"} Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324976169322, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 1989878, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 39024, "largest_seqno": 41268, "table_properties": {"data_size": 1980900, "index_size": 5423, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 21281, "raw_average_key_size": 21, "raw_value_size": 1961837, "raw_average_value_size": 1971, "num_data_blocks": 233, "num_entries": 995, "num_filter_entries": 995, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", 
"prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764324812, "oldest_key_time": 1764324812, "file_creation_time": 1764324976, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}} Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 13216 microseconds, and 5762 cpu microseconds. Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.169374) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 1989878 bytes OK Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.169397) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.171688) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.171709) EVENT_LOG_v1 {"time_micros": 1764324976171703, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.171730) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max 
score 0.25 Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 2049537, prev total WAL file size 2049537, number of live WAL files 2. Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.172453) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003133303532' seq:72057594037927935, type:22 .. '7061786F73003133333034' seq:0, type:0; will stop at (end) Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00 Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(1943KB)], [69(18MB)] Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324976172524, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 21596800, "oldest_snapshot_seqno": -1} Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 14828 keys, 20052075 bytes, temperature: kUnknown Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324976277069, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 20052075, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19964297, "index_size": 49597, "index_partitions": 0, 
"top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 37125, "raw_key_size": 394575, "raw_average_key_size": 26, "raw_value_size": 19710029, "raw_average_value_size": 1329, "num_data_blocks": 1867, "num_entries": 14828, "num_filter_entries": 14828, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764323559, "oldest_key_time": 0, "file_creation_time": 1764324976, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "49d3ae8b-2ff6-4713-88ed-5986b1f8221e", "db_session_id": "MM4LCQC4OTZXQR5A0TS6", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}} Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.277389) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 20052075 bytes Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.279129) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 206.4 rd, 191.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 18.7 +0.0 blob) out(19.1 +0.0 blob), read-write-amplify(20.9) write-amplify(10.1) OK, records in: 15361, records dropped: 533 output_compression: NoCompression Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.279160) EVENT_LOG_v1 {"time_micros": 1764324976279146, "job": 42, "event": "compaction_finished", "compaction_time_micros": 104636, "compaction_time_cpu_micros": 55212, "output_level": 6, "num_output_files": 1, "total_output_size": 20052075, "num_input_records": 15361, "num_output_records": 14828, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324976279572, "job": 42, "event": "table_file_deletion", "file_number": 71} Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005538513/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764324976282571, 
"job": 42, "event": "table_file_deletion", "file_number": 69} Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.172349) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.282660) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.282667) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.282670) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.282673) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:16:16 localhost ceph-mon[292954]: rocksdb: (Original Log Time 2025/11/28-10:16:16.282675) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Nov 28 05:16:16 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e291 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:16:16 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e291 do_prune osdmap full prune enabled Nov 28 05:16:16 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e292 e292: 6 total, 6 up, 6 in Nov 28 05:16:17 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e292: 6 total, 6 up, 6 in Nov 28 05:16:17 localhost nova_compute[279673]: 2025-11-28 10:16:17.470 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:16:17 localhost nova_compute[279673]: 2025-11-28 10:16:17.473 279685 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:16:18 localhost openstack_network_exporter[240658]: ERROR 10:16:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:16:18 localhost openstack_network_exporter[240658]: ERROR 10:16:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:16:18 localhost openstack_network_exporter[240658]: ERROR 10:16:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 05:16:18 localhost openstack_network_exporter[240658]: ERROR 10:16:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 05:16:18 localhost openstack_network_exporter[240658]: Nov 28 05:16:18 localhost openstack_network_exporter[240658]: ERROR 10:16:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 05:16:18 localhost openstack_network_exporter[240658]: Nov 28 05:16:18 localhost nova_compute[279673]: 2025-11-28 10:16:18.772 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:16:18 localhost nova_compute[279673]: 2025-11-28 10:16:18.773 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Nov 28 05:16:21 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e292 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:16:22 localhost nova_compute[279673]: 2025-11-28 
10:16:22.475 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:16:22 localhost nova_compute[279673]: 2025-11-28 10:16:22.477 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:16:22 localhost nova_compute[279673]: 2025-11-28 10:16:22.477 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 28 05:16:22 localhost nova_compute[279673]: 2025-11-28 10:16:22.477 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:16:22 localhost nova_compute[279673]: 2025-11-28 10:16:22.488 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:16:22 localhost nova_compute[279673]: 2025-11-28 10:16:22.489 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:16:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 05:16:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. 
Nov 28 05:16:24 localhost podman[332499]: 2025-11-28 10:16:24.853270588 +0000 UTC m=+0.084453216 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 05:16:24 localhost podman[332499]: 2025-11-28 10:16:24.864442674 +0000 UTC m=+0.095625262 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 
'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Nov 28 05:16:24 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 05:16:24 localhost podman[332500]: 2025-11-28 10:16:24.920949532 +0000 UTC m=+0.146882578 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Nov 28 05:16:24 localhost podman[332500]: 2025-11-28 10:16:24.933514211 +0000 UTC m=+0.159447227 container exec_died 
7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=multipathd) Nov 28 05:16:24 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. 
Nov 28 05:16:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e292 do_prune osdmap full prune enabled Nov 28 05:16:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e293 e293: 6 total, 6 up, 6 in Nov 28 05:16:26 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e293: 6 total, 6 up, 6 in Nov 28 05:16:26 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:16:27 localhost nova_compute[279673]: 2025-11-28 10:16:27.489 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:16:27 localhost nova_compute[279673]: 2025-11-28 10:16:27.491 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:16:27 localhost nova_compute[279673]: 2025-11-28 10:16:27.491 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 28 05:16:27 localhost nova_compute[279673]: 2025-11-28 10:16:27.492 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:16:27 localhost nova_compute[279673]: 2025-11-28 10:16:27.513 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:16:27 localhost nova_compute[279673]: 2025-11-28 10:16:27.513 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:16:31 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 
full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:16:32 localhost nova_compute[279673]: 2025-11-28 10:16:32.514 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:16:32 localhost nova_compute[279673]: 2025-11-28 10:16:32.515 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:16:32 localhost nova_compute[279673]: 2025-11-28 10:16:32.515 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 28 05:16:32 localhost nova_compute[279673]: 2025-11-28 10:16:32.515 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:16:32 localhost nova_compute[279673]: 2025-11-28 10:16:32.516 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:16:32 localhost nova_compute[279673]: 2025-11-28 10:16:32.517 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:16:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. Nov 28 05:16:36 localhost systemd[1]: tmp-crun.DVjIWv.mount: Deactivated successfully. 
Nov 28 05:16:36 localhost podman[332541]: 2025-11-28 10:16:36.861763329 +0000 UTC m=+0.098591662 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm) Nov 28 05:16:36 localhost podman[332541]: 2025-11-28 10:16:36.904699849 +0000 UTC m=+0.141528152 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, maintainer=Red Hat, Inc., vcs-type=git, name=ubi9-minimal, version=9.6, distribution-scope=public, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, release=1755695350) Nov 28 05:16:36 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. Nov 28 05:16:37 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:16:37 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e293 do_prune osdmap full prune enabled Nov 28 05:16:37 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 e294: 6 total, 6 up, 6 in Nov 28 05:16:37 localhost ceph-mon[292954]: log_channel(cluster) log [DBG] : osdmap e294: 6 total, 6 up, 6 in Nov 28 05:16:37 localhost nova_compute[279673]: 2025-11-28 10:16:37.518 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:16:37 localhost nova_compute[279673]: 2025-11-28 10:16:37.520 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:16:37 localhost nova_compute[279673]: 2025-11-28 10:16:37.521 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 28 05:16:37 localhost nova_compute[279673]: 2025-11-28 10:16:37.521 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:16:37 localhost nova_compute[279673]: 2025-11-28 10:16:37.544 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:16:37 localhost nova_compute[279673]: 
2025-11-28 10:16:37.544 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:16:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 05:16:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 05:16:38 localhost systemd[1]: tmp-crun.ux6kgW.mount: Deactivated successfully. Nov 28 05:16:38 localhost podman[332580]: 2025-11-28 10:16:38.191156229 +0000 UTC m=+0.077464749 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', 
'/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Nov 28 05:16:38 localhost systemd[1]: tmp-crun.LEI9gn.mount: Deactivated successfully. Nov 28 05:16:38 localhost podman[332578]: 2025-11-28 10:16:38.257825103 +0000 UTC m=+0.141290735 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Nov 28 05:16:38 localhost podman[332580]: 2025-11-28 10:16:38.276002645 +0000 UTC m=+0.162311135 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, 
org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible) Nov 28 05:16:38 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. 
Nov 28 05:16:38 localhost podman[332578]: 2025-11-28 10:16:38.296399866 +0000 UTC m=+0.179865518 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 28 05:16:38 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. 
Nov 28 05:16:38 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Nov 28 05:16:38 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:16:39 localhost ceph-mon[292954]: from='mgr.34481 172.18.0.108:0/3924249495' entity='mgr.np0005538515.yfkzhl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Nov 28 05:16:39 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:16:40 localhost podman[238687]: time="2025-11-28T10:16:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:16:40 localhost podman[238687]: @ - - [28/Nov/2025:10:16:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1" Nov 28 05:16:40 localhost podman[238687]: @ - - [28/Nov/2025:10:16:40 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19281 "" "Go-http-client/1.1" Nov 28 05:16:41 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Nov 28 05:16:41 localhost ceph-mon[292954]: log_channel(audit) log [INF] : from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:16:41 localhost ceph-mon[292954]: from='mgr.34481 ' entity='mgr.np0005538515.yfkzhl' Nov 28 05:16:42 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:16:42 localhost nova_compute[279673]: 2025-11-28 10:16:42.545 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:16:42 localhost nova_compute[279673]: 2025-11-28 10:16:42.560 279685 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:16:42 localhost nova_compute[279673]: 2025-11-28 10:16:42.560 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5015 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 28 05:16:42 localhost nova_compute[279673]: 2025-11-28 10:16:42.561 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:16:42 localhost nova_compute[279673]: 2025-11-28 10:16:42.561 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:16:42 localhost nova_compute[279673]: 2025-11-28 10:16:42.562 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:16:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 05:16:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. 
Nov 28 05:16:46 localhost podman[332688]: 2025-11-28 10:16:46.866259153 +0000 UTC m=+0.091342429 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Nov 28 05:16:46 localhost podman[332688]: 2025-11-28 10:16:46.94239914 +0000 UTC m=+0.167482426 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true) Nov 28 05:16:46 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. 
Nov 28 05:16:47 localhost podman[332687]: 2025-11-28 10:16:46.949724756 +0000 UTC m=+0.175716040 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629) Nov 28 05:16:47 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 
343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:16:47 localhost podman[332687]: 2025-11-28 10:16:47.030252369 +0000 UTC m=+0.256243613 container exec_died 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Nov 28 05:16:47 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: 
Deactivated successfully. Nov 28 05:16:47 localhost nova_compute[279673]: 2025-11-28 10:16:47.563 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:16:47 localhost nova_compute[279673]: 2025-11-28 10:16:47.564 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:16:48 localhost openstack_network_exporter[240658]: ERROR 10:16:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:16:48 localhost openstack_network_exporter[240658]: ERROR 10:16:48 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 05:16:48 localhost openstack_network_exporter[240658]: ERROR 10:16:48 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:16:48 localhost openstack_network_exporter[240658]: ERROR 10:16:48 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 05:16:48 localhost openstack_network_exporter[240658]: Nov 28 05:16:48 localhost openstack_network_exporter[240658]: ERROR 10:16:48 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 05:16:48 localhost openstack_network_exporter[240658]: Nov 28 05:16:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:16:50.853 158130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:16:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:16:50.854 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s 
inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:16:50 localhost ovn_metadata_agent[158125]: 2025-11-28 10:16:50.855 158130 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:16:51 localhost nova_compute[279673]: 2025-11-28 10:16:51.790 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:16:52 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:16:52 localhost nova_compute[279673]: 2025-11-28 10:16:52.565 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:16:52 localhost nova_compute[279673]: 2025-11-28 10:16:52.595 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:16:52 localhost nova_compute[279673]: 2025-11-28 10:16:52.596 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5030 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 28 05:16:52 localhost nova_compute[279673]: 2025-11-28 10:16:52.596 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:16:52 localhost nova_compute[279673]: 2025-11-28 10:16:52.596 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:16:52 localhost nova_compute[279673]: 2025-11-28 10:16:52.597 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:16:52 localhost nova_compute[279673]: 2025-11-28 10:16:52.598 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:16:52 localhost nova_compute[279673]: 2025-11-28 10:16:52.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:16:52 localhost nova_compute[279673]: 2025-11-28 10:16:52.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:16:53 localhost nova_compute[279673]: 2025-11-28 10:16:53.589 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:16:53 localhost nova_compute[279673]: 2025-11-28 10:16:53.612 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Triggering sync for uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Nov 28 05:16:53 localhost nova_compute[279673]: 2025-11-28 10:16:53.613 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock 
"c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:16:53 localhost nova_compute[279673]: 2025-11-28 10:16:53.613 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:16:53 localhost nova_compute[279673]: 2025-11-28 10:16:53.645 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.032s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:16:53 localhost nova_compute[279673]: 2025-11-28 10:16:53.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:16:53 localhost nova_compute[279673]: 2025-11-28 10:16:53.771 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Nov 28 05:16:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553. Nov 28 05:16:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9. 
Nov 28 05:16:55 localhost nova_compute[279673]: 2025-11-28 10:16:55.771 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:16:55 localhost systemd[1]: tmp-crun.5nCuzO.mount: Deactivated successfully. Nov 28 05:16:55 localhost podman[332730]: 2025-11-28 10:16:55.872819116 +0000 UTC m=+0.101872925 container health_status 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors ) Nov 28 05:16:55 localhost podman[332731]: 2025-11-28 
10:16:55.93499874 +0000 UTC m=+0.163839713 container health_status 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.license=GPLv2) Nov 28 05:16:55 localhost podman[332731]: 2025-11-28 10:16:55.946602029 +0000 UTC m=+0.175443032 container exec_died 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9 
(image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:16:55 localhost podman[332730]: 2025-11-28 10:16:55.960726726 +0000 UTC m=+0.189780555 container exec_died 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The 
Prometheus Authors , managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm) Nov 28 05:16:55 localhost systemd[1]: 7f78f2b03aa38ec728955ec34b5d8db5a5ff0299f603513cb4ca0c676861a5b9.service: Deactivated successfully. Nov 28 05:16:55 localhost systemd[1]: 49f8e752f27c71dd72dcb04d456b84eaf9a7e57327e6820c024314f9cf1a7553.service: Deactivated successfully. 
Nov 28 05:16:57 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:16:57 localhost nova_compute[279673]: 2025-11-28 10:16:57.600 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:16:57 localhost nova_compute[279673]: 2025-11-28 10:16:57.602 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:16:57 localhost nova_compute[279673]: 2025-11-28 10:16:57.602 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 28 05:16:57 localhost nova_compute[279673]: 2025-11-28 10:16:57.603 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:16:57 localhost nova_compute[279673]: 2025-11-28 10:16:57.631 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:16:57 localhost nova_compute[279673]: 2025-11-28 10:16:57.632 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:16:57 localhost nova_compute[279673]: 2025-11-28 10:16:57.770 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:16:57 localhost nova_compute[279673]: 2025-11-28 10:16:57.803 279685 DEBUG oslo_concurrency.lockutils [None 
req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:16:57 localhost nova_compute[279673]: 2025-11-28 10:16:57.804 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:16:57 localhost nova_compute[279673]: 2025-11-28 10:16:57.804 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:16:57 localhost nova_compute[279673]: 2025-11-28 10:16:57.805 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Auditing locally available compute resources for np0005538513.localdomain (node: np0005538513.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Nov 28 05:16:57 localhost nova_compute[279673]: 2025-11-28 10:16:57.806 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:16:58 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:16:58 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/775532288' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:16:58 localhost nova_compute[279673]: 2025-11-28 10:16:58.296 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:16:58 localhost nova_compute[279673]: 2025-11-28 10:16:58.362 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 05:16:58 localhost nova_compute[279673]: 2025-11-28 10:16:58.363 279685 DEBUG nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Nov 28 05:16:58 localhost nova_compute[279673]: 2025-11-28 10:16:58.573 279685 WARNING nova.virt.libvirt.driver [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Nov 28 05:16:58 localhost nova_compute[279673]: 2025-11-28 10:16:58.574 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Hypervisor/Node resource view: name=np0005538513.localdomain free_ram=10991MB free_disk=41.83686447143555GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": 
"7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Nov 28 05:16:58 localhost nova_compute[279673]: 2025-11-28 10:16:58.574 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Nov 28 05:16:58 localhost nova_compute[279673]: 2025-11-28 10:16:58.575 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Nov 28 05:16:58 localhost nova_compute[279673]: 2025-11-28 10:16:58.664 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Instance c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Nov 28 05:16:58 localhost nova_compute[279673]: 2025-11-28 10:16:58.664 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Nov 28 05:16:58 localhost nova_compute[279673]: 2025-11-28 10:16:58.665 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Final resource view: name=np0005538513.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Nov 28 05:16:58 localhost nova_compute[279673]: 2025-11-28 10:16:58.721 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Nov 28 05:16:59 localhost ceph-mon[292954]: mon.np0005538513@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Nov 28 05:16:59 localhost ceph-mon[292954]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/1769094031' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Nov 28 05:16:59 localhost nova_compute[279673]: 2025-11-28 10:16:59.187 279685 DEBUG oslo_concurrency.processutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Nov 28 05:16:59 localhost sshd[332817]: main: sshd: ssh-rsa algorithm is disabled Nov 28 05:16:59 localhost nova_compute[279673]: 2025-11-28 10:16:59.196 279685 DEBUG nova.compute.provider_tree [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed in ProviderTree for provider: 35fead26-0bad-4950-b646-987079d58a17 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Nov 28 05:16:59 localhost nova_compute[279673]: 2025-11-28 10:16:59.226 279685 DEBUG nova.scheduler.client.report [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Inventory has not changed for provider 35fead26-0bad-4950-b646-987079d58a17 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Nov 28 05:16:59 localhost nova_compute[279673]: 2025-11-28 10:16:59.230 279685 DEBUG nova.compute.resource_tracker [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Compute_service record updated for np0005538513.localdomain:np0005538513.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Nov 28 
05:16:59 localhost nova_compute[279673]: 2025-11-28 10:16:59.231 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.657s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Nov 28 05:16:59 localhost systemd-logind[764]: New session 75 of user zuul. Nov 28 05:16:59 localhost systemd[1]: Started Session 75 of User zuul. Nov 28 05:16:59 localhost python3[332839]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-49a1-b30e-00000000000c-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Nov 28 05:17:00 localhost nova_compute[279673]: 2025-11-28 10:17:00.228 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:17:00 localhost nova_compute[279673]: 2025-11-28 10:17:00.229 279685 DEBUG oslo_service.periodic_task [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:17:00 localhost nova_compute[279673]: 2025-11-28 10:17:00.229 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Nov 28 05:17:00 localhost nova_compute[279673]: 2025-11-28 10:17:00.229 279685 DEBUG nova.compute.manager [None 
req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Nov 28 05:17:01 localhost nova_compute[279673]: 2025-11-28 10:17:01.347 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquiring lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Nov 28 05:17:01 localhost nova_compute[279673]: 2025-11-28 10:17:01.348 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Acquired lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Nov 28 05:17:01 localhost nova_compute[279673]: 2025-11-28 10:17:01.348 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Nov 28 05:17:01 localhost nova_compute[279673]: 2025-11-28 10:17:01.348 279685 DEBUG nova.objects.instance [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2f0c7d6-df5f-4541-8b2c-bc1eaf805812 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Nov 28 05:17:02 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:17:02 localhost nova_compute[279673]: 2025-11-28 10:17:02.587 279685 DEBUG nova.network.neutron [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updating instance_info_cache with network_info: [{"id": 
"09612b07-5142-4b0f-9dab-74bf4403f69f", "address": "fa:16:3e:f4:fc:6c", "network": {"id": "40d5da59-6201-424a-8380-80ecc3d67c7e", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.142", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "9dda653c53224db086060962b0702694", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09612b07-51", "ovs_interfaceid": "09612b07-5142-4b0f-9dab-74bf4403f69f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Nov 28 05:17:02 localhost nova_compute[279673]: 2025-11-28 10:17:02.604 279685 DEBUG oslo_concurrency.lockutils [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Releasing lock "refresh_cache-c2f0c7d6-df5f-4541-8b2c-bc1eaf805812" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Nov 28 05:17:02 localhost nova_compute[279673]: 2025-11-28 10:17:02.604 279685 DEBUG nova.compute.manager [None req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] [instance: c2f0c7d6-df5f-4541-8b2c-bc1eaf805812] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Nov 28 05:17:02 localhost nova_compute[279673]: 2025-11-28 10:17:02.605 279685 DEBUG oslo_service.periodic_task [None 
req-3864578a-a749-423f-8609-18203f8af2e4 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Nov 28 05:17:02 localhost nova_compute[279673]: 2025-11-28 10:17:02.633 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:17:02 localhost nova_compute[279673]: 2025-11-28 10:17:02.635 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:17:02 localhost nova_compute[279673]: 2025-11-28 10:17:02.636 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 28 05:17:02 localhost nova_compute[279673]: 2025-11-28 10:17:02.636 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:17:02 localhost nova_compute[279673]: 2025-11-28 10:17:02.637 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:17:02 localhost nova_compute[279673]: 2025-11-28 10:17:02.638 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:17:05 localhost systemd[1]: session-75.scope: Deactivated successfully. Nov 28 05:17:05 localhost systemd-logind[764]: Session 75 logged out. Waiting for processes to exit. Nov 28 05:17:05 localhost systemd-logind[764]: Removed session 75. 
Nov 28 05:17:07 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:17:07 localhost nova_compute[279673]: 2025-11-28 10:17:07.639 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:17:07 localhost nova_compute[279673]: 2025-11-28 10:17:07.733 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:17:07 localhost nova_compute[279673]: 2025-11-28 10:17:07.734 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5095 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 28 05:17:07 localhost nova_compute[279673]: 2025-11-28 10:17:07.734 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:17:07 localhost nova_compute[279673]: 2025-11-28 10:17:07.735 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:17:07 localhost nova_compute[279673]: 2025-11-28 10:17:07.735 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:17:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659. 
Nov 28 05:17:07 localhost podman[332842]: 2025-11-28 10:17:07.844172999 +0000 UTC m=+0.083446303 container health_status a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Nov 28 05:17:07 localhost podman[332842]: 2025-11-28 10:17:07.864117636 +0000 UTC m=+0.103390980 container exec_died a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, version=9.6, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container) Nov 28 05:17:07 localhost systemd[1]: a71d92dceeb681303d6cfaac7b29aabf5010262576690ade3752dc1c7be08659.service: Deactivated successfully. Nov 28 05:17:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2. Nov 28 05:17:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8. Nov 28 05:17:08 localhost podman[332862]: 2025-11-28 10:17:08.851233571 +0000 UTC m=+0.085079265 container health_status ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent) Nov 28 05:17:08 localhost podman[332862]: 2025-11-28 10:17:08.882547381 +0000 UTC m=+0.116393055 container exec_died ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Nov 28 05:17:08 localhost systemd[1]: ca54819f70b419cf9488d98fee53ca25c7067541eb9353f149fa4f67216296d8.service: Deactivated successfully. Nov 28 05:17:08 localhost systemd[1]: tmp-crun.RSk4qD.mount: Deactivated successfully. 
Nov 28 05:17:08 localhost podman[332861]: 2025-11-28 10:17:08.964719424 +0000 UTC m=+0.200971612 container health_status 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 28 05:17:08 localhost podman[332861]: 2025-11-28 10:17:08.972444733 +0000 UTC m=+0.208696951 container exec_died 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Nov 28 05:17:08 localhost systemd[1]: 3e6104b1c2cd68d45dc7b249827c06f9708b69582e19f06d2ebf93ad623926b2.service: Deactivated successfully. Nov 28 05:17:10 localhost podman[238687]: time="2025-11-28T10:17:10Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Nov 28 05:17:10 localhost podman[238687]: @ - - [28/Nov/2025:10:17:10 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155688 "" "Go-http-client/1.1" Nov 28 05:17:10 localhost podman[238687]: @ - - [28/Nov/2025:10:17:10 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19285 "" "Go-http-client/1.1" Nov 28 05:17:12 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:17:12 localhost nova_compute[279673]: 2025-11-28 10:17:12.736 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:17:12 localhost nova_compute[279673]: 2025-11-28 10:17:12.738 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:17:17 localhost ceph-mon[292954]: mon.np0005538513@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Nov 28 05:17:17 localhost nova_compute[279673]: 2025-11-28 10:17:17.739 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:17:17 localhost nova_compute[279673]: 2025-11-28 10:17:17.741 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Nov 28 05:17:17 localhost nova_compute[279673]: 2025-11-28 10:17:17.741 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Nov 28 05:17:17 localhost nova_compute[279673]: 2025-11-28 10:17:17.741 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:17:17 localhost nova_compute[279673]: 2025-11-28 10:17:17.773 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:17:17 localhost nova_compute[279673]: 2025-11-28 10:17:17.774 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Nov 28 05:17:17 localhost nova_compute[279673]: 2025-11-28 10:17:17.775 279685 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Nov 28 05:17:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0. Nov 28 05:17:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c. Nov 28 05:17:17 localhost systemd[1]: Starting dnf makecache... 
Nov 28 05:17:17 localhost podman[332902]: 2025-11-28 10:17:17.889710491 +0000 UTC m=+0.094222548 container health_status 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, config_id=edpm, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Nov 28 05:17:17 localhost podman[332902]: 2025-11-28 10:17:17.905264982 +0000 UTC m=+0.109777009 container exec_died 
1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible) Nov 28 05:17:17 localhost podman[332903]: 2025-11-28 10:17:17.940219524 +0000 UTC m=+0.144380480 container health_status 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true) Nov 28 05:17:17 localhost systemd[1]: 1a28e0a0dcac6235ce4b798148a8e2dc6b35cd49c3cc1f8c586a57c78453e7a0.service: Deactivated successfully. 
Nov 28 05:17:17 localhost podman[332903]: 2025-11-28 10:17:17.987546659 +0000 UTC m=+0.191707655 container exec_died 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Nov 28 05:17:18 localhost systemd[1]: 9c87da89bd487dae4b022feb60a68c93bf9e80a2bac73ed5bde020957a2cd46c.service: Deactivated successfully. Nov 28 05:17:18 localhost dnf[332904]: Updating Subscription Management repositories. Nov 28 05:17:18 localhost dnf[332904]: Unable to read consumer identity Nov 28 05:17:18 localhost dnf[332904]: This system is not registered with an entitlement server. You can use subscription-manager to register. Nov 28 05:17:18 localhost dnf[332904]: Metadata cache refreshed recently. 
Nov 28 05:17:18 localhost openstack_network_exporter[240658]: ERROR 10:17:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:17:18 localhost openstack_network_exporter[240658]: ERROR 10:17:18 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Nov 28 05:17:18 localhost openstack_network_exporter[240658]: ERROR 10:17:18 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Nov 28 05:17:18 localhost openstack_network_exporter[240658]: ERROR 10:17:18 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Nov 28 05:17:18 localhost openstack_network_exporter[240658]: Nov 28 05:17:18 localhost openstack_network_exporter[240658]: ERROR 10:17:18 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Nov 28 05:17:18 localhost openstack_network_exporter[240658]: Nov 28 05:17:18 localhost systemd[1]: dnf-makecache.service: Deactivated successfully. Nov 28 05:17:18 localhost systemd[1]: Finished dnf makecache. Nov 28 05:17:18 localhost sshd[332945]: main: sshd: ssh-rsa algorithm is disabled